This seems like it's missing the forest for the trees. The point of security measures is to make executing the attack more expensive than the expected payoff of successfully executing it.
What is the payoff here? Is the projector sold below cost and is the manufacturer recouping that via the cartridges? If not, what's the loss to them?
Regarding the proposed mitigations, I'm very doubtful that they would substantially change anything here:
> Use real crypto (AES-128 or lightweight stream) and make the cartridge carry per-title key (or an IV)
> Copying now requires cloning/extracting the original token secrets.
Sounds like a great idea, and fortunately we don't even need to speculate about whether it would work: Nintendo did this with Amiibo.
> If true anti-cloning matters, this requires an authenticated token (DESFire / NTAG 424 DNA class).
And where do you securely store the validation key for a symmetric encryption/authentication scheme? This would require adding a SAM to the projector as well.
The "use non-default NFC keys" suggestion shares the same problem: Where would you securely store these?
Actually, I think your point of view isn't that far off from what the article suggests. The goal shouldn't be to stop a state actor or a reverse engineering expert, but simply to meet basic business requirements at the same cost.
It's more about risk management, like raising the bar high enough so that the revenue model isn't affected by a bored casual user with a free Android app.
That said, your point is correct: it's difficult to build robust DRM (it has taken industry giants quite some time to come up with schemes that remain "secure" for a certain amount of time)... but we're talking about a cheap toy here, and I don't think anyone would invest more than a few hours trying to break it.
TFA was pretty clear that this is an example that illustrates common issues in enterprise security. It even provides a handy table mapping the similar patterns between this toy and a network appliance. No one's arguing for stronger security in children's toys here.
Exactly, it's a very junior mindset from the article's author.
They espouse "best practices" without giving any consideration to the situation. Best practices for what? NFC-tag DRM on a children's toy? Come on...
One of the fun things about the World Wide Web is that, without specifically intending to, we provided all of the Worst Case Scenario cryptographic properties: things a good cryptographic solution can cope with, but which were often treated as difficulties not really worth worrying about, because why would you ever need that?
For example, what kind of moron would put a secret you mustn't learn right next to data you can choose? A good solution wouldn't care, but surely a bad solution where that would cause a problem would never encounter a real-world scenario where... oh, right, HTTP cookies.
Good solutions won't lose security from repeated transactions, but while accidents might cause one or two repetitions, surely no real-world system would need to withstand millions of them... oh yeah, JavaScript loops exist.
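Both failure modes are cheap to demonstrate. Here's a toy sketch of my own (deliberately using an ECB construction rather than the actual TLS/cookie attacks) of byte-at-a-time recovery: an oracle that encrypts attacker-chosen bytes directly in front of a secret, plus a loop that queries it a few thousand times, is enough to read the secret back out. The fake "cookie" and all names are made up for illustration.

```python
# Toy demo of "secret right next to data you can choose": an ECB oracle that
# encrypts chosen || SECRET lets an attacker recover SECRET one byte at a
# time with a few thousand queries (the repeated-transactions part).
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

KEY = os.urandom(16)
SECRET = b"session=letmein"  # stand-in for an HTTP cookie

def oracle(chosen: bytes) -> bytes:
    """Encrypt chosen || SECRET under a fixed key in ECB mode (the bad design)."""
    data = chosen + SECRET
    data += b"\x00" * (-len(data) % 16)  # zero-pad to the block size
    enc = Cipher(algorithms.AES(KEY), modes.ECB()).encryptor()
    return enc.update(data) + enc.finalize()

def recover_secret() -> bytes:
    recovered = b""
    while True:
        # Pick a prefix length so the next unknown secret byte lands at the
        # end of a block, then compare that block against all 256 guesses.
        prefix = b"A" * (15 - len(recovered) % 16)
        index = (len(prefix) + len(recovered)) // 16
        target = oracle(prefix)[index * 16:(index + 1) * 16]
        for guess in range(256):
            probe = prefix + recovered + bytes([guess])
            if oracle(probe)[index * 16:(index + 1) * 16] == target:
                if guess == 0:  # matched the zero padding, so the secret is done
                    return recovered
                recovered += bytes([guess])
                break

print(recover_secret())  # b'session=letmein', recovered without ever decrypting
```

A good design doesn't care about either property; the point is that the Web hands an attacker both the adjacency and the unlimited retries for free.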
At this point, I think any good undergrad computer engineering education should include a class on practical security patterns and design for security. Or, at the very least, training on when you need to proactively call in a developer with better chops.
It would save the world so, so much grief and so many cheap, insecure consumer devices. I will flip my lid if I see another kiddy-cam on Shodan.
Security has a cost to implement, and it makes the product more expensive without adding any market value. There have to be external incentives to motivate spending the extra effort.
This is almost a feature: it lets the more curious users unlock content without buying more cartridges.
A bit absurd, really; the image of the manufacturer locking this down with robust security, signed payloads, and secure bootloaders is truly comical.
Unpopular opinion here, but this article is a perfect proof of concept that, when trying to take something to market, you need a non-technical person to put the brakes on some technical teams.
I don't think the conclusion is right. It's just that security costs money: why pay a developer for 5 days when he can do it in 3 without proper security? No proper security is needed here, so don't pay for it. And that's exactly what happens with bigger software too: as long as selling insecure tools doesn't create pain for the seller, they will stay insecure.