The more interesting hack was announced at 27C3. A team of Wii hackers discovered Sony's main boot-signing private key. This is like discovering VeriSign's private key -- you can now issue any SSL cert you want. They can sign any hypervisor they want, which means they can run any code they want.
They were able to do this because (surprise) there was a crypto mistake in the implementation. Two (or more) ECDSA signatures were generated with the same secret nonce. Apparently Sony doesn't read our blog, because we discussed this flaw before:
And before that, we discussed a variant of this attack when the Debian PRNG was broken:
The cool thing about this flaw is that the private key is not present in the PS3 anywhere. It (probably) only exists at some locked down code-signing center. However, a software flaw in the way it generated the signatures was effectively painting the private key on the side of every signed code module released.
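To make that concrete, here's a toy sketch of the attack (tiny made-up parameters, nothing like Sony's real ones; variable names follow FIPS 186): two DSA-style signatures that share the same k give up the private key with a couple of modular inversions.

```python
# Toy DSA over tiny parameters (p=607, q=101, g=64) to show how reusing
# the per-message secret k leaks the private key x. Illustrative only.
p, q, g = 607, 101, 64   # q divides p-1; g has order q mod p

def sign(x, k, h):
    """DSA signature (r, s) on hash value h, private key x, per-message secret k."""
    r = pow(g, k, p) % q
    s = (pow(k, -1, q) * (h + x * r)) % q
    return r, s

x = 57           # private key (what the attacker wants)
k = 33           # per-message secret -- reused here, which is the flaw
h1, h2 = 10, 88  # hashes of two different messages

r1, s1 = sign(x, k, h1)
r2, s2 = sign(x, k, h2)
assert r1 == r2  # same k => same r, a visible tell in the signatures

# Recover k, then x, from the two public signatures alone:
k_rec = ((h1 - h2) * pow(s1 - s2, -1, q)) % q
x_rec = ((s1 * k_rec - h1) * pow(r1, -1, q)) % q
print(x_rec)  # prints the private key, 57
```

Every signed module ships (r, s) publicly, so once two of them share an r, the "locked down" private key might as well be published.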
And people still think crypto isn't dangerous?
One point I haven't made recently is that crypto review is very expensive in terms of time and money. So the design approach to security problems should be roughly:
1. Avoid crypto if possible. Store data on the server, for example. Doing this correctly is orders of magnitude easier than developing crypto protocols.
2. If using crypto, use something high-level. GPG is a great example of a bundle of crypto primitives with a well-understood protocol for encryption, integrity protection, and key management. PGP has been around for 20 years now.
3. If none of the above works, develop a custom crypto protocol. But budget 10x as much for review as for design/implementation. So if you spend a week and $10,000 developing it, spend 10 weeks and $100,000 to review/improve it. This includes external review, not just internal. This goes for everyone, even "experts".
My main point is: "Crypto costs a lot. Are you sure you want to pay for it?" Because if you do implement custom crypto, you (or your users) will pay for it one way or another.
Part of the problem is that most people who have to use it aren't mathematicians. And even those of us who have math degrees may not realize every single one of the assumptions or requirements that go into each operation (especially if they change due to new attacks being discovered).
So just knowing that something is dangerous isn't the same as having everything you need to deal with the danger.
I'm surprised that I've never heard of a cryptography wiki somewhere where people try to list, in simple terms, all the requirements for using each algorithm or technique securely. E.g., a page on each technique or algorithm telling you which numbers must be random, unpredictable, never repeated, etc. Or telling you that if a person has these numbers, they can do X, Y, and Z. Or that if you don't verify that this padding is right, people can forge messages. With citations linking to the attacks.
Incidentally, I'm hoping that I'm wrong and there really is such a thing out there, somewhere.
It's also all downside. No one person has all these details. The primary contributor to such a wiki is inevitably going to miss horrible faults. If you're that guy, why bother?
Even those $500/hr experts miss horrible faults from what I can see. I mean, you were telling me several months ago on HN how hypervisor-secured game consoles were one of the few places where we see high-end security in the consumer space, and that we can't just assume the hackers will always win.
But the way I see it, there aren't any magic bullets in the security world. Security is a Red Queen problem. In other words, you have to keep running as hard as you can just to avoid falling behind. If you stop running, someone will eventually catch up to you.
> No one person has all these details.
That sounds like an incentive to collaborate to me. One might expect that people would want to gather and organize important details like that. The fact that any such endeavor would inevitably be incomplete and require updates should go without saying.
The main problems would be starting with enough information for people to want to contribute to it and having moderators who were good enough to vet contributions for accuracy.
I don't think the thing you want to have happen is going to happen.
I think what could happen is, for lack of a better term, a half-assed wiki that leads people to write broken e=3 RSA implementations and feel safe doing it.
On the blog you write: "In DSA, the k value is not a nonce."
Is it any wonder people get confused?
> "In DSA, the k value is not a nonce. In addition to being unique, the value must be unpredictable and secret. This makes it more like a random session key than a nonce. When an implementer gets this wrong, they expose the private key, often with only one or two signatures."
I take great pains to create a reasonable term where none exists because I agree terminology and consistency are important in crypto. If you have a better term, I'm happy to hear it.
I liked your comment in general, but I think the jab at Sony for not reading your blog and the parenthetical (surprise) insinuate a level of boneheadedness that's unwarranted. Maybe that's just my reading of it.
You need three concepts: unique (used only once, ever), unpredictable (pseudo-random), and secret (never revealed to anyone, before or after use).
You should read the DSA spec, FIPS 186-3, section 4.5. They call this parameter the "Per-Message Secret Number", which doesn't capture the fact it needs to be unpredictable. (Later in that section, they mention it is "random", but the name of the parameter doesn't have that notion.)
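For what it's worth, a minimal sketch of drawing a k that is unique, unpredictable, and secret all at once (the toy modulus is illustrative; a real signer uses the actual group order q from the domain parameters):

```python
import secrets

# Sketch: drawing the DSA/ECDSA per-message secret. The toy q stands in
# for a real group order of ~160+ bits.
q = 101

def fresh_k():
    # secrets (not the random module) gives a CSPRNG-backed draw, so the
    # value is unpredictable; it is uniform in [1, q-1], generated fresh
    # for every signature, and never logged or reused.
    return secrets.randbelow(q - 1) + 1

k = fresh_k()
assert 1 <= k <= q - 1
```

The point of the sketch is the discipline, not the API: the draw happens per message, inside the signing routine, and the value dies with the signature.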
My surprise is not that Sony didn't read our blog, but that they didn't read FIPS 186 when implementing their concrete-vault root signing tool. "Per-message" is spelled out right there in the title of section 4.5.
"Two (or more) ECDSA signatures were generated with a constant value provided where a secret nonce value should go, making it not effectively a nonce."
What is the dongle ID exactly?
Essentially, the dongle is plugged in and the PS3 is started up. The PS3 communicates with the dongle and swaps a set of 'secure' keys to authenticate that the dongle is legitimate, then runs some code to give you access to all sorts of options you normally wouldn't be able to see/use.
What these guys have done is find the master key, present in all PS3s, that lets the console authenticate any and all service dongles. Using this information, anyone can generate their own service ID and ultimately create their own dongle.
Basically this was possible to do because the protection mechanisms in place to protect the key relied on the rest of the system not being broken. Once the system was hacked it was simply a matter of time before this was decoded as well.
From my own experience, service mode on a lot of embedded devices typically exposes some diagnostics and maybe lets you load some things you couldn't otherwise load; I assume it's the same on the PS3. I also assume this doesn't compromise Sony's ability to sign software or allow third parties to sign software -- however, in service mode you might not need "signed" software.
Heck, there are off-the-shelf solutions for this stuff: chips you can load a set of keys into at manufacturing time, with all the crypto done on-chip so there is close to no way a key could "leak" out. I'd assume IBM, Toshiba, and Sony would use something like that, and if they properly generate keys, the only real way the "master key" could escape would be a rogue employee leaking it. They knew people would attack the platform.
Yep, you're spot-on in this case, and as you say the software-signing keys haven't been compromised, though they aren't needed in service mode.
There are definitely ways the dongle keys could have been better protected (and I'm sure a few people are having some very serious talks about why they weren't), but I have to give Sony kudos for having a system last three years without being compromised. Even now it's only easily broken at ring 2, the GameOS level of the system.
For example, AWS uses HMACs for signatures. What kind of scheme would you propose?
A digital signature uses public-key cryptography: one person can generate and sign messages, and everyone else can verify them. A common scheme is DSA -- but there are many others, including freakishly fast elliptic-curve versions.
I agree, though, that they have been a little sloppy with the terminology. Some of their APIs refer to "Signatures" when they should say "Authentication Codes". I suspect, though, that using the correct term might have led to even more customer confusion than the current situation, as most people aren't aware of the difference between HMACs and signatures.
import sys, hashlib, hmac
# Python 2: HMAC-SHA1 of the hex-encoded message in argv[1], keyed with the leaked dongle key
print hmac.new("46DCEAD317FE45D80923EB97E4956410D4CDB2C2".decode("hex"), sys.argv[1].decode("hex"), hashlib.sha1).hexdigest()
# suitable for a t-shirt or something.
- It's the USB Jig Master Key
- It allows any currently available PS3 to be put into service mode.
- It allows users of 3.50 or below to downgrade.
- It doesn't allow 3.55 users to downgrade (currently, though they can still access service mode).
This isn't the PS3 master key, just as no one knows the PSP master key.
This will enable any user of firmware 3.5 or lower on the current hardware to run backup games that don't require a firmware of greater than 3.5, pretty much the same story as PSP modding.
I don't really know much about PS3 hacking, so this is all just a guess.
This is how original Xbox and Xbox 360 softmods work. The regular OS boots, and then either an exploit fires in the font package, music player, MechAssault, or Splinter Cell (the latter two used for bootstrapping the former two), or, on the 360, the console is virtually "rebooted" to bypass the verified boot, landing in an alternative kernel that has signature checking removed.
Note, I know nothing about PS3 hacking either and am making assumptions based on the connotations of the word "master key", "signing" and other comments here along with my knowledge of xbox1/360 hacking.
It's not the master key for cryptographically signing executables or OS images.
I'll post more info once I've seen what they say.
EDIT: WOW! Okay, looks like they screwed up big time. They used the same random number for all their signatures, which means that they effectively leaked their private key for various bootloaders in the system. The chain of trust is toast.
The master keys for signing PS3 executables have not been leaked.
It looks like they use simple dongles for entering service mode. These dongles are authenticated by an HMAC rather than public-key crypto (big mistake on Sony's part).
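To see why that's a mistake, here's a hedged sketch (names and protocol details are illustrative, not Sony's actual scheme) of HMAC challenge-response auth. The fatal property is that the verifier must hold the very same secret as the prover:

```python
import hashlib, hmac, secrets

# Illustrative sketch, NOT Sony's actual protocol: dongle auth via HMAC.
shared_key = secrets.token_bytes(20)  # in reality, baked into every console

def dongle_response(challenge):
    # What a legitimate dongle computes.
    return hmac.new(shared_key, challenge, hashlib.sha1).digest()

def console_verify(challenge, tag):
    # The console recomputes the tag -- so it necessarily stores shared_key.
    expected = hmac.new(shared_key, challenge, hashlib.sha1).digest()
    return hmac.compare_digest(tag, expected)

challenge = secrets.token_bytes(16)
# Anyone who dumps shared_key from one console can answer any challenge,
# i.e., build their own "dongle". With public-key auth, the console would
# hold only a verification key, which is useless for forging responses.
assert console_verify(challenge, dongle_response(challenge))
```

So once the rest of the system was broken and the key could be read out, every dongle check everywhere was broken with it.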
My PS3 is collecting dust because the AppleTV is a better media player, and a cheap PC with Steam installed is cheaper than a PS3 plus a few $70 game discs.