" To sign a message, encrypt it with the private key
To verify a signature, decrypt it with the public key and make sure it matches the message"
This is only half true for one algorithm: RSA. But even there it's only true for something called "Textbook RSA", which is an insecure variant of RSA nobody should be using. It's not true for any real algorithm.
I'm really not a fan of such sloppy "let me explain it simply" crypto introductions that are simply not correct.
(Also found it odd that he uses "Verisign" as an example for a CA. Verisign has been bought by Symantec and Symantec was distrusted by browsers recently, so it's as dead as it can be.)
There’s a similar problem in physics - schools still teach Newton’s laws, even though they are wrong - because they are a sufficient approximation for many uses.
The problem, of course, is when people assume that what they've learnt at that early level is sufficient for work at a level above it - but I'm not sure what the solution to that is.
You can teach people Newton's mechanics and say "this is a good approximation with a marginal error for most everyday examples, the correct way of calculating it involves very complex things."
I feel the example I quoted regarding signatures isn't really useful information anyway. That the RSA function works both ways for signatures and encryption is more of a fluke and not really something you need to tell people when you explain the basics of public-key crypto.
In the abstract, 'encrypt with the private key' is a totally meaningless sentence for asymmetric encryption. The entire point of a public key is decryption, not encryption.
I do however believe that textbook RSA signing is secure in the simplest model. It is incredibly malleable but (especially when modeling the hash as a PRF) prevents forged signatures. In that sense I'd equate calling it secure to newtonian mechanics without friction and with perfect elasticity. That is, it forms a simple teaching model, and can inform an intuition on how things work. However, no-one should build things based on the model and expect it to come out correctly.
For most other asymmetric algorithms the primitive operation is a DH-style key-agreement function, and the derived encryption and signature constructs are significantly more involved; in fact there isn't that much symmetry between them. (And the plain asymmetric encryption operation becomes somewhat pointless.)
I used that wording since most people can easily visualize the back-and-forth of encryption/decryption. Your point makes sense; it's unnecessarily confusing and seems to suggest the mechanisms behind encryption and signatures are the same.
I've edited that section to better describe what happens in practice.
Remember RSA is just very simple maths, done with huge numbers. If you pick the right "huge" numbers things that look hard become very easy indeed. So we need to ensure we never pick them.
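To make "very simple maths with huge numbers" concrete, here is textbook RSA with laughably small toy numbers (utterly insecure, purely for intuition; the primes here are made up for illustration):

```python
# Textbook RSA with toy numbers -- insecure, for intuition only.
p, q = 61, 53                  # two (far too small) primes
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (Python 3.8+)

m = 42                         # "message" (in practice: a hash of one)
sig = pow(m, d, n)             # "sign": m^d mod n
assert pow(sig, e, n) == m     # "verify": sig^e mod n == m
```

Real RSA uses moduli of 2048 bits or more, and never signs a raw message like this; this is exactly the "textbook RSA" teaching model discussed above.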
The correct way to do this, which a lot of systems haven't adopted yet, is called RSA-PSS, the Probabilistic Signature Scheme. PSS has a proof that says if you believe RSA works, and assume certain other reasonable things, this is actually safe.
Before RSA-PSS (and still today in lots of backwards-compatible systems), people used PKCS#1 v1.5, a padding scheme somebody threw together without any great insight. There is no security proof for PKCS#1 v1.5; it's probably safe, ish, but we can't be sure.
As long as you avoid the known problems, it probably is safe for signatures; the main problem now is that PSS is not included in lots of standards, which thus require PKCS#1 v1.5. That blocks wider deployment, because those standards aren't updated fast enough.
As an example of slow adoption: the HSMs I'm currently using only started native support for PSS last year, about 20 years after its introduction.
Please note that PKCS#1 v1.5 is never secure for encryption/decryption schemes.
More typically to sign a message, RSA is used to sign a hash of the message.
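A sketch of that hash-then-sign idea, using the same textbook-RSA toy (the mod-n reduction of the hash is a toy shortcut so the tiny key can sign it; real schemes apply padding like PSS instead; still insecure, illustration only):

```python
# Hash-then-sign with toy textbook RSA -- insecure, illustration only.
import hashlib

n, e, d = 3233, 17, 2753       # toy key (n = 61 * 53)

def toy_sign(message: bytes) -> int:
    # sign SHA-256(message), reduced mod n (toy shortcut, not real padding)
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def toy_verify(message: bytes, sig: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == digest

assert toy_verify(b"hello", toy_sign(b"hello"))
```

Hashing first means you sign a fixed-size value however long the message is, which is also why the hash function's strength matters as much as the RSA key's.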
If you have (international) standards to adhere to, you are out of luck most of the time, since they specify the exact schemes and cryptography required for compliance. Adoption of new encryption/signature schemes is slow at best, unfortunately.
If you do not, go wild. If you like big keys, get some 'quantum-proof' public-key cryptography while you are at it.
What do you suggest here, and will it work with X.509 certificates?
For the Web PKI, the Baseline Requirements currently permit NIST P-256, P-384, or P-521 [sic] for "Elliptic curve" public key signatures, so that would let you do this for "SSL certificates" and plenty of people do but it's not compatible with older software, so if you care about that you need to have a plan B.
Depending on your exact browser etcetera, if you go to google.com the certificate you're sent will be one of their P-256 certificates, and your browser will verify both that this cert is genuine and that the server can prove it knows the corresponding private key, using elliptic curve cryptography rather than RSA.
However for anyone who doesn't have the requisite background it will just be abstract and impenetrable.
If it tried to explain the full process you'd now be diving into explanations of GPG and buying SSL certs, etc., and it feels like their eyes would simply glaze over.
I would like to see a tad more separation between the types of PKI (specifically GPG vs TLS type clarity).
I've never really been able to get my head around things like when and how a certificate also contains a private key (and when that private key is "exportable"), what all the "key usages" are and where they get their strange identifiers from (and how those things are actually enforced and used), etc.
EDIT: I should have kept reading, the linked Wikipedia articles on X.509, ASN.1, DER and PEM in the next section appear to be about what I'm looking for. I suppose I've just always been confused by all the formality and the number of specifications about different things.
There is a file format called PKCS #12 which lets you bundle things together. It is very common, particularly on Windows systems, to see this format used with files (often with names ending in .PFX) to bundle together a _certificate_ and a _private key_. This is presumably "convenient" in some sense, but since the security requirements for these two things are utterly different (a certificate is a public document that can be shown to anyone, a private key must not be seen by anybody else) it's a hazard in practice.
It's especially problematic that the resulting bundle (which you mustn't show to anybody) is often labelled a "certificate" when by far the more important thing inside it is a private key...
The same bundling idea can be done in PEM (the big blobs of Base 64 encoded stuff more often seen in Unix systems) by just concatenating a private key PEM to a certificate PEM. But at least if you look at it in a text editor you can see what somebody did, whereas in PFX it's a bit opaque without specialist tools.
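As a sketch of what that concatenated PEM looks like (the base64 bodies below are placeholders, not real key material; the filenames and labels are just the conventional ones):

```python
# A PEM bundle is just the two text blocks back to back; you can spot
# the parts by their BEGIN/END markers. Bodies here are placeholders.
import re

bundle = """-----BEGIN PRIVATE KEY-----
cGxhY2Vob2xkZXI=
-----END PRIVATE KEY-----
-----BEGIN CERTIFICATE-----
cGxhY2Vob2xkZXI=
-----END CERTIFICATE-----
"""

labels = re.findall(r"-----BEGIN ([A-Z ]+)-----", bundle)
assert labels == ["PRIVATE KEY", "CERTIFICATE"]
```

That's the "you can see what somebody did" property: one glance at the markers tells you a private key is sitting in the file.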
Bundling these two things together most often happens in the context where somebody _else_ makes your key (and issues you a certificate). This is bad practice, not only is it patronising ("Poor dear don't understand crypto, we'll just give them the keys ready made") but it means the end user is vulnerable to copies kept nefariously or by accident. Avoid.
Certificate Authorities are prohibited from doing this for certificates in the Web PKI ("SSL certificates") but their resellers often still offer it as a "service". Say no. In fact try "Hell no".
Why's binary data a pain to transmit and how does base64 encoding help with that?
To bring this conversation back to PKI, base64 encoding means you can embed the GPG signature of your email body in the text itself, without having to deal with attachments like S/MIME does
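To make the "binary is a pain" point concrete: raw bytes can include NULs, control characters, and values that aren't valid text at all, which text-only channels (old mail gateways, copy-paste, config files) can mangle. Base64 maps every 3 bytes to 4 printable ASCII characters, at a ~33% size cost. A minimal sketch:

```python
import base64

blob = bytes(range(256))                      # every byte value, incl. NUL
text = base64.b64encode(blob).decode("ascii")

# the encoded form is printable ASCII only, so it survives text channels
assert all(33 <= b <= 126 for b in text.encode())
# decoding round-trips exactly
assert base64.b64decode(text) == blob
# 3 bytes in -> 4 characters out (~33% overhead)
assert len(text) == 4 * ((len(blob) + 2) // 3)
```

This is exactly what the body of a PEM block is: DER binary run through Base64 so it can live in a text file or an email.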
They're also shipped with the browser, as is the case with Firefox. They may also be shipped with applications (some versions of Visual Studio seem to have their own certificate store).
>WARNING: This is just the tip of the iceberg. Also, ice is generally slippery. If your crypto knowledge amounts to the stuff presented here, stick to known, battle-tested implementations.
So... thanks!
A certificate identifies a name.
The signee is called a certificate authority (CA). The CA is often some big company, like VeriSign. With internal PKI, it can be any entity that nodes have been configured to trust.
A CA authenticates the identity claimed by a certificate.
A CA’s certificate can be signed by another CA, and so on. The last certificate in the chain is called a root certificate. Root certificates are trusted and stored locally. They’re usually shipped along with the OS.
There can be a chain of trust whereby one CA authenticates the next.
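A toy sketch of such a chain, reusing textbook RSA with tiny made-up numbers (nothing here resembles real X.509 validation, which also checks expiry, key usage, name constraints, revocation, and much more):

```python
# Toy chain of trust with textbook RSA -- insecure, for intuition only.
# A "cert" here is just (subject name, subject pubkey, issuer's signature).
import hashlib

def keygen(p, q, e=17):
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))   # Python 3.8+
    return (e, n), d                     # public key, private exponent

def digest(data, n):
    # reduce the hash mod n so the toy key can sign it (toy shortcut)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def issue(subject, subject_pub, issuer_priv, issuer_pub):
    body = repr((subject, subject_pub)).encode()
    sig = pow(digest(body, issuer_pub[1]), issuer_priv, issuer_pub[1])
    return (subject, subject_pub, sig)

def check(cert, issuer_pub):
    subject, subject_pub, sig = cert
    body = repr((subject, subject_pub)).encode()
    e, n = issuer_pub
    return pow(sig, e, n) == digest(body, n)

root_pub, root_priv = keygen(61, 53)     # locally trusted "root"
inter_pub, inter_priv = keygen(47, 59)   # intermediate CA
leaf_pub, _ = keygen(53, 59)             # end entity

inter_cert = issue("Example CA", inter_pub, root_priv, root_pub)
leaf_cert = issue("example.com", leaf_pub, inter_priv, inter_pub)

# walk the chain: leaf signed by the intermediate, intermediate by the root
assert check(leaf_cert, inter_cert[1])
assert check(inter_cert, root_pub)
```

The names "Example CA" and "example.com" are placeholders; the point is only that trust flows from the locally stored root down through each signature.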
It's a tricky subject and I don't think it is really possible to get too far beyond "give me £100 for this SSL certificate" ... "because" ... sigh [fill in your own SSL-related conversations with your PHB stupidity here]