> 3. For some historical reasons, many people feel the need to change
keys regularly. This is rather misguided: key rotation makes sense in
an army or spy network where there are many keys, and partial
compromises are the normal and expected situation, so a spy network
must, by necessity, be in permanent self-cleansing recovery mode; when
there is a single key and the normal situation is that the key is NOT
compromised, changing it brings no tangible advantage.
This is really strange advice from Thomas Pornin. People rotate keys because not doing so weakens most symmetric encryption schemes. For example, with AES-GCM and random 96-bit nonces, you need to rotate keys after encrypting roughly 2^32 (about 4 billion) messages; beyond that, the nonce-collision probability exceeds 2^-32, which is already too high for most large-scale systems (and really bad things happen in GCM when a nonce is repeated).
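A quick back-of-the-envelope sketch of that bound (assuming random 96-bit nonces and the usual birthday approximation; the function name is mine):

```python
import math

def nonce_collision_probability(q: int, nonce_bits: int = 96) -> float:
    """Birthday-bound estimate: chance that q random nonces of the
    given size contain at least one repeat."""
    n = 2 ** nonce_bits
    return 1.0 - math.exp(-q * (q - 1) / (2 * n))

# At 2^32 messages the estimate sits right around 2^-33, i.e. at the
# edge of NIST SP 800-38D's 2^-32 comfort zone; it grows quadratically
# from there.
p = nonce_collision_probability(2 ** 32)
```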
Given a salted hash is being encrypted, who needs a nonce? The salt's already taken care of that, right?
Also, if you have 4 billion hashes stored, you rotate the key, and you still have 4 billion hashes stored... what has changed? You would need a key ring or derived keys, I guess, but I think this is actually a case where ECB does the job.
But I guess we've now proven the point that even a pepper is non-trivial.
POODLE worked not only against SSLv3, but also against any TLS implementation that checks padding in SSLv3's style (e.g., checking only the last byte and ignoring the rest of the padding). SSL accelerators from F5 and A10 were vulnerable. Thus, many of the world's largest sites were vulnerable.
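For concreteness, a toy sketch of the difference between the two checks (not real record-layer code; both function names are mine):

```python
def sslv3_style_padding_ok(plaintext: bytes) -> bool:
    # SSLv3 defines only the final byte (the padding length); the padding
    # bytes themselves are arbitrary, so this is all a decrypter can check.
    pad = plaintext[-1]
    return pad + 1 <= len(plaintext)

def tls_padding_ok(plaintext: bytes) -> bool:
    # TLS requires every padding byte to equal the padding length, which
    # leaves no attacker-controlled slack in the final block.
    pad = plaintext[-1]
    if pad + 1 > len(plaintext):
        return False
    return all(b == pad for b in plaintext[-(pad + 1):])

good = b"MSG" + bytes([4]) * 5                       # valid under both checks
poodle = b"MSG" + b"\xaa\xbb\xcc\xdd" + bytes([4])   # garbage padding bytes
```

A TLS endpoint that strips padding the way `sslv3_style_padding_ok` does accepts the second message, and that SSLv3-style leniency is exactly what POODLE exploits.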
Great question. The core crypto library in End-To-End has supported ECDH on Curve25519 and Ed25519 since day one. We're discussing with the GnuPG team extending RFC 6637 to include these algorithms in the OpenPGP standard.
I spent the weekend digging into both End-to-End and OpenPGP.js, and it appears that End-to-End has the primitives for Ed25519, but it doesn't recognize signature packets that use Ed25519. Is there an e2e bug tracking compatibility with GPG 2.1 signatures that use Ed25519?
> GnuPG now supports Elliptic Curve keys for public key encryption. This is defined in RFC-6637. Because there is no other mainstream OpenPGP implementation yet available which supports ECC, the use of such keys is still very limited.
Google End-To-End supports ECC by default. We are working on supporting Ed25519, but encrypting and signing with NIST curves should work and be compatible with GnuPG. That said, we've had a lot of compat issues, so it would be great if Hacker News readers and GnuPG users could help test our implementation against GnuPG. You can generate keys in GnuPG and import them into End-To-End and vice versa, or you can encrypt/sign messages in one program and decrypt/verify them in the other. If you find a bug or anything that doesn't work as expected, you can report it at https://code.google.com/p/end-to-end/wiki/Issues?tm=3. If it's a security bug, you're eligible for a monetary reward under our bug bounty program :-).
There are three big issues; in order of importance:
1. PGP's RSA constructions are archaic; they use a format defined in the 1990s that is vulnerable to multiple different attacks and likely to harbor more that we don't know about yet. (This, bafflingly, is also a problem with DNSSEC.) I should be clear: PGP is not itself known to be vulnerable to these attacks. But neither was Java's TLS implementation, before it was found to be vulnerable a few months back.
2. RSA is well-studied but it's hard to say how well we understand its strength. There are no credible attacks on RSA-2048, but academic progress is being made on a cousin of the factoring problem it relies on (the discrete log problem). ECC is based on a harder math problem, is also well studied, and is believed to be stronger.
3. ECC is faster and provides more security with fewer key bits.
A combination of all three of these factors gives a sort of second-order issue, which is that modern public key crypto constructions tend to be based on ECC and not multiplicative group IFP/DLP algorithms. EdDSA is good for reasons other than that it's based on good ECC crypto.
Hope that's helpful and not just noise. Looking forward to the inevitable 'pbsd correction. :)
Key gen time probably doesn't matter too much in the particular case of pgp, but key size is a big deal. Keys that you might conceivably type by hand. Keys, not fingerprints, that can fit in tweets. Or tattoos. :)
> Sure you have control over the first N bytes. Look at the request-line: "GET /hello/world/this/is/my/url HTTP/1.1". Sure, you don't control the spaces, but you can assume any practical implementation uses a single space character. Combine that with control over the method (with XHR or statically through <form> or <img>) and the path, you're in business.
I like this book a lot, but you won't need any of this math until set 8. I spent a lot of time learning things like lattice basis reduction algorithms (I used Strang's linear algebra book and MIT lectures) only to discover that there really isn't a whole lot that requires you to break out linear algebra in day-to-day cryptography.
In particular: virtually all of block cipher crypto and message authentication relies on straightforward math. (It would be different if our challenges covered poly MACs, but we don't have good examples of common flaws in poly MAC implementations).
Yeah. Joux's attack is conceptually simple. You have 2 tags T_0, T_1, obtained with distinct messages and the same IV. This means T_0 = S_0 ^ X and T_1 = S_1 ^ X, where X is the same value for both. So you have T_0 ^ T_1 = S_0 ^ S_1. S_0 and S_1 are the polynomial evaluation of the ciphertext at H, the authentication key (which is also the same).
Now, via a simple polynomial evaluation property, you have f(x) + g(x) = (f + g)(x). We know f and g --- those are the two ciphertexts being authenticated here, interpreted as polynomials --- and we know that the polynomial f + g - S_0 - S_1 must be 0 at H. From there it's a matter of finding the roots of this polynomial, one of which is H, and this is the mathematically complicated part of the attack. Though you can treat root-finding as a black-box, the keywords here are Berlekamp or Cantor-Zassenhaus.
(Hopefully I didn't get this too wrong, I'm handwaving here)
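To make the linearity step concrete, here is a toy sketch in GF(2^8) rather than GHASH's GF(2^128) (all values are made up, and the brute-force root search stands in for Berlekamp/Cantor-Zassenhaus, which is what you'd actually need in the full-size field):

```python
def gf_mul(a: int, b: int) -> int:
    # Multiplication in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1.
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        b >>= 1
        hi = a & 0x80
        a = (a << 1) & 0xFF
        if hi:
            a ^= 0x1B
    return p

def poly_eval(coeffs, h: int) -> int:
    # GHASH-style evaluation: c_1*h^n + c_2*h^(n-1) + ... + c_n*h.
    acc = 0
    for c in coeffs:
        acc = gf_mul(acc ^ c, h)
    return acc

H, X = 0x53, 0xC4            # secret authentication key and mask (same IV)
c0 = [0x11, 0x22, 0x33]      # two distinct "ciphertexts"
c1 = [0x44, 0x55, 0x66]
t0 = poly_eval(c0, H) ^ X    # the two observed tags: T_i = S_i ^ X
t1 = poly_eval(c1, H) ^ X

# Linearity: t0 ^ t1 = (f+g)(H), with the shared mask X cancelled out.
diff = [a ^ b for a, b in zip(c0, c1)]
assert poly_eval(diff, H) == t0 ^ t1

# Root-finding recovers a small set of candidate keys containing H;
# in this toy field we can simply try all 256 elements.
candidates = [h for h in range(256) if poly_eval(diff, h) == t0 ^ t1]
assert H in candidates
```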
I took a quick look at your implementation of ECDSA and I think it has a bug at line 311. It looks like I could bypass the check if r or s is negative.
One thing that I don't understand is why big-integer libraries developed exclusively for crypto need negative numbers. The library that I contribute to doesn't need them, and it works just fine. Actually, I could argue that having only non-negative numbers makes it simpler and faster.
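For reference, the range check SEC 1 calls for as the first ECDSA verification step can be sketched like this (a hedged illustration; `N` here is secp256k1's group order, picked only as an example, and the function name is mine):

```python
# Per SEC 1 / FIPS 186, a verifier must reject a signature outright
# unless 1 <= r < n and 1 <= s < n, where n is the curve's group order.
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def signature_params_valid(r: int, s: int, n: int = N) -> bool:
    # Negative or zero values, or values >= n, must be rejected here
    # rather than silently reduced mod n later.
    return 1 <= r < n and 1 <= s < n

assert signature_params_valid(1, N - 1)
assert not signature_params_valid(-5, 1)   # negative r slips past naive checks
assert not signature_params_valid(1, N)    # s must be strictly below n
```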
It's not really a bug; the operations after it would still be valid (the value is almost immediately reduced modulo the group order). It's just that those parameters would not conform to the SEC specification.
I agree that the onus shouldn't be on users to check that, though, so I'm probably going to make a pull request to change this.
Thanks for pointing this out. Thankfully, the implementation already failed on a negative s value, but you're correct that the check wasn't definitive.
I also wholeheartedly agree with your comment about the unnecessary inclusion of a bignum that allows negative values.
The lack of typing here (and elsewhere) has led to several problematic scenarios for users, to the point that we have littered the code with assertions to enforce whatever we can.