It's also fantastic that the whole thing is available online, but believe me when I say that the text is well worth the price if you ever find yourself needing a good cryptography reference. Some other good references are Introduction to Modern Cryptography by Katz and Lindell, Foundations of Cryptography by Goldreich (both volumes), and The Codebreakers by Kahn (for history of cryptography).
There are other good references, of course, but I find myself referring to the ones above the most. However, I am naturally biased towards texts that take a theory-based approach.
Modern concepts, algorithms, and constructions that a designer might not have at their disposal if they rely on Menezes:
* Elliptic curve cryptography, which is supplanting RSA (and Rabin) and conventional DH, both of which are looking increasingly sketchy in 2013.
* Secure RSA padding for encryption (OAEP) and signatures (PSS), or, really, any discussions of the problems that motivate those formats --- Menezes does point out that you don't want to use low-public-exponent RSA with repeated messages, but a modern take on RSA would say that you don't ever want to send repeated messages at all.
* Counter mode, which is presented briefly as a small tweak to OFB mode with no description of its pitfalls; CTR is the second-most popular mode used today and increasingly the first choice of new systems.
* Modern bulk encryption --- XEX, XTS, &c.
* Encrypt-then-MAC (Menezes presents the opposite); in fact, modern crypto is at pains to ensure that encryption and integrity are co-specified and interoperate safely; a designer could be forgiven for reading Menezes and not including explicit integrity checks at all.
* The authenticated encryption modes CCM, GCM, EAX, and OCB.
* Modern CSPRNGs, including entropy collection and management and handling the cold-start problem.
* Any discussion of side channel attacks (in fact, Menezes predates Vaudenay's and Bleichenbacher's error side channels --- against CBC block cipher padding and RSA, respectively --- so it doesn't even discuss padding oracles).
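To make the CTR pitfall concrete: reusing a nonce in any CTR-style stream construction leaks the XOR of the two plaintexts. A toy sketch in Python --- a hash-based keystream stands in for a real block cipher here, and all names are made up for illustration, so treat this as exposition rather than usable crypto:

```python
import hashlib

def toy_ctr_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy CTR keystream: hash(key || nonce || counter) blocks.
    # Illustrative only -- not a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_ctr_encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream (encryption == decryption).
    ks = toy_ctr_keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key, nonce = b"k" * 16, b"fixed-nonce"
p1 = b"attack at dawn!!"
p2 = b"retreat at noon!"
c1 = toy_ctr_encrypt(key, nonce, p1)  # nonce reused...
c2 = toy_ctr_encrypt(key, nonce, p2)  # ...for a second message

# The keystreams cancel: c1 XOR c2 equals p1 XOR p2,
# handing an attacker the XOR of the plaintexts.
xor_c = bytes(a ^ b for a, b in zip(c1, c2))
xor_p = bytes(a ^ b for a, b in zip(p1, p2))
assert xor_c == xor_p
```

This is exactly the failure mode a brief "CTR is a tweak to OFB" presentation glosses over: the mode is fine, but nonce management is part of its security contract.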
It's not true that robust cryptography appreciates maturity. The ~twenty years since Menezes wrote HOAC haven't just been spent inventing whizz-bang new toys, but also in finding new ways to break the old toys. You're not always OK if you eschew the new stuff for the "mature" stuff; some of the mature tools are perilous to work with now.
That's an annoying thing about working with crypto (and a fun thing about breaking it) --- designers have to know what parts of the literature they need to adopt, like safe RSA padding schemes, and what parts they need to hold off on, like homomorphic encryption.
For what it's worth: I like Menezes a lot as a resource for practicing attacks. Menezes also does a much better job on theory than even _Practical Cryptography_ (which is a great book, and probably the only crypto book most people should own). It's a good book. Just be careful working from it.
(I'm not trying to call you out; I just think this is a fun thing to talk about.)
XEX and XTS, for instance, are used for disk encryption; disk encryption isn't a new problem, but it was poorly addressed by 1995 constructions, in ways that could create security-crippling tradeoffs.
May I ask why you say this? (Not agreeing nor disagreeing, just curious.)
Meanwhile, over the past couple of months, there's been a flurry of activity on the DLP, particularly Antoine Joux's work on the index calculus approach. The IFP and DLP are intertwined problems, and RSA also depends on the hardness of the DLP, as do (obviously) DH and ElGamal.
The advances we're currently seeing do not appear to threaten the elliptic curve discrete log problem.
Quantum computing is an issue, but it looks like a far-off issue. The record for quantum factorization right now is 21, right? If you believe QC is a near-term threat, no mainstream number-theoretic (public key / key agreement) cryptosystem helps you; you need to be working with code-based or lattice-based crypto, and essentially nothing deployed does this now.
The problems motivating the shift from RSA to ECC are near-term, not far-off like QC.
Shit, I didn't look who I was responding to. Ignore the tone; assume I'm addressing the thread, not you.
By the way, the Batch NFS has seemingly fallen out of favor. Despite being significantly faster in the usual RAM computational model, it has worse AT complexity than factoring integers one by one (see section 5 of ), due to its crazy storage requirements. It doesn't change the fact, of course, that RSA-1024 doesn't have much shelf life left.
In the video game Final Fantasy, there's a spell called "Doom" which places a countdown over your head. When it reaches 0, you die.
RSA is Doomed in exactly that sense: it's just a matter of time before RSA offers zero security, whereas elliptic curve cryptography remains (for now) unbroken.
Whatever reason there is to prefer elliptic curves over integer-factorization or discrete-log-based schemes, it must be a classical one: Shor's algorithm breaks the elliptic curve discrete log problem just as it breaks factoring and conventional discrete logs.
Then which (if any) algorithm is currently believed to be safe from quantum computing?
I think I was thinking of lattice-based cryptography.
You didn't ask, but it's worth saying: block ciphers, stream ciphers, and hash functions aren't thought to be fundamentally threatened by QC the way IFP- and DLP-based number-theoretic cryptosystems are; Grover's algorithm generically halves their effective security level, which doubling key and output sizes addresses.
A switch to lattice or code-based crypto seems unlikely in the near future.
This is not the only possible outcome. Scalable quantum computing could turn out to be impractical due to cost. Or new physics could be found that rules out scalable quantum computing. Personally, I find both of these outcomes unlikely, the second more than the first, but they do exist.
The HAC really excels as a specific, targeted reference for a cryptography-aware audience. For instance, if you want a good, standard notation for something, or a precise definition to cite in a paper (say, the precise definition of a MAC), the HAC is a great place to look.
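For what it's worth, the informal version of that definition: a MAC is a keyed function producing a tag, such that without the key it's infeasible to forge a valid (message, tag) pair. A minimal sketch using Python's standard-library HMAC --- key and message here are placeholders, not anything from the HAC:

```python
import hashlib
import hmac

def mac_tag(key: bytes, message: bytes) -> bytes:
    # Compute an HMAC-SHA256 tag over the message.
    return hmac.new(key, message, hashlib.sha256).digest()

def mac_verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison, to avoid a timing side channel.
    return hmac.compare_digest(mac_tag(key, message), tag)

key = b"0" * 32
msg = b"some message"
tag = mac_tag(key, msg)
assert mac_verify(key, msg, tag)          # genuine message verifies
assert not mac_verify(key, b"forged", tag)  # altered message does not
```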
So, I agree that a designer shouldn't use the HAC as a sole reference, especially if they're unfamiliar with cryptography. This goes doubly if what you're trying to do is estimate the security of a construction. To judge the security of a scheme or construction, you need to be up to date with current research, not just books. A book is a good first-pass filter, if you will, but if you're judging security, you want more.
> a designer could be forgiven for reading Menezes and not including explicit integrity checks at all.
Section 9.6.5 goes to great lengths to ensure the reader is fully aware that encryption doesn't provide integrity. Also, Menezes does present a note, albeit a brief one, on encrypt-then-MAC. I don't see the HAC as a teaching text, though. Anyway, I doubt there are many people who have "read" the HAC cover to cover. It's a pretty thick book.
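That warning deserves a concrete demonstration: any XOR-style encryption without a MAC is malleable --- an attacker can flip chosen plaintext bits by flipping the corresponding ciphertext bits --- and encrypt-then-MAC catches it. A toy sketch (the "cipher" is an illustrative hash-derived keystream, not a real one; names are invented):

```python
import hashlib
import hmac

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream "encryption": XOR with a hash-derived keystream.
    # Illustrative only -- not a real cipher.
    ks = b""
    ctr = 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(d ^ k for d, k in zip(data, ks))

enc_key, mac_key = b"e" * 16, b"m" * 16
pt = b"pay $100 to Bob "
ct = xor_stream(enc_key, pt)

# Without a MAC: flipping ciphertext bits flips plaintext bits,
# silently turning $100 into $999.
delta = bytes(a ^ b for a, b in zip(b"pay $100", b"pay $999")) + b"\x00" * 8
forged = bytes(c ^ d for c, d in zip(ct, delta))
assert xor_stream(enc_key, forged) == b"pay $999 to Bob "

# Encrypt-then-MAC: the tag is over the ciphertext, so the forgery
# is rejected before anything is decrypted.
tag = hmac.new(mac_key, ct, hashlib.sha256).digest()
forged_tag_check = hmac.compare_digest(
    hmac.new(mac_key, forged, hashlib.sha256).digest(), tag)
assert not forged_tag_check
```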
> It's not true that robust cryptography appreciates maturity. The ~twenty years since Menezes wrote HOAC haven't just been spent inventing whizz-bang new toys, but also in finding new ways to break the old toys. You're not always OK if you eschew the new stuff for the "mature" stuff; some of the mature tools are perilous to work with now.
I think robust cryptography absolutely appreciates maturity. But don't mistake that for me saying maturity is the only desired trait: far from it. scrypt was not recommended as highly as bcrypt for a few years after its release, and one of bcrypt's major selling points is still "it's been around a while." And let's not forget the four-year period the SHA-3 competition spanned. Some minimum amount of maturity is required of all schemes that are considered secure.
So no, maturity isn't the sole decision-making factor. But a construction that's been around for a long while, is widely studied, and has seen virtually no attacks is essentially the gold standard. See AES. On the other hand, a relatively new construction with virtually no attacks against it has a less impressive record, even if it improves on an older scheme in other ways.
But again, I agree entirely that "mature" isn't always better. But it's definitely a factor, although whether it's a large or small factor mostly depends on the circumstances.
> what parts they need to hold off on, like homomorphic encryption
If you mean fully homomorphic encryption, then sure. However, we already have practical partially homomorphic schemes which are pretty good. But this area especially requires a cryptographer's assistance in design, even more so than a typical "confidentiality/integrity" application.
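A toy illustration of partial homomorphism, using textbook (unpadded) RSA's multiplicative property --- deliberately tiny, insecure parameters, and emphatically not a suggestion to deploy unpadded RSA; it's exactly the kind of thing you want a cryptographer supervising:

```python
# Textbook (unpadded) RSA is multiplicatively homomorphic:
# Enc(a) * Enc(b) mod n == Enc(a * b mod n).
p, q = 61, 53
n = p * q                            # 3233 -- toy modulus, utterly insecure
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 11
c = (enc(a) * enc(b)) % n            # multiply ciphertexts...
assert dec(c) == (a * b) % n         # ...and decrypt to the product
```

Paillier gives the additive analogue (ciphertext multiplication adds plaintexts), which is what most practical partially homomorphic deployments lean on.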
Anyway, I guess ultimately, if you're just looking for a good text that's useful in learning cryptography, the HAC isn't the first place I'd look. But if you'd like an encyclopedic reference, it does a great job (IMO). If you're looking to build up a cryptography reference, though, it definitely needs to be supplemented with many other texts.
Edit: Also, another thing I find myself using the HAC for is to get a certain sort of historical context (regarding attacks and constructions) in an area before moving on to more advanced, more modern texts. Those more modern texts tend to leave out such details for brevity's sake, but I like knowing them all the same.
wget -r -np -l 1 -A pdf http://cacr.uwaterloo.ca/hac/ -nd
pdftk toc3.pdf chap1.pdf chap2.pdf chap3.pdf chap4.pdf chap5.pdf chap6.pdf chap7.pdf chap8.pdf chap9.pdf chap10.pdf chap11.pdf chap12.pdf chap13.pdf chap14.pdf chap15.pdf appendix.pdf references.pdf index.pdf cat output "../Handbook of Applied Cryptography.pdf"
This might be too much ... :)