Hacker News
Handbook of Applied Cryptography (2001) (uwaterloo.ca)
82 points by sirsar 1621 days ago | 19 comments



I'm an aspiring cryptographer and this book is hands-down one of the best references for applied cryptography out there today. Don't let the date fool you; cryptography is an area where maturity is well appreciated.

It's also fantastic that the whole thing is available online, but believe me when I say that the text is well worth the price if you ever find yourself needing a good cryptography reference. Some other good references are Introduction to Modern Cryptography by Katz and Lindell, Foundations of Cryptography by Goldreich (both volumes), and The Codebreakers by Kahn (for history of cryptography).

There are other good references, of course, but I find myself referring to the ones above the most. However, I am naturally biased towards texts that take a theory-based approach.


I guess you could argue that the advent of quantum computing is making everything old, like McEliece, new again. But I disagree with your premise. This book was written at a time when elliptic curves were more useful as a factoring method than as a cryptographic design tool.

Modern concepts, algorithms, and constructions that a designer might not have at their disposal if they rely on Menezes:

* Elliptic curve cryptography, which is supplanting RSA (and Rabin) and conventional DH, both of which are looking increasingly sketchy in 2013.

* Secure RSA padding for encryption (OAEP) and signatures (PSS), or, really, any discussions of the problems that motivate those formats --- Menezes does point out that you don't want to use low-public-exponent RSA with repeated messages, but a modern take on RSA would say that you don't ever want to send repeated messages at all.

* Counter mode, which is presented briefly as a small tweak to OFB mode with no description of its pitfalls; CTR is the second-most popular mode used today and increasingly the first choice of new systems.

* Modern bulk encryption --- XEX, XTS, &c.

* Encrypt-then-MAC (Menezes presents the opposite); in fact, modern crypto is at pains to ensure that encryption and integrity are co-specified and interoperate safely; a designer could be forgiven for reading Menezes and not including explicit integrity checks at all.

* The authenticated encryption modes CCM, GCM, EAX, and OCB.

* Modern CSPRNGs, including entropy collection and management and handling the cold-start problem.

* Any discussion of side channel attacks (in fact, Menezes predates Vaudenay's and Bleichenbacher's block cipher and RSA [respectively] error side channels, and so doesn't even discuss padding oracles).
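To make the encrypt-then-MAC bullet concrete, here's a minimal sketch in Python using only the standard library. The "cipher" is a toy HMAC-derived keystream standing in for a real stream/CTR cipher, and all function names (`seal`, `open_sealed`, etc.) are my own for illustration; the point is only the ordering: MAC the ciphertext, and verify before decrypting.

```python
# Sketch of encrypt-then-MAC. The keystream "cipher" here is a toy
# HMAC-based construction, NOT a vetted cipher; only the composition
# pattern (MAC over ciphertext, verify-before-decrypt) is the point.
import hmac
import hashlib

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Derive n bytes of keystream by running HMAC in counter mode.
    out = b""
    counter = 0
    while len(out) < n:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:n]

def seal(enc_key: bytes, mac_key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ct = bytes(p ^ k for p, k in
               zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    # The MAC covers the ciphertext (and nonce), not the plaintext.
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct + tag

def open_sealed(enc_key: bytes, mac_key: bytes, nonce: bytes, sealed: bytes) -> bytes:
    ct, tag = sealed[:-32], sealed[-32:]
    # Verify the tag in constant time BEFORE touching the ciphertext.
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("bad tag")
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))
```

Note how decryption never runs on unauthenticated data; that ordering is exactly what rules out padding-oracle-style attacks on the decryption path.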

It's not true that robust cryptography appreciates maturity. The ~twenty years since Menezes wrote HOAC haven't just been spent inventing whizz-bang new toys, but also in finding new ways to break the old toys. You're not always OK if you eschew the new stuff for the "mature" stuff; some of the mature tools are perilous to work with now.

That's an annoying thing about working with crypto (and a fun thing about breaking it) --- designers have to know what parts of the literature they need to adopt, like safe RSA padding schemes, and what parts they need to hold off on, like homomorphic encryption.

For what it's worth: I like Menezes a lot as a resource for practicing attacks. Menezes also does a much better job on theory than even _Practical Cryptography_ (which is a great book, and probably the only crypto book most people should own). It's a good book. Just be careful working from it.

(I'm not trying to call you out; I just think this is a fun thing to talk about.)


If I can expand on my own comment briefly, the thing I'd want to point out about that bulleted list is that its entries aren't new problems that happen to admit interesting crypto; they're modern constructions that address universal problems.

XEX and XTS, for instance, are used for disk encryption; disk encryption isn't a new problem, but it was poorly addressed by 1995 constructions, in ways that could create security-crippling tradeoffs.


> RSA (and Rabin) and conventional DH, both of which are looking increasingly sketchy in 2013.

May I ask why you say this? (Not agreeing nor disagreeing, just curious.)


Daniel Bernstein has been talking about advances in the IFP, particularly the batch number field sieve, for several years now; the question seems to be when, not if, 1024-bit RSA keys will be arbitrarily attackable.

Meanwhile, over the past couple of months, there's been a flurry of activity on the DLP, particularly Antoine Joux's work on the index calculus approach. The IFP and DLP are intertwined problems, but RSA also depends on the hardness of the DLP, as do (obviously) DH and ElGamal.

The advances we're currently seeing do not appear to threaten the elliptic curve discrete log problem.

Quantum computing is an issue, but it looks like a far-off issue. The record for quantum factorization right now is 27, right? If you believe QC is a near-term threat, no mainstream number-theoretic (public key / key agreement) cryptosystem helps you; you need to be working with code-based or lattice-based crypto; nothing does this now.

The problems motivating the shift from RSA to ECC are near-term, not far-off like QC.

Shit, I didn't look at who I was responding to. Ignore the tone; assume I'm addressing the thread, not you.


No problem. Agree that quantum is not really something to worry about, though it's a good thing that someone out there is thinking of alternatives.

By the way, the Batch NFS has seemingly fallen out of favor. Despite being significantly faster in the usual RAM computational model, it has worse AT complexity than factoring integers one by one (see section 5 of [1]), due to its crazy storage requirements. It doesn't change the fact, of course, that RSA-1024 doesn't have much shelf life left.

[1] http://eprint.iacr.org/2012/318


Elliptic curve cryptography isn't vulnerable to quantum computing. EDIT: This is mistaken. I was thinking of lattice-based cryptography, which isn't currently vulnerable to quantum computing.

In the video game Final Fantasy, there's a spell called "Doom" which places a countdown over your head. When it reaches 0, you die.

RSA is Doomed in exactly that sense: it's just a matter of time before RSA offers zero security, whereas elliptic curve cryptography remains (for now) unbroken.


It is [1]. Quantum computers, using Shor's algorithm, polynomially break any specialization of the abelian hidden subgroup problem; see [2] for a fairly complete list.

Whatever reason to prefer elliptic curves over integer factorization or discrete log-based schemes must be classical.

[1] http://arxiv.org/abs/quant-ph/0301141

[2] http://pqcrypto.org/quantum.html
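For context on what Shor's algorithm actually does to RSA: the quantum part only finds the multiplicative order r of a random a modulo N; turning r into factors is classical and easy. A sketch, with a brute-force order search standing in for the quantum step (function names are mine):

```python
# Factoring via order finding: the classical reduction used by Shor's
# algorithm. Only order() would be replaced by the quantum subroutine;
# here it's brute force, so this only works for tiny N.
from math import gcd

def order(a: int, n: int) -> int:
    # Smallest r > 0 with a^r = 1 (mod n); assumes gcd(a, n) == 1.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n: int, a: int):
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # got lucky: a shares a factor with n
    r = order(a, n)
    if r % 2:
        return None               # odd order: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    p = gcd(y - 1, n)             # y^2 = 1 (mod n), y != +/-1 => nontrivial factor
    return p, n // p
```

So the "quantum threat" to RSA is precisely a fast order-finding oracle; everything around it has been textbook number theory since the 1990s.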


Oh.

Then which (if any) algorithm is currently believed to be safe from quantum computing?

I think I was thinking of lattice-based cryptography.


There's a bunch of them. There's no known quantum algorithm for quickly decoding random binary linear codes, so McEliece is one. The Closest Vector Problem on lattices is another trapdoor that may be QC-resistant.

You didn't ask, but it's worth saying: block ciphers, stream ciphers and hash functions aren't thought to be fundamentally threatened by QC the way IFP and DLP number theoretic cryptosystems are; Grover's algorithm gives at most a quadratic speedup against them, so doubling key sizes compensates.


It's easy-ish to move from RSA to ECC schemes, because ECC keys are significantly smaller (a 256-bit curve offers security roughly comparable to 3072-bit RSA) and you gain efficiency. The same isn't true of lattice schemes, which have significantly larger keys.

A switch to lattice or code-based crypto seems unlikely in the near future.


"it's just a matter of time before RSA offers zero security"

This is not the only possible outcome. Scalable quantum computing could turn out to be impractical due to cost. Or new physics could be found that rules out scalable quantum computing. Personally, I find both of these outcomes unlikely, the second more than the first, but they do exist.


I agree that I wouldn't treat the HAC as a "discovery" reference, for lack of a better term. Like you point out, there are modern-day concerns and constructions that are omitted. But I argue that if you are designing a protocol or choosing a primitive, you shouldn't have to consult a book to figure out which encryption mode to use or how to use a MAC. If you do, then you probably aren't qualified to be designing a protocol in the first place!

The HAC really excels as a specific, targeted reference for a cryptography-aware audience. For instance, if you're wanting a good, standard notation for something, the HAC is a great place to look. Or maybe you'd like to give a specific definition in a paper. (For instance, what is a precise definition of a MAC?) Then the HAC is a great place to look.

So, I agree that a designer shouldn't use the HAC as a sole reference, especially if they're unfamiliar with cryptography. This goes doubly so if what you're trying to do is estimate the security of a construction. But really, to know much about the cryptographic security of a scheme or construction, you really need to be up-to-date with the current research, not just looking at books. A book is a good first pass filter, if you will, but if you're judging security, you really want more.

> a designer could be forgiven for reading Menezes and not including explicit integrity checks at all.

Section 9.6.5 goes to great lengths to ensure the reader is fully aware that encryption doesn't provide integrity. Also, Menezes does present an (albeit brief) note on encrypt-then-MAC. I don't see the HAC as a teaching text, though. Anyway, I doubt there are many people who have "read" the HAC. It's a pretty thick book.

> It's not true that robust cryptography appreciates maturity. The ~twenty years since Menezes wrote HOAC haven't just been spent inventing whizz-bang new toys, but also in finding new ways to break the old toys. You're not always OK if you eschew the new stuff for the "mature" stuff; some of the mature tools are perilous to work with now.

I think robust cryptography absolutely appreciates maturity. But don't mistake me saying that for me saying that maturity is the only desired trait: far from it. scrypt was not recommended as highly as bcrypt for a few years after its release, and still one of the major selling points of bcrypt is "it's been around a while." And let's not forget the four-year period that the SHA3 competition spanned. Some minimum amount of maturity is required of all schemes that are considered secure.

So, maturity isn't the decision-making factor, no. But a construction that's been around for a long while, is widely studied, and has seen virtually no attacks is essentially the gold standard. See AES. On the other hand, I think you'd have a hard time arguing that a relatively new construction with virtually no attacks against it is equally impressive, even if it has some improvements over an older scheme.

But again, I agree entirely that "mature" isn't always better. But it's definitely a factor, although whether it's a large or small factor mostly depends on the circumstances.

> what parts they need to hold off on, like homomorphic encryption

If you mean fully homomorphic encryption, then sure. However, we already have practical partially homomorphic schemes which are pretty good. But this area especially requires a cryptographer's assistance in design, even more so than a typical "confidentiality/integrity" application.

---

Anyway, I guess ultimately, if you're just looking for a good text that's useful in learning cryptography, the HAC isn't the first place I'd look. But if you'd like an encyclopedic reference, it does a great job (IMO). If you're looking to build up a cryptography reference, though, it definitely needs to be supplemented with many other texts.

Edit: Also, another thing I find myself using the HAC for is to get a certain sort of historical context (regarding attacks and constructions) in an area before moving on to more advanced, more modern texts. Those more modern texts tend to leave out such details for brevity's sake, but I like knowing them all the same.


mkdir HoAC_tmp

cd HoAC_tmp

wget -r -np -l 1 -A pdf http://cacr.uwaterloo.ca/hac/ -nd

pdftk toc3.pdf chap1.pdf chap2.pdf chap3.pdf chap4.pdf chap5.pdf chap6.pdf chap7.pdf chap8.pdf chap9.pdf chap10.pdf chap11.pdf chap12.pdf chap13.pdf chap14.pdf chap15.pdf appendix.pdf references.pdf index.pdf cat output "../Handbook of Applied Cryptography.pdf"

cd ..



It's in most linux repos too.


mkdir HoAC_tmp && cd $_ && wget -r -np -l 1 -A pdf http://cacr.uwaterloo.ca/hac/ -nd && pdftk toc3.pdf chap1.pdf chap2.pdf chap3.pdf chap4.pdf chap5.pdf chap6.pdf chap7.pdf chap8.pdf chap9.pdf chap10.pdf chap11.pdf chap12.pdf chap13.pdf chap14.pdf chap15.pdf appendix.pdf references.pdf index.pdf cat output "../Handbook of Applied Cryptography.pdf" && cd .. && rm -r HoAC_tmp

This might be too much ... :)


Stapler is an alternative to pdftk with much lighter dependencies.

https://github.com/hellerbarde/stapler




