
On the Impending Crypto Monoculture (2016) - dankohn1
https://lwn.net/Articles/681616/
======
niftich
As the article says, Bernstein's stuff won out because his work is at the
intersection of solid crypto, clean and performant code, and sane API design.
This, coupled with the NIST curve debacle that shook public confidence, makes
for a compelling case.

Part of the problem is there is often a wide gulf between cryptographers and
the authors of libraries who then go on to use those algorithms. This is
perhaps more acute with approved-by-committee, standardized crypto because
library authors may have to implement it to be compatible, despite not being
experts in the particular algorithm.

In Bernstein's case, he ships (through NaCl) a reference implementation that
is good enough to be used in production, and packages complete constructs in
ways that are resistant to misuse. This is his innovation -- other libraries
tend to offer only the primitives and rely on the user to chain them together
in ways that make sense and never in the ways that don't.

There are other cryptographers that ship decent reference implementations --
the Keccak team being one -- but their work is largely restricted to sponges
and any construct that can be formed out of it (e.g. PRFs, stream ciphers,
hash functions). There is perhaps a shortage of people who look at holistic
cryptosystems and possess both the domain knowledge and the ideal attitude
towards misuse-resistant design.

~~~
zedred
> _As the article says, Bernstein's stuff won out because his work is at the
> intersection of solid crypto, clean and performant code, and sane API
> design._

As a casual observer, my impression has been pretty different. Here's an
excerpt from the README of curve25519-donna, which it seemed like everyone was
using for a while:

 _curve25519 is an elliptic curve, developed by Dan Bernstein, for fast
Diffie-Hellman key agreement. DJB's original implementation was written in a
language of his own devising called qhasm. The original qhasm source isn't
available, only the x86 32-bit assembly output.

Since many x86 systems are now 64-bit, and portability is important, this
project provides alternative implementations for other platforms._

My impression has always been that what we get from DJB is some wacky
implementation written in a language of his own devising, or just the 32-bit
assembler output of that, or some partial code fragment that has to be
disentangled from his benchmarking library, and the only thing that makes this
usable is the people who are motivated to do the work of making it digestible
by mortals.

~~~
tptacek
I'm not sure we need to litigate this, because it's not like John Viega and
David McGrew contributed the production versions of AES-GCM that everyone
uses.

More importantly: whatever you think of Bernstein's packaging, an area of
expertise he clearly shares with just a small subset of cryptographers is the
design of cryptographic primitives optimized for consumer compute hardware.
There's a reason his primitives tend to outperform the ones they supplant:
until relatively recently, Bernstein was the cryptographer who took this
challenge most seriously.

Finally: whatever you might think of things like qhasm, it's just a fact that
the only mainstream crypto library a majority of crypto engineers are
comfortable having generalist developers use is designed (in part) by
Bernstein. When you use libsodium, you're (usually) using programming
interfaces and constructions he designed.

~~~
bitexploder
It is also worth mentioning that it is all public domain.

He has gone to great lengths to ensure the algorithms are all side-channel
resistant. The breadth of his concern and the care behind the decision-making
are really impressive, and most users of his software only ever see the
visible tip of it all.

I will gladly put up with the idiosyncrasies to get all the benefits, compared
to the current stew of crypto primitives I see getting misused almost
constantly.

------
loup-vaillant
I get the point, but the current state of affairs encourages me to make it
even worse. I'm currently working on a TweetNaCl-like library that will
integrate not only ChaCha20, but also Blake2b and Argon2i, based on the same
ideas.

The repeated application of simple ARX-based rounds is a damn good idea: it's
efficient even for software implementations, it's often "naturally" constant
time, and it's _simple_. Implementing this stuff is easy: if it's Valgrind
clean and passes the test vectors, it probably works. It also makes audits and
reviews much easier.

Similarly, Wegman-Carter authenticators such as Poly1305 are kind of a no-
brainer: no forgeries are possible unless you break the cipher itself! Also,
fast. Modular arithmetic and potential limb overflows make them harder to get
right, but still.
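As a sketch of the Wegman-Carter idea, here is a toy Poly1305 in Python following the RFC 8439 description; Python's bignums hide exactly the limb-overflow issues that make real fixed-width implementations tricky:

```python
# Toy Poly1305 (RFC 8439): the message is evaluated as a polynomial in r
# over the prime field 2^130 - 5, then offset by the one-time value s.
# Illustrative only -- a real implementation must run in constant time.
P = (1 << 130) - 5

def poly1305_tag(key: bytes, msg: bytes) -> bytes:
    # r is clamped so that products stay within implementation-friendly bounds.
    r = int.from_bytes(key[:16], "little") & 0x0ffffffc0ffffffc0ffffffc0fffffff
    s = int.from_bytes(key[16:32], "little")
    acc = 0
    for i in range(0, len(msg), 16):
        block = msg[i:i + 16] + b"\x01"          # append the high "1" byte
        acc = (acc + int.from_bytes(block, "little")) * r % P
    return ((acc + s) & ((1 << 128) - 1)).to_bytes(16, "little")
```

This reproduces the test vector from RFC 8439 section 2.5.2, which is a good sanity check for any reimplementation.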

Finally, there is simply no escaping elliptic curves. I don't trust the NIST
curves. 25519 is not too complex, fast, and doesn't leave enough space in its
constants for a backdoor (DJB made sure the design constraints were tight).

I mostly agree with this article, but I don't mind a monoculture if it means I
get boring crypto.

~~~
tptacek
None of "simple", "fast", and "not backdoored" capture the real reason why
Curve25519 is so deservedly popular; the real issue is that Curve25519 is
misuse-resistant: random strings are, for 25519 ECDH, valid public keys, and
the arithmetic rules for the curve he chose permit straightforward constant-
time implementations.
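A related piece of the misuse-resistance, on the private-key side, is RFC 7748 clamping: any 32 random bytes become a valid X25519 secret key, so there is no rejection-sampling loop or validity check to get wrong. A minimal sketch:

```python
# X25519 scalar clamping per RFC 7748. Any 32 random bytes, clamped this
# way, are a valid private key; received public keys likewise need no
# separate validation step before the scalar multiplication.
import os

def clamp_scalar(k: bytes) -> bytes:
    b = bytearray(k)
    b[0] &= 248     # clear low 3 bits: force a multiple of the cofactor 8
    b[31] &= 127    # clear the top bit
    b[31] |= 64     # set bit 254: fixed scalar length, enabling a constant-time ladder
    return bytes(b)

priv = clamp_scalar(os.urandom(32))  # no loop, no failure case
```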

I haven't spoken to a cryptographer who believes the NIST curves are
backdoored (or, for that matter, in any of the elaborate hocus-pocus around
the Brainpool curves being untrustworthy). But the NIST curves are difficult
to use safely, which is why nobody likes them.

The NIST backdoor stuff is an argument that appeals more to message board
nerds than to practitioners.

(Is NIST itself trustworthy? Fuck no. The Dual_EC CSPRNG seems pretty clearly
to have been a backdoor at this point.)

~~~
throwawayish
I don't think the backdoor in the NIST curves, _if any_ , is that the curves
themselves are not secure, but that they are so hard to implement that it's
practically guaranteed that implementations have more-or-less critical issues
/ side channels.

~~~
tptacek
But that's pretty much true of all the curves of that vintage, right?

~~~
throwawayish
Yes. It's also hard to know whether "they" didn't try "hard enough" to get
"better" curves or whether "they" didn't know any better(†) or whether "they"
intentionally made these "bad" curves. That's why I don't want to call it a
backdoor.

(†) We know that _historically_ NSA had secret cryptography knowledge, the
most famous example being differential cryptanalysis and the subsequent
hardening of DES (although IBM also knew about that). Since it's a secret
agency we cannot know the current state of their secret knowledge; my gut
tells me though that since crypto has become a much more open/scientific
discipline in the last 10-15 years that they probably don't have much of that
anymore.

------
theandrewbailey
> What's more, the reference implementations of these algorithms also come
> from Dan Bernstein (again with help from others), leading to a never-before-
> seen crypto monoculture in which it's possible that the entire algorithm
> suite used by a security protocol, and the entire implementation of that
> suite, all originate from one person.

We've had crypto monoculture before. 15-20 years ago, we were using RSA, RC4,
and MD5. Ron Rivest had a large hand in all of them. They weren't bad for the
time, and the bad parts weren't bad because of common authorship.

~~~
tptacek
Let's not forget that the designer of Rijndael (AES) is also the designer of
Keccak (SHA-3).

RC4, by the way, was bad even for the time. It would be worth studying how it
managed to survive as long as it did, because flaws in RC4 were well known
long before the browser drama.

~~~
aidenn0
RC4 was a state-of-the-art stream cipher in 1987; that it took so long for a
practical attack to appear, despite its wide use, is a testament to the fact
that it was clearly not "bad even for the time".

~~~
tptacek
RC4 was a trade secret, unknown to the rest of the industry until it was
leaked on Usenet in the mid-90s.

------
tptacek
Reminder, because the term "monoculture" has a powerful negative valence for
those of us who remember Slashdot from the 1990s: Gutmann isn't arguing that
this is a dangerous development, but rather that it's sad that the rest of the
cryptographic research community hasn't given Bernstein a run for his money.

I suspect the situation will look different 10 years from now: CAESAR will
probably spit out a couple strong non-Bernstein AEADs, for instance. And
remember, we're still not using a Bernstein hash (though BLAKE traces back to
Salsa).

~~~
loup-vaillant
> _BLAKE traces back to Salsa_

Just reviewed some Blake2b; the round structure even integrates the ChaCha20
improvements: the quarter-round has the same structure (besides the two extra
parameters), and their application follows the same SIMD-friendly
column/diagonal rounds we see in ChaCha20.
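The lineage is visible in the code itself. Here is the BLAKE2b mixing function G from RFC 7693, which is essentially the ChaCha quarter-round skeleton widened to 64 bits, with the two extra message-word inputs x and y mixed in:

```python
# BLAKE2b mixing function G (RFC 7693): the same add/xor/rotate skeleton
# as the ChaCha quarter-round, on 64-bit words, with two message words
# (x, y) folded into the additions.
MASK64 = (1 << 64) - 1

def rotr64(w, n):
    # Rotate a 64-bit word right by n bits.
    return ((w >> n) | (w << (64 - n))) & MASK64

def G(a, b, c, d, x, y):
    a = (a + b + x) & MASK64; d = rotr64(d ^ a, 32)
    c = (c + d) & MASK64;     b = rotr64(b ^ c, 24)
    a = (a + b + y) & MASK64; d = rotr64(d ^ a, 16)
    c = (c + d) & MASK64;     b = rotr64(b ^ c, 63)
    return a, b, c, d
```

The rotation constants differ from ChaCha's, but the column/diagonal scheduling of G calls over the state matrix is the same idea.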

------
cshep
I have the fortune of working in a security group whose members are
recognised in their own right and have worked with Bernstein, Rogaway et al.

From my experience, too many cryptographers lack the applied skills (API
design, knowing the issues faced by developers, designing and implementing
performant crypto primitives). Conversely, too many applied folk lack the
crypto experience: knowing the state-of-the-art of elliptic curves, MPC,
lattices and so forth.

Bernstein has the experience to bridge both, which provides an enormous
advantage. JHU's Matthew Green is someone else who does both.

------
makmanalp
For those newbies like me wondering about AEADs:

[https://www.imperialviolet.org/2015/05/16/aeads.html](https://www.imperialviolet.org/2015/05/16/aeads.html)

~~~
cesarb
Another good one:

[https://blog.cryptographyengineering.com/2012/05/19/how-
to-c...](https://blog.cryptographyengineering.com/2012/05/19/how-to-choose-
authenticated-encryption/)

------
lstamour
FYI, discussion 10 months ago:
[https://news.ycombinator.com/item?id=11355742](https://news.ycombinator.com/item?id=11355742)

------
michaelt

      the single biggest, indeed killer failure 
      of the [GCM], the fact that if you for some
      reason fail to increment the counter, you're
      sending what's effectively plaintext (it's
      recoverable with a simple XOR).
    

Is this property particularly unique to GCM? I'd have thought all encryption
algorithms would fail if you didn't give them the right inputs?

~~~
geofft
The ways in which encryption algorithms fail with bad inputs vary widely.

One older and real-world example is DSA vs. RSA: if you use the same random
number twice when making DSA signatures, an attacker can recover the private
key. Or if you use different random numbers, but you don't keep them secret,
an attacker can also recover the private key. This is a real problem if you
happen to use, say, Debian when it had the OpenSSL RNG bug. RSA has no such
flaw; the attacker would need you to have made a series of much-more-unlikely
mistakes to recover a private key out of RSA.
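The algebra behind the nonce-reuse leak is simple enough to sketch with toy numbers (every constant below is illustrative, not a real DSA group):

```python
# Toy demonstration of the (EC)DSA nonce-reuse leak. Given two signatures
# (r, s1), (r, s2) over hashes h1, h2 made with the same nonce k, an
# attacker solves two linear equations mod q to recover the private key.
q = 2**31 - 1            # a prime modulus (toy size, not a real group order)
x = 123456789            # the "private key"
k = 987654321            # the reused nonce
r = 55555                # in real DSA r is derived from k; its value is irrelevant here
h1, h2 = 1111, 2222      # two distinct message hashes

inv = lambda a: pow(a, -1, q)           # modular inverse (Python 3.8+)
s1 = (h1 + x * r) * inv(k) % q          # DSA signing equation, message 1
s2 = (h2 + x * r) * inv(k) % q          # same k reused for message 2

# s1 - s2 = (h1 - h2) / k mod q, so the nonce falls out immediately:
k_rec = (h1 - h2) * inv(s1 - s2) % q
# and then s1 * k = h1 + x * r gives the private key:
x_rec = (s1 * k_rec - h1) * inv(r) % q
```

Two signatures, two linear equations, two unknowns (k and x): that is the entire attack.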

See also
[https://www.imperialviolet.org/2013/06/15/suddendeathentropy...](https://www.imperialviolet.org/2013/06/15/suddendeathentropy.html)

If you're wondering why GPG, OpenSSH, etc. started defaulting to RSA in the
past few years, that's why. (If you're wondering why they originally picked
DSA, patents.)

~~~
nly
Worth mentioning that ECDSA suffers the same problems as classic DSA if you
reuse a nonce (for different messages). This led to the extraction of a
PlayStation 3 signing key.

Why does DSA suck? Patents. It was designed to work around the patent on the
much simpler Schnorr signature scheme (whose modern descendants, like EdDSA,
sidestep nonce reuse by deterministically hashing the message content into
the nonce).

------
dvdhnt
> In adopting the Bernstein algorithm suite and its implementation,
> implementers have rejected both the highly brittle and failure-prone current
> algorithms and mechanisms and their equally brittle and failure-prone
> implementations.

Right for the throat.

------
cweimann
So djb writes crypto code that works and everybody switches to it, but it
isn't because his stuff is any good? Through the entire article he doesn't
list anything wrong with the djb crypto stuff, and ends up calling it
'brackish' and 'camel dung'? Really? It isn't because djb's stuff is good, it
is because everything else is crap? WTF?

~~~
__jal
Gutmann writes a bit colloquially, and the text was a message on a somewhat
informal mailing list. So keep that in mind regarding the tone.

The overall point of the post was not to blast DJB, but, well, to point out
the crypto monoculture, like it says on the tin. DJB's stuff _is_ good, which
is why it is popular. The problem Gutmann is highlighting (mostly) is that
nobody's competing with him.

That problem? Monocultures are high-risk. A bug or a cryptanalysis
breakthrough could render "everybody's" security broken at the same time. Or,
DJB is eaten by a bear; who supports that code now?

~~~
tptacek
If DJB is eaten by a bear, I think we'll manage to use Ed25519 and Chapoly
just fine.

Moreover, you can't cryptanalyze Daniel Bernstein himself to break Curve25519
or Chapoly, so he's not a single point of failure. :)

------
onlydnaq
To be honest, nonce reuse with Bernstein's authenticated encryption algorithms
will lead to the same problems as those the author points out with GCM (i.e.
plaintext recovery). However, the biggest issue with GCM isn't that the
plaintext leaks when reusing nonces, it's the fact that reusing nonces leads
to an attacker being able to forge arbitrary ciphertexts.

~~~
loup-vaillant
But… Poly1305 has the same "problem"…

~~~
tptacek
Gutmann's wording here is imprecise. GCM and Poly1305 are not comparably
brittle. Both have nonce misuse issues, but GCM has additional problems. See:

[https://news.ycombinator.com/item?id=13384762](https://news.ycombinator.com/item?id=13384762)

------
Ruud-v-A
> Compare this with old-fashioned CBC+HMAC (applied in the correct EtM
> manner), in which you can arbitrarily misuse the IV (for example you can
> forget to apply it completely) and the worst that can happen is that you
> drop back to ECB mode, which isn't perfect but still a long way from the
> total failure that you get with GCM.

It is not. As Dan Boneh stresses in his cryptography course, a cryptosystem is
either secure or “terribly, terribly, insecure”.

------
gonzo
> "This isn't just theoretical, it actually happened to Colin Percival, a very
> experienced crypto developer, in his backup program tarsnap."

Colin was using AES-CTR+HMAC rather than GCM, but there was a nonce-reuse bug
six years ago:

[http://www.daemonology.net/blog/2011-01-18-tarsnap-
critical-...](http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-
security-bug.html)

------
TazeTSchnitzel
(2016)

------
tempz
TLAs will love this. Only one stack to break and/or subvert.

------
khana
Why personify crypto? Let the math alone be the driver in this domain. What is
solid and true will eventually trickle up.

~~~
geofft
> _Let the math alone be the driver in this domain. What is solid and true
> will eventually trickle up._

Unless you have some way of proving that one-way functions exist, which would
imply as a consequence P ≠ NP, this is literally untrue (unless by
"eventually" you mean "one day centuries hence, maybe we'll prove P ≠ NP").
There is not a single piece of crypto that can be proven correct today; all
the crypto we use is safe only because mathematicians have so far been unable
to find a practical way to break it.

This requires mathematicians to be _actively trying_ to break the crypto, so
personalities matter because you can't get a worldwide community of experts to
study every random person's algorithm. This also requires mathematicians to
have a sense of which ways a particular algorithm might be fragile, even if
they can't break it quite yet, which requires some non-mathematically-provable
trust in the cryptanalyst's good sense.

And, of course, if you believe that a certain cryptographer might be inserting
back doors or prone to approving crypto that has back doors (see for instance
Dual_EC_DRBG), you might distrust other output from that same cryptographer
even if you can't prove anything yet.

~~~
lederhosen
Not only is the one-time pad proven secure, it is also obviously so, without
any knowledge of math.

~~~
geofft
It is not secure: it doesn't provide authentication (integrity protection) and
it isn't nonce-reuse-resistant.

~~~
loup-vaillant
Integrity can be provided with a Wegman-Carter scheme, which comes with a
security reduction. The combination of one-time pad + Poly1305, for instance,
is provably secure. It's just _impractical_, and you have to rely on your
ChaCha20 random number generator from /dev/urandom anyway.

Also good luck finding a nonce-reuse resistant algorithm.

~~~
geofft
Does the correctness of Poly1305 not depend on P ≠ NP? I guess this boils down
to whether the correctness of AES depends on P ≠ NP, which it might not since
it's symmetric?

The failure modes of reusing a nonce are never good, but some are far worse
than others. The one-time pad's failure mode is particularly disastrous; it
leaks the XOR of two plaintexts. There are a good number of algorithms that
only leak whether the same message was repeated.
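The two-time-pad failure mode is easy to demonstrate: XOR the two ciphertexts and the pad cancels out, leaking the XOR of the plaintexts with no key material involved at all:

```python
# "Two-time pad" demonstration: reusing a keystream lets an eavesdropper
# cancel it out and learn the XOR of the two plaintexts.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"attack at dawn!!"
p2 = b"retreat by noon!"
pad = os.urandom(16)                 # the "one-time" pad, mistakenly used twice
c1, c2 = xor(p1, pad), xor(p2, pad)

leak = xor(c1, c2)                   # equals xor(p1, p2): the pad has cancelled
```

With English plaintexts, `xor(p1, p2)` is enough for classical crib-dragging to recover both messages; and if the attacker knows either plaintext, the other falls out immediately.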

~~~
loup-vaillant
> _Does the correctness of Poly1305 not depend on P ≠ NP?_

Not that I know of. As far as I know the security reductions are
unconditional.

Even worse than revealing the XOR of two plaintexts is the key recovery
enabled by GCM (source: other comments in this thread).

Nevertheless, it doesn't matter. Avoiding nonce reuse is easy. And if you're
really scared, just use a random nonce from a big enough space (192 bits, as
in XChaCha20, is good). That way, if you have access to a random source, you
can turn any encryption algorithm into a nonce-less algorithm.
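A quick back-of-the-envelope check on why 192 bits is enough: by the birthday bound, the collision probability for n random b-bit nonces is roughly n²/2^(b+1), which stays negligible even for absurd message counts:

```python
# Birthday-bound estimate for random nonces: for n messages under b-bit
# nonces, collision probability is approximately n^2 / 2^(b+1).
from fractions import Fraction

def collision_bound(n_messages: int, nonce_bits: int) -> Fraction:
    return Fraction(n_messages * n_messages, 2 ** (nonce_bits + 1))

# 2^64 messages under 192-bit nonces (XChaCha20-sized): about 2^-65.
p = collision_bound(2**64, 192)
```

Compare a 64-bit or 96-bit nonce space, where the same bound makes random nonces genuinely risky at high message volumes; that is the whole argument for extended-nonce constructions.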

