
Why EdDSA held up better than ECDSA against Minerva - dochtman
https://blog.cr.yp.to/20191024-eddsa.html
======
nly
DJB has spearheaded bringing practical, safe cryptography to software.

Does anyone else bridge academia and ruthlessly practical software engineering
quite as well?

~~~
segfaultbuserr
Half the Internet is now protected by djb's secure, robust, practical, high-
performance algorithms (Curve25519, Ed25519, ChaCha20, Poly1305). That's not
an overstatement, and it's a decisive victory in the Crypto Wars.

What I'm worried about now is the post-quantum future.

You see, elliptic curve cryptography was standardized around 1999 (the NIST
curves), but it took about 10 years for a robust, secure, and practical
algorithm (Curve25519) to be developed in 2007, and another 10 years for the
public to accept and deploy it in major systems, around 2017. [0] Now the
threat of quantum computers is getting closer and closer; cryptographers are
well aware of this, and post-quantum cryptography is a huge area of research
nowadays (just follow the NIST competition).

If history repeats itself, we'll have some usable post-quantum public-key
algorithms soon, which will be secure in a theoretical sense but full of
pitfalls in practical systems, and it will take another 20 years to deploy
something really secure and robust in a practical sense. And all the horror
stories of weak crypto, all the vulnerabilities, the NSA, etc., will repeat
over the next two or three decades!

[0] I'm not advertising for djb; if it weren't him, it would have been
someone else, but the point is...

~~~
pnako
How much of this stuff was validated through public competitions like for AES,
etc.?

I have no doubt he's a stellar cryptographer but it looks to me like there is
also some fanboyism from programmers with no particular crypto background.

~~~
dangerface
> How much of this stuff was validated through public competitions like for
> AES, etc.?

Curve25519, Ed25519, ChaCha20, Poly1305 have all been entered into
competitions like AES and usually get to the final round which is usually more
about politics than security.

The algorithms have also been validated by independent parties like Google,
who were among the first to review, use, and promote them.

> it looks to me like there is also some fanboyism from programmers with no
> particular crypto background.

Absolutely, but we fanboy over him because his work has been so widely
reviewed, with stellar results.

~~~
floodyberry-
None of them have been in any competition like AES

------
3xblah
"Perhaps there haven't been enough tutorials showing how to implement
constant-time fixed-base scalar multiplication."

~~~
loup-vaillant
I have written one here: [http://loup-vaillant.fr/tutorials/fast-scalarmult](http://loup-vaillant.fr/tutorials/fast-scalarmult)

It explains various constant-time and variable-time ways to do scalar
multiplication (assuming constant-time point addition), and clearly marks
what is constant time (like combs) and what is variable time (like sliding
windows).

One big missing part is modular multiplication, needed to perform constant-
time point addition. I explain some of it in my Poly1305 tutorial (search for
"Cheating at modular arithmetic"): [http://loup-vaillant.fr/tutorials/poly1305-design](http://loup-vaillant.fr/tutorials/poly1305-design)

------
floatingatoll
21 days ago | Minerva: Practically exploitable side-channel leakage in ECDSA
implementations (muni.cz)
[https://news.ycombinator.com/item?id=21147865](https://news.ycombinator.com/item?id=21147865)

------
MrXOR
djb: "I'm not being selective in my summary."

From Minerva research team's web page[1]:

Update: The EdDSA scalar multiplication code in libgcrypt was leaking, however
due to the way it was used, it was likely "not exploitable". It did not reduce
the scalar which was a SHA512 digest by the curve order, but used the digest
directly, thus the leakage did not represent the bit-length of the reduced
scalar. Thanks to Daniel J. Bernstein for the note.

[1] [https://minerva.crocs.fi.muni.cz/](https://minerva.crocs.fi.muni.cz/)
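The mechanism described in that update can be illustrated numerically. A
rough sketch, assuming the side channel reveals only the nonce's bit length:
because Ed25519's nonce is a full 512-bit SHA-512 output used without
reduction, its bit length is almost always 505-512 and carries essentially
no information about the nonce mod the group order:

```python
import hashlib, os

# Ed25519 group order L (a 253-bit number).
L = 2**252 + 27742317777372353535851937790883648493

bitlens = []
for _ in range(1000):
    # The nonce is the raw 64-byte SHA-512 digest, used unreduced.
    k = int.from_bytes(hashlib.sha512(os.urandom(32)).digest(), "little")
    bitlens.append(k.bit_length())   # all a bit-length leak reveals

# Nearly every nonce is 505-512 bits, regardless of k % L, so learning
# bitlen(k) tells the attacker nothing useful about the reduced scalar.
assert max(bitlens) == 512 and min(bitlens) > 490
```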

~~~
segfaultbuserr
The Curve25519/Ed25519 code in _libgcrypt_ is not a fully constant-time
implementation, which greatly decreased the inherent security provided by
EdDSA's design, and it already had timing problems in the past: CVE-2017-0379
was a timing attack against GnuPG's Curve25519, in which it was possible to
inject malicious input with invalid curve points, observe the timings, and
recover the private key "in as few as 11 attempts".

In other words, it was used in a way that djb never approved, so I don't
think djb should be held responsible for that. And it's not that the
libgcrypt developers were stupid; due to how libgcrypt is architected, how
its bignums are implemented, portability concerns, and legacy code, there
were many difficulties in implementing fully constant-time code in libgcrypt.

In 2017, one developer said,

> dd9jn: Implementing constant time processing in a _portable_ way is really
> hard up to impossible. DJB steps most problems aside by writing curve
> specific code for certain CPUs. We are in the process of improving our code
> for commonly used curves by replicating the reference code. Using the
> reference code directly is not possible due to different ways of
> representing big integers and the fact that the reference code has zero
> comments. For the Gnuk token (hardware OpenPGP card) Gniibe (author) is even
> considering to move to a different MCU to have better control over the
> pipeline.

In the article, djb had an analysis as well,

> The real fix, the constant-time approach, would start by changing the
> interface to replace mpi_get_nbits(k) with a maxscalarbits specified by the
> caller. But this would require going through dozens of functions that call
> _gcry_mpi_ec_mul_point and figuring out the appropriate maxscalarbits for
> each. This is an example of tension between simplicity and security.

> There have been many other timing-attack vulnerabilities in libgcrypt, and
> clearly there will be more. We have to throw away variable-time crypto code
> without waiting for attacks to be demonstrated. For example, libgcrypt
> should use the constant-time ladders supported by Curve25519. But this
> doesn't mean that Minerva broke libgcrypt's Ed25519 implementation:
> libgcrypt was saved by another Ed25519 feature, the double-size hash.
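The interface change djb sketches can be shown in miniature. A hedged Python
sketch (function names are hypothetical, integer doubling and addition stand
in for point operations, and a real fix would also need branchless
selection, which Python cannot express):

```python
# The leaky pattern: the loop count depends on bitlen(k), so the running
# time reveals the bit length of the secret scalar.
def scalarmult_leaky(k: int, p: int) -> int:
    acc = 0
    for i in reversed(range(k.bit_length())):  # iteration count leaks
        acc = acc + acc
        if (k >> i) & 1:
            acc += p
    return acc

# The fix djb describes: the caller supplies maxscalarbits, so the loop
# always runs the same number of iterations for any scalar.
def scalarmult_fixed(k: int, p: int, maxscalarbits: int) -> int:
    acc = 0
    for i in reversed(range(maxscalarbits)):   # fixed iteration count
        acc = acc + acc
        if (k >> i) & 1:                       # still needs a masked select in C
            acc += p
    return acc

assert scalarmult_leaky(13, 7) == scalarmult_fixed(13, 7, 256) == 91
```

As the quote notes, the hard part in libgcrypt is not this loop but plumbing
an appropriate maxscalarbits through the dozens of callers of
_gcry_mpi_ec_mul_point.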

~~~
antpls
It looks like cryptography would benefit from a dedicated programming
language, aimed at verification and with other constraints such as "constant
time" baked into the syntax of the language.

At least, maybe it would ease code review and lower the barrier to entry for
cryptography.
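As a taste of what such a language would enforce, here are the branchless
idioms constant-time code is built from, sketched in Python (which, to be
clear, makes no timing guarantees itself; the helper names are hypothetical
and only illustrate the code shape):

```python
# Branchless selection: no `if` on the secret condition bit.
def ct_select(cond_bit: int, a: int, b: int, width: int = 64) -> int:
    """Return a if cond_bit == 1 else b, without branching on cond_bit."""
    full = (1 << width) - 1
    mask = (-cond_bit) & full        # all-ones if 1, all-zeros if 0
    return (a & mask) | (b & ~mask & full)

# Branchless equality: fold x ^ y into a single 0/1 result.
def ct_eq(x: int, y: int, width: int = 64) -> int:
    """1 if x == y else 0, with no data-dependent branch."""
    diff = (x ^ y) & ((1 << width) - 1)
    # diff == 0 makes (diff - 1) negative, so its high bits are all ones.
    return ((diff - 1) >> width) & 1

assert ct_select(1, 10, 20) == 10
assert ct_eq(5, 5) == 1
```

A language aimed at this could mark secrets in the type system and reject
any branch or table index that depends on them, which is exactly what makes
these idioms hard to get right by hand in C.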

------
jfindley
I particularly enjoyed the section where he talks about the importance of
distributions for security: how the product of two independent random primes
from {2^1023, ..., 2^1024} is harder to factor than the product of two from
{1, ..., 2^1024}, despite the latter space having far more possibilities
available. His explanation of this is concise and clear, and while the point
is fairly simple once understood, it's a great example of how counter-
intuitive this stuff can be.
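The intuition can be made concrete with a toy example (16-bit numbers
standing in for 1024-bit ones, and trial division standing in for cheap
attacks like ECM that find small factors; the helper name is made up):

```python
# If both primes come from the top of the range, there is no small factor
# to find; if they may come from anywhere in {1, ..., 2^n}, one factor can
# be tiny and the product falls to the cheapest attack there is.

def smallest_factor(n: int, bound: int):
    """Trial-divide n by 2..bound; returns a factor or None."""
    d = 2
    while d <= bound and d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return None

balanced = 64499 * 64513   # both primes near 2^16
lopsided = 251 * 64513     # one tiny prime

assert smallest_factor(balanced, 10_000) is None   # cheap attack fails
assert smallest_factor(lopsided, 10_000) == 251    # factored instantly
```

This is the sense in which the smaller set of balanced semiprimes is harder
to factor than the larger set that includes lopsided ones.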

~~~
tialaramex
You also have to be careful here, because such unintuitive factors plus a
desire for optimisation can lead to grave security holes, and DJB links to
such a thing in that section. ROCA (Return of Coppersmith's Attack) is a
weakness in which Infineon's hardware library RSAlib chose numbers that were
much more likely to be prime, so that (as I understand it) key generation
would be nice and fast, spending less time failing primality tests and
trying again.

Unfortunately, it uses very particular smooth numbers (2 x 3 x 5 x 7 x 11
and so on) to pick keys for each key size, so an attacker who knows this
essentially gets some bits of the private key for free, whereupon
Coppersmith's method may turn that into a working attack.

The people who found ROCA started by examining the key pairs produced by
different generators, because they were interested in whether you can
diagnose who made a key from what you see of the public key. They wrote a
paper showing that, e.g., sometimes you'd get a certificate with a
particular 2048-bit RSA key in it and could tell immediately, oh, that's
from OpenSSL, even though the cert is deployed on a Windows server with IIS,
which is interesting. For example, say you pick random 1024-bit integers and
multiply them together, but you insist on trying again if you get anything
other than a 2048-bit result. This will be annoyingly slow, since it often
fails, so of course you optimise by picking only a specific range of
integers where a 2048-bit result seems likely. Exactly how you do this may
give away _at least_ exactly which implementation was used to make the keys,
and as ROCA shows, maybe much more.
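That optimisation, and the fingerprint it leaves, can be sketched in a few
lines (function names are made up, and random odd integers stand in for
primes; this is a shape illustration, not Infineon's actual algorithm):

```python
import random

def naive_pair(bits=1024):
    """Rejection sampling: retry until the product is exactly 2*bits long.
    Slow, because the product of two uniform bits-long integers often
    comes out one bit short."""
    while True:
        p = random.getrandbits(bits) | 1
        q = random.getrandbits(bits) | 1
        if (p * q).bit_length() == 2 * bits:
            return p, q

def restricted_pair(bits=1024):
    """The optimisation: force the top two bits of each factor, so both
    are >= 3 * 2^(bits-2) and the product is always exactly 2*bits long.
    No retries -- but the factors now come from a narrower, recognisable
    distribution."""
    top2 = 0b11 << (bits - 2)
    p = random.getrandbits(bits - 2) | top2 | 1
    q = random.getrandbits(bits - 2) | top2 | 1
    return p, q
```

Any such range restriction is exactly the kind of implementation tell the
ROCA researchers were cataloguing, and RSAlib's far more aggressive version
of it is what made the keys breakable.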

One of the clearly better things about public-key cryptosystems whose keys
are arbitrary random integers, rather than primes/semiprimes as in RSA, is
that there's less temptation to build very complicated contraptions like
RSAlib, which may conceal this sort of terrible flaw.

