
Cofactor Explained: Clearing Elliptic Curves' dirty little secret - loup-vaillant
http://loup-vaillant.fr/tutorials/cofactor
======
shinigami
It's sad that efficient complete formulas for Weierstrass curves were found
only after Curve25519 was well established. Now we are stuck with all these
cofactor issues. Ristretto is nice but so terribly complex:
[https://ristretto.group/details/isogenies.html](https://ristretto.group/details/isogenies.html)

~~~
beefhash
> _efficient complete formulas for Weierstrass curves were found only after
> Curve25519 was well established_

Assuming you are talking about the Renes–Costello–Batina formulas, they're
complete, but not necessarily efficient. According to [1], an optimized short
Weierstrass implementation using the complete formulas is still 1.5 to 3 times
slower than Curve25519. I imagine the numbers won't be much better for
Edwards25519, either. There's definitely still a lot of potential left for a
better complete addition formula on Weierstrass curves.

> _Ristretto is nice but so terribly complex_

Ristretto is nice, terribly complex, and you don't actually need to care about
the _conceptual_ complexity. As an implementer, your _only_ job is to execute
the explicit formulas in section 5 of the Ristretto website. You do not have
to be able to follow the hard math (just as you do not have to be able to
follow the hard math involved in deriving the explicit formulas). Plus the
entire thing can be trivially constant-time given a constant-time selection
primitive and constant-time field arithmetic. It's not that much more
difficult than doing regular point compression on your own.
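To illustrate the kind of primitive those formulas assume, here is a hedged sketch of a constant-time conditional select over a 32-byte field element. This is my own illustration, not code from the Ristretto spec; the function name `ct_select` is hypothetical.

```c
#include <stdint.h>

/* Hypothetical sketch: constant-time select between two 32-byte field
 * elements. `cond` must be 0 or 1. The Ristretto explicit formulas assume
 * a primitive like this plus constant-time field arithmetic. */
static void ct_select(uint8_t out[32], const uint8_t a[32],
                      const uint8_t b[32], uint8_t cond)
{
    /* Turn cond (0 or 1) into a mask of 0x00 or 0xFF without branching. */
    uint8_t mask = (uint8_t)(-(int8_t)cond);
    for (int i = 0; i < 32; i++)
        out[i] = b[i] ^ (mask & (a[i] ^ b[i])); /* cond ? a[i] : b[i] */
}
```

The XOR-and-mask trick avoids any secret-dependent branch, so the same instructions execute regardless of `cond`.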

[1] Peter Schwabe, Daan Sprenkels. The complete cost of cofactor h=1
(published at INDOCRYPT 2019),
[https://eprint.iacr.org/2019/1166.pdf](https://eprint.iacr.org/2019/1166.pdf)

~~~
cryptbe
>Ristretto is nice, terribly complex, and you don't actually need to care
about the conceptual complexity. As an implementer, your only job is to
execute the explicit formulas in section 5 of the Ristretto website. You do
not have to be able to follow the hard math (just as you do not have to be
able to follow the hard math involved in deriving the explicit formulas).

I don't think one should blindly follow instructions without understanding
why in any field, let alone in crypto, where a small, subtle difference can
make or break it. Also, understanding crypto requires less math than inventing
(and attacking) crypto, so it takes some effort, but it's doable even for
hobbyists. If the math makes one uncomfortable, maybe one shouldn't try to
roll their own crypto for production use in the first place.

Case in point: the author of this article that we're commenting on made a
deadly mistake because they did not understand the math behind point
conversion between Ed25519 and Curve25519 [1].

Below I also point out a mistake in their claim about malleability in EdDSA.

[1]
[https://www.reddit.com/r/crypto/comments/8toywt/critical_vul...](https://www.reddit.com/r/crypto/comments/8toywt/critical_vulnerability_in_monocypher_full/)

~~~
shinigami
That's a good example of how a "SafeCurve" caused a vulnerability that
wouldn't exist on a Weierstrass curve.

But many smart people have made such mistakes in the past. If we gatekeep it
too much, then we won't have anyone left to implement crypto.

~~~
beefhash
Maybe we _should_ gatekeep it so much, though. As long as there exist at least
two people capable of implementing it _per programming language_ (one to
implement, another to audit), there will only ever be one single, canonical
implementation, and there's no way around it. It is not, and should not be, an
inherent right to be allowed to implement cryptography (that is put into
production or made publicly available). The gatekeeping is there for a reason,
and it's important that we uphold it. Fewer implementations mean that more
people can focus on writing and checking less code overall. Patents could be
used to help with this by permitting only one upstream implementation to
exist, but that's not how they end up being used in practice, and that's
ignoring the fact that patent terms are impractically short (especially
compared to copyright terms).

~~~
shinigami
That simply does not work in the real world. Also, why does this only apply
to crypto? An RCE vuln can have a much larger impact than mishandling
cofactors. Should we have canonical implementations of every piece of software
imaginable?

~~~
naniwaduni
we should leave the task of left-padding a string to a popular, no doubt well-
tested library

------
riobard
Could you please describe the steps to generate key pairs with a random
public part using Ristretto?

------
cryptonector
Er, TFA's description of EdDSA is incorrect. There are no nonces in EdDSA.

~~~
beefhash
It's a _simplified_ description of EdDSA (and as much is said in the article),
replacing the deterministic nonce derivation with a random nonce, effectively
presenting key-prefixed Schnorr signatures instead.

------
ljhsiung
Shouldn't it be

> 1 = 12 + 33 = 4.3 + 11.3

instead of

> 1 = 12 + 33 = 1.3 + 11.3

Neat read :) learned a bit!

~~~
loup-vaillant
Oops, my bad. Correcting now, thanks.

------
cryptbe
tl;dr: the conclusion of the EdDSA part of the article is wrong. EdDSA has its
problems, but malleability is not one of them, if one is following RFC 8032.

>There are several ways to sign a document with EdDSA, and produce a valid
signature. The three sources of malleability are:

>We can add a multiple of L (the order of the prime subgroup) to s. Recall
that B has order L as well, so it will absorb any cofactor. Basically
everything happens modulo L, so adding L won't change a thing.

This is prevented in RFC 8032 by checking that 0 <= s < L. Tink [1] does this
check, and therefore is malleability free.
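For illustration, that range check can be sketched as a byte-wise comparison of the 32 little-endian bytes of s against L. This is my own hedged sketch, not Tink's or RFC 8032's code; the function name is made up, and the early returns are for clarity (production verifiers often do this branch-free):

```c
#include <stdint.h>

/* Sketch of the RFC 8032 check 0 <= s < L, where
 * L = 2^252 + 27742317777372353535851937790883648493.
 * s is the second half of an Ed25519 signature, 32 bytes, little-endian.
 * Returns 1 if s is canonical (in range), 0 otherwise. */
static int s_is_canonical(const uint8_t s[32])
{
    static const uint8_t L[32] = {
        0xed, 0xd3, 0xf5, 0x5c, 0x1a, 0x63, 0x12, 0x58,
        0xd6, 0x9c, 0xf7, 0xa2, 0xde, 0xf9, 0xde, 0x14,
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x10,
    };
    /* Scan from the most significant byte down; first difference decides. */
    for (int i = 31; i >= 0; i--) {
        if (s[i] < L[i]) return 1;
        if (s[i] > L[i]) return 0;
    }
    return 0; /* s == L is out of range */
}
```

A verifier that rejects signatures failing this check closes the "add a multiple of L to s" malleability avenue.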

>We can sign the same message with a different nonce r. This requires
knowledge of the secret key a.

This proves that the signature of a message is not unique -- that is, the
signer can produce multiple signatures -- but I haven't seen anyone call
this a malleability issue. Malleability is about taking a triple (public key,
signature, message) and tweaking bits to produce another valid triple.

>We can add a low order point to A, and subtract it from R. That way we
produce a valid signature, but from a public key nobody vouches for. If the
verifier checks the weaker equation, we can add a low order point to just R,
and produce a "valid" signature with the same public key.

This is the second time I've seen this claim. When I first saw it [2], I
thought: wait, this test is missing in Wycheproof [3]. But Bleichenbacher does
NOT miss anything, so I knew the claim had to be wrong.

Modifying A or R instantly makes the signature invalid. Using the article's
notation, the signature is validated by checking that

B.s.8 == R.8 + A.h.8, where h = SHA-512(R, A, M) and M is the message

Multiplying by the cofactor or not doesn't matter and doesn't introduce any
weakness, contrary to what the article implies. If R or A is changed, h will
change.

[1] [https://github.com/google/tink](https://github.com/google/tink)

[2] [https://github.com/dalek-cryptography/ed25519-dalek/issues/20](https://github.com/dalek-cryptography/ed25519-dalek/issues/20)

[3]
[https://github.com/google/wycheproof](https://github.com/google/wycheproof).
Wycheproof has tests for EdDSA malleability, but they only cover the case
where s is not properly range checked.

~~~
shinigami
> malleability is not one of them, if one is following RFC 8032

This is like claiming Weierstrass curves don't have any problems if you follow
the NIST/SECG standards. The whole point of the "SafeCurves" is to make them
easier to get right, but you can still get them wrong.

~~~
cryptbe
If one implements EdDSA, but does not follow RFC 8032, one is doing it wrong
on multiple levels.

~~~
loup-vaillant
Daniel J. Bernstein et al.'s original paper (High-Speed High-Security
Signatures) didn't feel the need to take as many precautions as RFC 8032. For
instance, they didn't care about malleability.

You seem to think they should have. May I ask why?

~~~
tptacek
I'm interested in Thai's response to this question too, as I would be in any
comment anyone managed to solicit from him about this topic, but an easy point
to make here is that Bernstein was himself involved in RFC 8032, at least as a
reviewer and contributor to the process, as you can quickly learn by reading
the CFRG mailing list.

~~~
loup-vaillant
Oh, so DJB changed his mind then? Makes sense considering the application of
EdDSA beyond signatures (I've heard malleability is a problem with zero-
knowledge proofs, but I haven't studied that subject).

