Hacker News

> Why are ED25519 keys better than RSA

Two reasons: 1) they are a lot shorter for the same level of security, and 2) any random number can be an Ed25519 key. To generate an RSA key you have to generate two large random primes, and the code that does this is complicated and so can more easily be (and in the past has been) compromised to generate weak keys.
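To make the "any random number can be a key" point concrete, here is a minimal Python sketch of what RFC 8032 does with an Ed25519 seed: the seed is hashed with SHA-512 and the resulting scalar is "clamped", so no structure at all is required of the 32 input bytes (unlike RSA, which needs primes).

```python
import secrets
import hashlib

# Any 32 random bytes is a valid Ed25519 private key (seed).
seed = secrets.token_bytes(32)

# Per RFC 8032, the signing scalar is derived by hashing the seed
# and "clamping" the first 32 bytes of the digest -- the seed
# itself needs no special structure.
digest = hashlib.sha512(seed).digest()
scalar = bytearray(digest[:32])
scalar[0] &= 0b11111000   # clear low 3 bits (multiple of the cofactor 8)
scalar[31] &= 0b01111111  # clear the top bit
scalar[31] |= 0b01000000  # set bit 254, fixing the scalar's magnitude
```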

If I understood it correctly, you're saying that RSA requires the two numbers to be big AND random, otherwise the algorithm isn't strong? Therefore Ed25519 is better because it's strong regardless of the key?

What I don't get, then, is how a short key can be secure; that goes against what I was taught in college. Aren't shorter keys more prone to collisions and brute-force attacks?

> Aren't shorter keys more prone to collisions and brute-force attacks?

Given the same cipher, more or less, yes. Across different ciphers, though, raw key length matters less, and the structural differences between the ciphers matter more.

EDIT: Think of it in terms of Shannon Entropy: because RSA requires a pair of primes, the keyspace is so much sparser — that is to say, more "predictable" (if, granted, at a mostly theoretical level) — so keys need to be that much larger to be secure.

Contrarily, with ED25519, keys can be smaller, because the keyspace is denser.

(Or am I just talking out my ass here?)

EDIT 2: s/smaller/sparser/, s/bigger/denser/, regarding keyspaces. Thanks, 'lisper!
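The "sparser" point can be quantified with the prime number theorem: the density of primes near N is about 1/ln(N), so only roughly one in 710 random 1024-bit integers (the size of one RSA-2048 factor) is prime. A back-of-the-envelope sketch:

```python
import math

# By the prime number theorem, the density of primes near N is ~1/ln(N).
# For 1024-bit candidates (one factor of an RSA-2048 modulus):
bits = 1024
density = 1 / (bits * math.log(2))   # ln(2^1024) = 1024 * ln 2
print(f"~1 in {round(1 / density)} {bits}-bit integers is prime")
```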

Not bigger -- denser.

Great replies, I got it now, it makes sense. Thanks to both of you!

RSA requires two numbers which are big and random and prime. It is the primality that makes things tricky. Generating random primes is a lot harder than generating random numbers.
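A minimal sketch of the usual approach, assuming the standard Miller-Rabin probabilistic test (function names here are illustrative, not from any particular library): sample random odd candidates of the right size and keep testing until one passes.

```python
import secrets

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2       # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                       # composite witness found
    return True

def random_prime(bits):
    """Sample random odd candidates with the top bit set until one passes."""
    while True:
        cand = secrets.randbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(cand):
            return cand
```

Every step here is a place real implementations have gone wrong: a biased candidate sampler, too few Miller-Rabin rounds, or a weak entropy source all silently produce weak keys.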

The other factor (no pun intended) that makes RSA keys large is that there are more efficient algorithms for factoring than there are for solving the elliptic curve discrete log problem, e.g. the General Number Field Sieve (https://en.wikipedia.org/wiki/General_number_field_sieve). If you crunch the numbers on this you will find that a 2000-bit RSA key has a security level of about 100 bits, i.e. it takes about 2^100 operations to factor a 2000-bit RSA key using GNFS.
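The ballpark can be reproduced from the GNFS asymptotic running time L_n[1/3, (64/9)^(1/3)]. A rough sketch, ignoring the o(1) term, so the resulting constant is only approximate (it lands a bit above 100 for a 2000-bit modulus):

```python
import math

def gnfs_security_bits(modulus_bits):
    """Rough GNFS work estimate, in bits, for factoring an RSA modulus.

    Uses exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)),
    ignoring the o(1) term, so treat the result as a ballpark only.
    """
    ln_n = modulus_bits * math.log(2)
    work = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return work / math.log(2)  # convert the e-exponent to a bit count

print(round(gnfs_security_bits(2000)))  # roughly in the 100-120 bit range
```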

Not disagreeing, but I think randomness and primality testing both have the problem that it's so easy to do them poorly. Generating random primes of these sizes isn't all that difficult, and even primality proofs can be done in reasonable time frames (e.g. under 10 seconds for 1024-bit inputs). There are also a couple of random proven-prime algorithms which run pretty fast. Getting software to correctly implement everything... that seems to be hard.

> Getting software to correctly implement everything... that seems to be hard.

Exactly. Generating random primes is not terribly difficult in theory, but in practice it is very tricky, which makes it hard to answer the question: how do you know you can trust your keys? Sure, you can verify that your primes are prime, but how do you know how much entropy they have? The only way to figure that out is to audit the code. (And then you have the problem of making sure that the code you're running is the code you audited.)

Generating random numbers is also tricky, but a lot less so than generating random primes: take an entropy source and run it through a whitener, e.g. feed it to SHA-512. As long as you have a reliable lower-bound estimate of the quality of your entropy source, you're good. A lot fewer moving parts.
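The whitening step described above is tiny in code, which is the point: here `os.urandom` stands in for a raw (possibly biased) hardware entropy source, and the hash spreads whatever entropy it has across the output.

```python
import os
import hashlib

# Stand-in for a raw, possibly biased hardware entropy source.
raw_sample = os.urandom(128)

# Whitening: hash the raw sample so any bias or structure in the
# source is diffused across the output. If the source really carries
# >= 512 bits of entropy, the digest is 64 bytes of full-entropy
# key material.
key_material = hashlib.sha512(raw_sample).digest()
```

Contrast this with the prime-generation pipeline above: one hash call versus candidate sampling, sieving, and repeated primality tests.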
