Schnorr confirms paper is his, claims it “destroys RSA cryptosystem” (twitter.com/fredericjacobs)
236 points by EthanHeilman 73 days ago | hide | past | favorite | 72 comments



In the 2020 preprint:

"These new algorithms factor integers N ≈ 2^400 and N ≈ 2^800 using 7x10^10 and 4.3x10^12 arithmetic operations".

He does not specify how many it uses for 2^1024 or 2^2048, and the growth is clearly nonlinear. Assume (nevertheless) that the cost multiplies by 100 for every additional 400 binary digits. Then for

2^2048, you get

(2048-800)/400 ≈ 3 extra steps, so you would need approx. 10^12*10^6 operations. Assuming you can do 10^6 ops per second, you need 10^12 seconds (about 30,000 years).

BUT those are very, very rough assumptions (an "arithmetic op" on numbers this size probably takes well over 1 microsecond).
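For what it's worth, the extrapolation is a few lines of code (the x100-per-400-bits growth factor is the rough guess from above, not anything stated in the paper):

```python
# Back-of-envelope extrapolation of the operation counts quoted above.
# The x100-per-400-bits growth factor is a rough guess, not from the paper.
def extrapolated_ops(bits, base_bits=800, base_ops=4.3e12, factor_per_400=100):
    steps = (bits - base_bits) / 400
    return base_ops * factor_per_400 ** steps

ops = extrapolated_ops(2048)          # (2048-800)/400 ~ 3 steps -> beyond 1e18 ops
seconds = ops / 1e6                   # assuming 10^6 ops per second
years = seconds / (3600 * 24 * 365)   # tens of thousands of years or more
print(f"~{ops:.1e} ops, ~{years:.0f} years")
```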

EDIT: sorry, this link is pretty interesting: by the paper's claimed costs, factoring RSA-260, which only has 862 bits, should be feasible in about 2 hours. The link: https://crypto.stackexchange.com/questions/88582/does-schnor...

EDIT2: sorry again: Schnorr is 78 years old. I am not gerontophobic (being 50, I am approaching that age), but: Atiyah claimed a proof of the Riemann Hypothesis, Hironaka has claimed full resolution of singularities in any characteristic... And I am speaking of Fields medalists.

So: you do really need peer-review for strong arguments.


I'm getting 1e9 multiplication (2.5e8 division) ops per second in Node on a 2018 MacBook Pro, so that would be 30-120 years. Not sure how well this algorithm can be parallelized, but since it's factoring, I think it can. So I would think that a dedicated attacker could get at least another 1000x on that, which gives 11-44 days. Note that I am pretty much guessing here, but this is my test code:

    function test(opp, n) {
      const start = Date.now();
      let i = 1;
      let result = 0; // keep the result live so V8 can't eliminate the loop
      if (opp === '*') {
        let prod = 1;
        while (i < n) {
          i++;
          prod *= i;
        }
        result = prod;
      } else if (opp === '/') {
        let quot = 100000000000000;
        while (i < n) {
          i++;
          quot /= i;
        }
        result = quot;
      } else if (opp === '+') {
        let sum = 0;
        while (i < n) {
          i++;
          sum += i;
        }
        result = sum;
      } else if (opp === '-') {
        let sum = 100000000000000;
        while (i < n) {
          i++;
          sum -= i;
        }
        result = sum;
      }
      const end = Date.now();
      console.log(`${n} ${opp} ops in ${end - start} ms (result: ${result})`);
    }


Guess: The "multiplications" we're talking about are those of 400-bit or 800-bit numbers (and, in general, would be the size of the number you're trying to factor), which are much more expensive than 64-bit x 64-bit or float x float multiplications. If you want to test the cost of that, I recommend GMPlib's bignums (or possibly just your favorite language that has built-in bignum support, although if it's not built on GMPlib it might not be using the most efficient algorithms).
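A quick, unscientific way to feel the gap without GMP is Python's built-in bignums (not GMP-backed, so treat the numbers as illustrative only):

```python
import random
import timeit

# Compare a machine-word-sized multiply against an 800-bit bignum multiply.
# Python ints are arbitrary precision but not GMP-backed, so this gives
# only a rough sense of the gap, not GMP-level performance.
a64 = (1 << 63) | random.getrandbits(63)
b64 = (1 << 63) | random.getrandbits(63)
a800 = (1 << 799) | random.getrandbits(799)
b800 = (1 << 799) | random.getrandbits(799)

t_small = timeit.timeit(lambda: a64 * b64, number=100_000)
t_big = timeit.timeit(lambda: a800 * b800, number=100_000)
print(f"64-bit: {t_small:.4f}s, 800-bit: {t_big:.4f}s per 100k multiplies")
```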


If one took a good FPGA and implemented a 4096-bit multiplier in it, would it be faster?

Or, if we take a GPU, can we split a 4096-bit number into, say, 32-bit fragments, mass-multiply them, and combine the results faster than on a CPU? I suspect pretty common hardware can help speed such things up a lot; isn't crypto mining already using some of these approaches?


That I don't know. It's probably first worth looking at what software can do: https://en.wikipedia.org/wiki/Multiplication_algorithm#Karat...

Naive multiplication is O(n^2) (where n = number of digits). The faster algorithms either cut the numbers into pieces and do multiplies, adds, shifts, and maybe subtracts on the pieces recursively, which gives O(n^c) for some c slightly greater than 1 (Karatsuba is O(n^1.585)); or use Fourier transforms, which approach O(n log n). I don't know how easy it is to implement pieces of these more advanced approaches in hardware (especially if the size of the numbers isn't pre-chosen).
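As a sketch of the cut-into-pieces idea, here's a minimal Karatsuba in Python (the 2^64 cutoff is an arbitrary choice for illustration; real libraries tune it carefully):

```python
import random

def karatsuba(x, y):
    """Toy Karatsuba: 3 recursive half-size multiplies instead of 4,
    giving O(n^log2(3)) ~ O(n^1.585). The 2^64 cutoff is arbitrary."""
    if x < (1 << 64) or y < (1 << 64):
        return x * y                      # small enough: one "cheap" multiply
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)   # split x = xh*2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    hh = karatsuba(xh, yh)
    ll = karatsuba(xl, yl)
    mid = karatsuba(xh + xl, yh + yl) - hh - ll
    return (hh << (2 * m)) + (mid << m) + ll

a, b = random.getrandbits(1000), random.getrandbits(1000)
assert karatsuba(a, b) == a * b
```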


Yeah that makes sense.


The two links to the paper...had to navigate twitter a bit to find them:

https://eprint.iacr.org/2021/232.pdf (older)

https://www.math.uni-frankfurt.de/~dmst/teaching/WS2019/SVP9... (newer)

Edit: And the abstract with the "destroyes the RSA..." blurb: https://eprint.iacr.org/2021/232


Quick summary, he's found a faster way to factor numbers. He hasn't found a way to break RSA that isn't equivalent to factoring the modulus.

(As far as I know, it's still just a long-standing conjecture that breaking RSA is as difficult as factoring. RSA always uses odd exponents. The Rabin cryptosystem is similar to RSA except that it always uses 2 as the public exponent and is provably as difficult as factoring, but if the modulus has 2 prime factors p and q, then by the Chinese remainder theorem, the output will always be a quadratic residue modulo p and also a quadratic residue modulo q. In other words, the number of possible plaintexts is 4 times the number of possible ciphertexts, so decryption gives you back 4 possibilities and you need some convention that only one of those was a legal message.)

Unfortunately, this is a faster factoring method, so it also applies to the Rabin cryptosystem, the Blum-Blum-Shub pseudorandom number generator, and Rivest's time lock puzzles (repeated squaring modulo a large composite).
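The four-square-roots property is easy to see with toy numbers. A sketch with tiny, deliberately insecure parameters (p and q both 3 mod 4, so square roots are a single modular exponentiation):

```python
# Toy Rabin decryption: with n = p*q and p, q = 3 (mod 4), a ciphertext
# c = m^2 mod n has four square roots, recovered via the CRT.
# Tiny illustrative primes -- not remotely secure parameters.
p, q = 7, 11
n = p * q

def rabin_roots(c):
    rp = pow(c, (p + 1) // 4, p)   # sqrt of c mod p (valid since p = 3 mod 4)
    rq = pow(c, (q + 1) // 4, q)
    roots = set()
    for sp in (rp, p - rp):
        for sq in (rq, q - rq):
            # CRT: the x with x = sp (mod p) and x = sq (mod q)
            x = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
            roots.add(x)
    return roots

m = 20           # "plaintext", coprime to n
c = m * m % n    # "ciphertext"
roots = rabin_roots(c)
assert m in roots and len(roots) == 4
```

Decryption recovers all four roots; the receiver needs some convention (padding, redundancy) to pick out which one was the intended message.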


> the number of possible plaintexts is 4 times the number of possible ciphertexts, so decryption gives you back 4 possibilities and you need some convention that only one of those was a legal message.

This condition is satisfied by ~100% of all communications, including stuff like software downloads. (If I send you a file, then yes, any message is legal, but as soon as you try to do anything with it, you'll know whether the message was correct or not.)


More importantly, Rabin would be used in a hybrid cryptosystem, including message authentication codes, so you could calculate MACs using all 4 possible keys. Though, it would be much lower overhead and safer to use OAEP[0] to encode the symmetric cipher/MAC key(s) and then encrypt with Rabin. There's very low probability that more than one of the 4 possible plaintexts has all zeroes for padding after OAEP decoding.

[0] https://en.wikipedia.org/wiki/Optimal_asymmetric_encryption_...


Seems like this is not a fault of the construction of RSA or anything else that relies on the cost of factoring, but a failure to appropriately anticipate the cost of breaking it. Usable quantum computing will also make cryptographic security more expensive and complicated.

Another thing that often bothers me is claims and emphasis of how fast constructions are. This just makes them easier to brute force.


> Another thing that often bothers me is claims and emphasis of how fast constructions are. This just makes them easier to brute force.

People generally talk about speed of encryption and decryption. You're complaining about time for bruit-force attacks. In general, there's not a linear relation between the two, and the time for initial TLS session setup matters. If it takes a year to hit amazon.com for the first time, how many people are going to visit amazon.com?


> People generally talk about speed of encryption and decryption. You're complaining about time for bruit-force attacks.

If it takes me k time to decrypt something where there are N key variations and where I can run p attempts in parallel, the maximum runtime for a brute force attack is k*N/p. Perfectly linear relationship.

This is why we want large key spaces, and why algorithms needing offline brute force protection like password hashing algorithms artificially increase execution time and resource requirements to very large numbers.
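That worst-case bound (k*N/p) is one line of code; with N = 2^bits it also shows why each added key bit doubles the attacker's cost (the numbers below are hypothetical):

```python
# Worst-case brute-force time from the comment above: k seconds per
# attempt, N = 2^key_bits keys, p parallel attempts. Hypothetical numbers.
def max_bruteforce_seconds(k, key_bits, p):
    return k * 2**key_bits / p

assert max_bruteforce_seconds(2.0, 10, 4) == 512.0
# Each extra key bit doubles the attacker's worst case...
assert max_bruteforce_seconds(1e-9, 41, 1) == 2 * max_bruteforce_seconds(1e-9, 40, 1)
# ...while the defender's encrypt/decrypt cost barely changes.
```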


> This is why we want large key spaces

Yes, but large key spaces are really easy to have. The size of the key space doesn't cost you anything when you're encrypting or decrypting, but it costs the earth if you're trying to guess the key.

> and why algorithms needing offline brute force protection like password hashing algorithms artificially increase execution time and resource requirements

Ehhhh... this is more of a function of the fact that the password space is much smaller than it looks. Password cracking attempts generally aren't trying to exhaust the space. Instead, they're trying to guess the password based on the known properties of passwords. You start with common passwords and work your way down to iffy ones. You don't bother guessing rare passwords; there are too many of them.

Rainbow tables exist, but they have sharp length limits, precisely because of the explosion-of-the-key-space phenomenon.


> This is why we want large key spaces,

This is a subset of what I'm talking about. Encryption/decryption times aren't linear with key space sizes. Force the number of required parallel instances to exceed the number of atoms on Earth, and bruit force time rapidly diverges from being linear with encryption/decryption time.


Nit: It's spelled "brute." "Bruit" is a rumor.


Strawman. There's always a legitimate user cost to attacker cost ratio, and also scaling the bike lock cost and durability to the cost of the bike.


>> As far as I know, it's still just a long-standing conjecture that breaking RSA is as difficult as factoring.

No, that's not a conjecture. If you factor the modulus, you can directly calculate the private key from the public one.

Edit: oh, maybe you mean that it's possible to break RSA faster without factoring... It's been too long since I looked at that.


> No, that's not a conjecture. If you factor the modulus you can directly calculate the private key from the public one.

You have misread my post. The conjecture is that the RSA problem is as difficult as the factoring problem. In other words, it's possible that solving the RSA problem isn't as difficult as the factoring problem (but most people doubt it). A <= B. Most people think A == B.

This is the converse of your statement. Factoring breaking RSA doesn't tell us whether there's a non-factoring solution to the RSA problem. Everyone knows breaking RSA is no more difficult than factoring, but it may be easier than factoring.

A break in the Rabin cryptosystem would provably break RSA, but it's not necessarily true that a break in RSA would break the Rabin cryptosystem.

Edit: Paragraph 4 of https://en.wikipedia.org/wiki/RSA_problem at least believes it's still a conjecture that RSA is as difficult as the factoring problem.


Yeah, that's what I tried to say with my edit.


It doesn't destroy it... at best it weakens the smaller RSA key sizes. If it destroys it, why not prove it with a PoC on one of the RSA numbers that hasn't been factored yet? RSA-260 is still open...


Ron Rivest himself explains that once in a while he gets an email from someone saying, basically, "I broke RSA": not every day, but it's a semi-regular occurrence. And he always nicely answers by asking the person claiming to have broken it to solve a challenge (and it's not even a crazy high key size he asks for).

Theory is nice, but if you can find the solution to something, just do it and then brag once it's done : )


Way back when dinosaurs roamed the earth and comp.compression on Usenet was a place to be, we usually went beyond the pigeonhole principle by asking people to grab a megabyte from random.org, compress it, and then get back to us.

Nobody ever did.
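The pigeonhole argument itself is just counting: there are 2^n strings of exactly n bits but only 2^n - 1 strictly shorter ones, so no lossless compressor can shrink every input:

```python
# Pigeonhole counting: 2^n strings of exactly n bits, but only 2^n - 1
# strictly shorter bit strings (counting the empty string).
n = 20
exactly_n = 2**n
strictly_shorter = sum(2**k for k in range(n))
assert strictly_shorter == exactly_n - 1
# So no injective (lossless) map can send every n-bit input to a shorter output.
```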


I remember one of those exchanges: two people wagered money about compressing random data so that the data + decompression program would be smaller than the original data.

The other guy came back with a program that moved some of the random data into the metadata of the filesystem, in such a way that the produced files did indeed become small enough.

Don't know if the other guy ever paid out.


When I was a kid I found some compression program on a BBS, that claimed to use "wavelets" to compress even more than was previously thought possible. It turned out it was using a similar trick (but this was in MSDOS, so it was just hiding blocks in unused parts of the disk and storing their addresses in the "compressed" file). I blithely used the tool on some things I had on hand that were taking up too much space, and predictably lost ~all of them.

I learned a little bit about information theory that day, and my sneakernet bandwidth was saturated for a while afterwards.


He did not.

The full story, from the perspective of the metadata guy, is here:

https://www.patrickcraig.co.uk/other/compression.htm

Excellent computer story, on a par with "We can't send email more than 500 miles" IMO.

I think I remember that one too, or definitely one like it.

That's a cheap trick. You gotta send someone the file, otherwise there's not much of a point.


Isn't this a bit disingenuous though? If somebody had designed an algorithm capable of breaking RSA given supercomputer-level resources, that's still breaking RSA even if they may not be able to provide ironclad proof.


Pffh, that's for lowly applied mathematicians. This is theory! /s


On one hand I am skeptical about strong claims especially in the early stage of the claim. On the other hand Schnorr is the person claiming it and no one has refuted it yet. I wonder how long until we will know for certain.


Refuting it puts a massive burden on other people, since finding the fault in a complicated proof can be very involved. Providing a PoC puts the burden on the person making the claim. So at a minimum, not providing a PoC is a very selfish approach.


The paper seems to be built off of the work of a lot of previous research. It will take some time to prove it, but again if Schnorr is so adamant that this is true he should have no problem with a PoC...even a simple one.


Not sure I would agree that it would be "no problem" to do a PoC. Going from paper to PoC might take a long time. If you are very sure of a result, better to get it out early and let others look at it.


Hironaka has claimed to have a proof of reduction of singularities in positive characteristic for years (he got the Fields Medal for the same result in characteristic 0).

Nobody has read the manuscript though...


From the look of it, he hasn't really claimed this in public yet. This link is someone on Twitter saying he did in private. I would just wait.


He claimed it in the paper he posted on eprint.

"This destroys the RSA cryptosystem" - Claus Peter Schnorr[0]

[0]: https://eprint.iacr.org/2021/232.pdf


But is this a final released copy?


It is in the ePrint abstract, which is essentially public.


Agreed. Typical mathematician mentality.

People would be more convinced by a factorization of a challenge RSA number.


As a computational mathematician, I must object. The world is large enough for multiple deep specializations. I can't think at this level of abstraction (hence, I quit number theory after bashing at it for a few years). I can, however, read a math paper and put together an implementation as long as I'm familiar with the notation and the axiom of choice hasn't been used in a nontrivial manner. But then, my implementation would probably piss off software engineers who like to castigate research software that doesn't meet their standards (rather than, say, helping out).

Expecting everybody to have mastery of every specialization that their work touches is toxic and unproductive. I look forward to a team of motivated undergrad number theorists to tear into this and hack up an implementation.


Count me in!


Without experimental data to back it up (e.g. implementation and benchmarks), I'd consider this claim veeeery dubious - regardless of who it came from.

Extraordinary claims need extraordinary evidence, and in this case it would be easy to provide such evidence - by factoring appropriately-sized challenge numbers in a transparent way that can be independently verified.

On the other hand, the theoretical approach in the paper is quite complex and hard to follow - even for professional cryptographers.


There are some approaches that seem difficult to implement at first sight, but that can be improved when more people work on the difficult pieces. The most important thing in a paper like this is the general strategy. Even if there are small issues or if the implementation is too complex, if the general ideas are right we can find a better way to solve these problems over time.


Depends on the claim. For the theoretical result, the evidence is the paper. (I have no clue whether it's correct.) Nothing else is needed. For "RSA is broken", it depends what "broken" means. If its security relied on a claim that is now provably false, it is indeed broken. If we are talking about the claim of breaking RSA in the real world, I agree with you. But it all depends on the claim.


"This giant prime number will keep your information totally unreadable to anyone who doesn't have this other giant prime number" is no less of an extraordinary claim, just one that we all believe in (myself included).


If my understanding is correct, Schnorr converts the integer factorization problem to lattice problems: SVP and CVP. Then he claims he has an efficient algorithm to solve this particular instance of SVP and CVP, and thus RSA is destroyed. But the hardness assumption of SVP and CVP in general is the very foundation that another branch of public-key cryptosystems - lattice-based cryptography - is built upon. So, if Schnorr's claims are true (uncertain and still to be determined), I can't help thinking about the impact on lattice cryptosystems: can we use the same technique to attack them? I think it's a more important question than RSA [0] - lattice cryptosystems are the leading candidates for Post-Quantum Cryptography, meant to replace all of the existing public-key cryptosystems of today (e.g. RSA and ECC). Even if the attack is purely theoretical, if it can solve other instances of SVP and CVP, it'll certainly affect the security assessment of lattice-based PQC (e.g. NTRU, also LWE).

Of course, I don't know what I'm talking about.

[0] Factorization has a history of speedups, both theoretical and practical. They don't really affect practical security, but they always come at the cost of either decreasing confidence or constantly increasing the key size, so I won't be too surprised if Schnorr really has new insights to speed it up further. In fact, "we need something with a better security record than RSA" was one of the main arguments for transitioning to ECC - a transition already largely complete on today's Internet: RSA is only used for digital signatures, and almost all key exchanges are ECC now. You can't decrypt post-2016 web traffic by breaking RSA.


Schnorr got close to making such a claim in a previous version of the paper [1, Section 6]. Namely, that NTRU is close to being broken if a sufficiently short vector (it is not specified how short) is found.

The main claim of Schnorr's, as far as I can gather, is that for lattices where the shortest vector(s) are much shorter than the maximum shortest vector(s) for the same dimension, i.e., low-density lattices, those vectors can be found in heuristic polynomial time with his enumeration approach.

Now, low-density lattices are fairly common in cryptographic settings, where the solution, discoverable by finding shortest vectors, is unusually short/close relative to what one would expect in a random lattice.

As such, if true, I would expect Schnorr's idea to lead to more breaks than just RSA. But I don't personally think it's true. At the same time, I'm not a lattice expert, so make of that what you will.

[1] https://www.math.uni-frankfurt.de/~dmst/research/papers/SVP2...
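For intuition only (this is the classical 2D case, nothing to do with Schnorr's enumeration): in two dimensions an exact shortest vector is cheap via Lagrange-Gauss reduction; the problem only becomes hard in high dimensions:

```python
def gauss_reduce(u, v):
    """Lagrange-Gauss reduction of a 2D lattice basis (u, v independent):
    the returned first vector is an exactly shortest nonzero lattice vector."""
    def norm2(w):
        return w[0] * w[0] + w[1] * w[1]
    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # size-reduce v against u by the nearest-integer multiple
        mu = round((u[0] * v[0] + u[1] * v[1]) / norm2(u))
        v = (v[0] - mu * u[0], v[1] - mu * u[1])
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u

b1, b2 = gauss_reduce((12, 2), (13, 4))
# b1 is a shortest vector; the determinant (lattice volume) is unchanged
assert abs(b1[0] * b2[1] - b1[1] * b2[0]) == abs(12 * 4 - 2 * 13)
```

In high dimensions, LLL only approximates this, and exact SVP is where the claimed hardness (and Schnorr's claimed speedup) lives.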


I noticed HN user fractionalhare has made a comprehensive reply to my questions, but it wasn't posted as a reply to me, so now it has fallen to the bottom of the page. You can see it here: https://news.ycombinator.com/item?id=26331747


> Of course, I don't know what I'm talking about.

I would honestly have no way of knowing either way. It is always fun to get a peek into people working out ideas in fields I have no experience in.


> It is always fun to get a peek into people working out ideas in fields I have no experience in.

Phew. I'm not alone.

I understood _some_ of the words used in the HN comments in this thread.


See previous discussion for context: https://news.ycombinator.com/item?id=26321962


About 10 minutes ago an updated preprint was published with a much more succinct and understandable abstract, that includes the sentence, "This destroys the RSA cryptosystem".

https://eprint.iacr.org/2021/232.pdf

Version history here: https://eprint.iacr.org/eprint-bin/versions.pl?entry=2021/23...


Here's an expert saying that this paper should be considered unsubstantiated at best: https://crypto.stackexchange.com/a/88590


Would love a "what's going on here for dummies" summary if possible. From what I gather someone is claiming they broke RSA encryption?


1. Claus-Peter Schnorr, a well-established and reputable cryptographer, publishes a paper and claims he found a new algorithm for integer factorization, which is faster than the best known algorithm today (Number Field Sieve).

2. If that were all the paper said, it probably wouldn't attract that much attention; at best it would lead to a new wave of RSA key-size upgrades (or transitions to ECC). But today this paper appeared on the Cryptology ePrint Archive, and its abstract reads "This destroys the RSA cryptosystem". The use of such strong language is extraordinary; a reader may interpret it as "the speedup is significant and all RSA keys can be broken." Researchers usually don't make such claims.

3. Meanwhile, it's also very suspicious for two reasons. First, this claim didn't appear in the actual paper, only on the ePrint Archive web page, and it also included an embarrassing typo. Second, the submitted paper wasn't even the latest version, which is available on Schnorr's web page, but an old version from 2019. Many people suspected that the paper was submitted by someone else who happened to see an earlier version on the web, got too excited about it, and added the "destroys RSA" claim.

4. But now, personal communication with Schnorr confirmed the paper was indeed submitted by him, and he indeed makes the "destroys RSA" claim. Schnorr also said he uploaded the wrong file.

5. About 10 minutes ago, an updated preprint was published, that includes the sentence, "This destroys the RSA cryptosystem".


I wrote my thesis on post-quantum public key cryptosystems, though I'm not currently a practitioner and haven't been academically active since ~2017 or so. With those provisos in mind, I'll try to give the summary as I see it. As a basic tl;dr I'm skeptical of the paper. Crypto-twitter is lit up about this right now and as far as I can tell, harbors the same skepticism.

- For context and background: yes, there are well-established links between factoring problems and lattice problems. These links have been known from at least the early 90s. There are many complexity theoretic reductions between specific factoring problems and specific lattice problems, and the approximate variants of the latter. There hasn't been an arbitrary reduction yet, it's mostly on a problem by problem basis.

- That first bullet point means that this research is at least structurally built on, and engages with, prior work in the academic community. Schnorr in particular has been pursuing this since the 90s. Schnorr is an accomplished and established cryptographer.

- I consider the context of that second bullet point unfortunate, because it means the research gets outsized attention even though it is otherwise, as of now, unsubstantiated. It gives the paper a lot more charity than it would ordinarily receive.

- Critically: Schnorr has not empirically demonstrated a break in RSA. He has demonstrated - in theory - faster factoring methods using SVP and CVP solving techniques which rely on a reduction of the factoring problem.

- The paper may not be worthless even if it doesn't break RSA. If he has indeed found a polynomial time way to solve a subset of lattice (and factoring) problems, that will be impressive. I'll have to read the paper a few more times to come to a belief on this point though.

Many comparisons are being drawn online between Schnorr and Atiyah, because the latter kept insisting he found a proof of the Riemann Hypothesis towards the end of his life. It would be sad if this is the case for Schnorr, but it's personally what I believe at this time pending an empirical demonstration of his work and/or critical substantiation from the rest of the academic community. I'm skeptical of this result the same way I'm skeptical when highly established mathematicians publish purported proofs of long standing open problems.


Assuming it works, I'm interested in its limitations. At various complexity levels, there are subsets of problems that yield easily, like unsafe primes in RSA, or trivially, how I can instantly get one factor of an even composite.

Schnorr's main novel claim here seems to be a speedup in finding SVP and CVP solutions in some cases (he explicitly acknowledges limitations). A proof of concept seems like it would be great for testing edge cases, and that's where I think the interesting bits are likely to be. Disclaimer: not a mathematician or complexity theorist. Just here to learn, and glad to be corrected any time.



I really hope this is not just another Atiyah moment. Sigh.


It is an Atiyah moment in our response to this: both men have earned the right to have such publications silently ignored rather than turned into an attraction.



*according to someone on twitter.


If Frederic Jacobs says he communicated with Schnorr, then that's what happened.


If Frederic Jacobs says he communicated with Schnorr, he thinks he communicated with Schnorr. It's possible that someone is impersonating Schnorr.


If only there was some kind of way of proving who you were... possibly using public key cryptography? ;)
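Fittingly, Schnorr's own identification protocol does exactly that: prove knowledge of a secret x for a public y = g^x mod p without revealing x. A toy sketch with deliberately tiny, insecure parameters (real deployments use ~256-bit group orders):

```python
import secrets

# Toy Schnorr identification. Parameters are deliberately tiny and insecure;
# g = 354 has prime order q = 103 in the multiplicative group mod p = 2267.
p, q, g = 2267, 103, 354
assert pow(g, q, p) == 1

x = secrets.randbelow(q - 1) + 1    # prover's secret key
y = pow(g, x, p)                    # prover's public key

k = secrets.randbelow(q - 1) + 1    # fresh nonce
r = pow(g, k, p)                    # commitment, sent first
e = secrets.randbelow(q)            # verifier's random challenge
s = (k + e * x) % q                 # prover's response

# Verifier accepts iff g^s == r * y^e (mod p)
assert pow(g, s, p) == (r * pow(y, e, p)) % p
```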


Elliptic Curve Cryptography still has an excellent theoretical security record, and will likely keep its record until the advent of quantum computers. Also, trustworthy digital signature is actually not too difficult even given the doomsday scenario - if all public-key cryptosystems are broken, as long as you still have a secure hash function, you can use Merkle signature [0].

[0] https://en.wikipedia.org/wiki/Merkle_signature_scheme
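A hash-based one-time signature (Lamport's scheme, the building block underneath Merkle trees) fits in a few lines. A toy sketch, one signature per key pair:

```python
import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

# Toy Lamport one-time signature: 256 pairs of secret values; reveal one
# of each pair according to the bits of the message hash. Security rests
# only on the hash function. Each key pair must sign exactly one message.
sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
pk = [[H(pair[0]), H(pair[1])] for pair in sk]

def sign(msg):
    h = int.from_bytes(H(msg), 'big')
    return [sk[i][(h >> i) & 1] for i in range(256)]

def verify(msg, sig):
    h = int.from_bytes(H(msg), 'big')
    return all(H(sig[i]) == pk[i][(h >> i) & 1] for i in range(256))

sig = sign(b"hello")
assert verify(b"hello", sig)
assert not verify(b"tampered", sig)
```

Merkle's scheme then hashes many such one-time public keys into a tree so a single root key can authenticate many signatures.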


This only works for Alice and Bob.


This is mildly important. It's someone that claims to have communicated with Schnorr.


I take it someone finally put down the smartphone and picked up the telephone? It turns out retired 70+ year old math legends don't frequent Twitter, so of course the chances of the "crypto community" hashing this out through ever more frantic tweets weren't great.



What if it's actually someone else that discovered this and is just publicly posting and acting as Schnorr, in an attempt to prove their point? ;)


This is literally a post confirming that it was Schnorr.



