How to factor 2048 bit RSA integers in 8 hours using 20M noisy qubits (scirate.com)
174 points by vtomole on May 24, 2019 | hide | past | favorite | 67 comments

I'm one of the authors on this paper and can answer questions if people have some.

Hi. I would like to take the opportunity to get your opinion on quantum computing.

My understanding is that there is significant work in theoretical quantum computing done by you and many others. The results you present seem impressive. Congratulations.

On the other hand - everything I read, hear and try to interpret about D-Wave seems like nonsense. They show approximate computations and compare them to classical algorithms that provide exact results. But classical approximation algorithms work much better than their quantum computers. I do not see how scaling this approach could lead to anything like what you describe in your paper.

In this regard - is my understanding of D-wave correct or does it provide some value that is better than classical computers?

Can d-wave or IBM's creation be used to do computations described in your paper?

Is their approach worth anything?

[edit] Of course I'm not asking about now - my question is whether there is anything connected to exact results (like factoring) and exponential speed-up in d-wave approach.

I don't really know a lot about D-wave to be honest. I defer to Scott Aaronson's posts [1] and to the opinion that the main criteria for success is "solve hard problems" as opposed to "solve hard problems with a quantum computer" [2].

[1]: https://scottaaronson.com/blog/?s=dwave

[2]: https://youtu.be/XbFoXT73xVQ?t=355

Hmmmmm. I tried to follow this YouTube video, but it doesn't make sense to me. The speaker is saying that D-Wave would be a success even if it were not quantum but only solved a hard problem.

I could buy this, but they have not yet solved any problem that was not solvable with probabilistic classical algorithms before.

I would also give them credit if they hadn't solved any problem yet but showed a real quantum computer at work, e.g. factoring a 50-digit number. Or were on the way to doing this somehow and had made understandable progress in the area, even if they haven't fully succeeded yet.

What they do to my understanding is:

* approach a problem that is hard in general

* solve only a simple and already-solvable case of it

* when asked whether it may not really be quantum, they say that even if it is not quantum, that's fine because the problem is hard (while it is not)

* when asked whether the problem is actually easy, they say that maybe the problem is easy but this is quantum

Do I understand this correctly? I would really appreciate more explanation

I agree with all of that.

Without hardware, you're not solving any problems. Glass bead game != solving problems.

At no point in human history has this sort of "detailed theoretical wanking" turned into progress in science; the hardware comes first.

Convnets were detailed in the 1980s and didn't become viable until hardware caught up in the 2010s.

How do you get the idea to build something without any thought about what it will do?

I was under the impression that theoretical physics detailing the underpinning mechanisms preceded the transistor, nuclear weapons and fibre optics.

That's true of most inventions. We had programming and boolean logic before we had computers. And of course we would, you can't build a boat without water to sail on.

Is your approach here to focus on the theoretical background, or do you also comment on the feasibility of hardware in the next x years?

I'm having a hard time parsing your question.

Our goal in this paper was to better understand the resources required to factor with a quantum computer. These estimates can then be fed into decisions about the rate post-quantum cryptosystems need to be deployed. We don't make predictions about the expected capabilities of hardware over time.

We do frame our results in terms of specific hardware assumptions (e.g. particular gate speeds and error rates), but most of the improvements we make are about how to do arithmetic and so they generalize beyond these assumptions.

Cool, I think that does answer my question.

You do both, but the feasibility part is in the form of hardware assumptions that don't comment on the current or future state of hardware.

Does it mean we can decode 128-bit SSL/HTTPS encrypted traffic in a few minutes?

See title of paper.

The fact that someone would ask implies that the title does not make it obvious.

The time to factor one giant number is interesting, but so too is how this method applies to other, especially more common, problems.

RSA with keys less than 1024 bits long is not considered to be secure against classical attack, so SSL/HTTPS is presumably not using such key lengths. The construction in the paper still takes hours on 1024 bits, therefore it will not break SSL/HTTPS in minutes.

If you were to, very inadvisably, use RSA with 128 bit long keys (maybe that's what they were asking?) then yes it would take minutes. But you can already break such keys in minutes using classical computers.
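To make the last point concrete: keys that small fall to even simple classical methods. Here's a minimal Pollard's rho sketch on a toy ~40-bit modulus (a real attack on a 128-bit RSA key would use ECM or the number field sieve; the primes below are arbitrary illustrative choices, not from the paper):

```python
from math import gcd

def pollard_rho(n: int) -> int:
    """Return a nontrivial factor of composite n (Pollard's rho, Floyd variant)."""
    if n % 2 == 0:
        return 2
    for c in range(1, 20):            # retry with a new polynomial if the walk degenerates
        x = y = 2
        d = 1
        while d == 1:
            x = (x * x + c) % n       # tortoise: one step of x -> x^2 + c mod n
            y = (y * y + c) % n       # hare: two steps
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        if d != n:
            return d
    raise ValueError("no factor found; is n prime?")

# Toy modulus: product of the primes 1000003 and 1000033 (~40 bits).
n = 1000003 * 1000033
f = pollard_rho(n)
assert n % f == 0 and 1 < f < n
```

Pollard's rho runs in roughly n^(1/4) steps, so this toy case finishes instantly; the same asymptotics are what make sub-1024-bit RSA classically uncomfortable.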

How plausible is it that someone could acquire this many qubits? How many doublings away are we?

EDIT: [1] suggests D-wave could be 5640 by 2020, doubling approx. every two years - 12 doublings = 24 years from now.

[1] https://en.wikipedia.org/wiki/D-Wave_Systems#D-Wave_2X_and_D...

We are very far away.

We are several theoretical and engineering breakthroughs away from a scalable quantum computer with sufficiently many error-corrected, logical qubits to do something useful with Shor's algorithm. Current physical error rates need to decrease by a factor of 10 to 100. The current state-of-the-art physical qubit count needs to increase by a factor of at least 10,000.

If we assume the number of usable physical qubits doubles every year while the decoherence rate halves every year, it's plausible a quantum computer could be designed to break 2048-bit RSA in slightly under 20 years.[1] To be generous, we'll also assume the implementation and engineering work doesn't meaningfully increase that estimate once the design is finished.
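The back-of-envelope behind that ~20-year figure, as a sketch (the ~50 starting qubits is my assumption for 2019-era hardware, not a number from the paper or the cited report):

```python
import math

current_qubits = 50          # assumption: rough 2019-era physical qubit count
target_qubits = 20_000_000   # headline figure from the paper

# One doubling per year: how many doublings bridge the gap?
doublings = math.log2(target_qubits / current_qubits)
print(f"~{doublings:.1f} doublings, i.e. ~{math.ceil(doublings)} years at one doubling/year")
```

That lands slightly under 20 years, consistent with the estimate above; it also silently assumes the decoherence rate halves on the same schedule.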

This is an optimistic forecast, to put it mildly. There are credible arguments from skeptics[2][3] in the academic community that it's not actually possible to build a quantum computer capable of breaking RSA in practice. Likewise, everything I've mentioned is only in regard to the known unknowns we need to resolve. It's very probable there are a variety of unknown unknowns to contend with as well.

This is also all aside from the possibility (emphasized by the first report I cited) of a looming winter in quantum computing research. In order to actually reach a point where RSA can be broken, the field needs to start paying out the checks it's been writing. This means actually achieving quantum supremacy and developing legitimately useful quantum computers - scalable or otherwise - for industry applications.

Finally, D-Wave's progress isn't relevant here. They're building a quantum annealer, not a general-purpose quantum computer. All the foregoing is based on the idea of building a general-purpose quantum computer to implement Shor's algorithm. Annealing methods aren't applicable.


1. https://www.nap.edu/catalog/25196/quantum-computing-progress...

2. https://spectrum.ieee.org/computing/hardware/the-case-agains...

3. https://gilkalai.wordpress.com/2017/10/16/if-quantum-compute...

Hopefully we will see results way before we start breaking RSA. It looks like people are targeting applications in quantum chemistry [1]. Also, Dyakonov's criticisms are toothless, see comments here [2].

[1] https://arxiv.org/abs/1801.00862

[2] https://scirate.com/arxiv/1903.10760

Yep, quantum chemistry looks a lot more promising in the near term. Thank you for sharing that critique of Dyakonov, I'll have to give it a read.

D-Wave is a highly specific form targeting narrow problems.

I thought general-purpose qubit counts were smaller.

Interesting to see the doubling. I've been asking for a qubit-count "Moore's law" projection for a while, but people in the business keep pushing back, saying it's a bad measure. Even bad measures back in the MIPS and GIGAFLOPS days made some sense, though.

Is it really nonsensical to measure how "fast" we're achieving stable qubits? Feels like it should be a thing.

If you assume a Moore's Law-like increase in physical qubit capacity each year, it will take slightly under 20 years. That also assumes the rate of decoherence decreases commensurately.

The reason you get pushback is because Moore's Law exists within a historical context in which it was actually plausible for exponential increases to occur each year, year after year. The field of quantum computing is so nascent that such a context is utterly alien to it. We simply don't have the economies of scale, engineering capabilities nor even theoretical groundwork required to achieve and sustain those kinds of improvements. The other reason is because transistors and qubits are not directly comparable, and you shouldn't try to infer the growth trajectory of one from the other's.

So to answer your question directly - it's not nonsensical at all to measure and forecast the rate of physical qubit capacity increase. The report I cited in another comment does exactly that, which is how you can derive that 20 year estimate I made. But we don't have any evidence those kinds of annual doublings will be achievable anytime soon, so most of it comes down to educated guessing.

Qubits have not doubled every 2 years historically; I see no reason they’ll suddenly do so ever.

Computers did because process shrank. Most qubits are already 1 or a few subatomic particles. There’s zero feature shrinkage possible at those scales.

It's not qubits themselves that have to shrink, but the support machinery that measures/cools/traps/shields them. Basically, the qubits-per-area or qubits-per-volume metrics are the important ones. Also, dollars per qubit is the relevant factor at the end of the day. (And watts per qubit for operating the whole machine.)

Agreed, qubits per area or volume is important, but those are not capable of undergoing a Moore's Law event. Entangled qubits cannot be packed orders of magnitude closer, since they're already at the quantum limits.

Moore's Law did not shrink support machinery much at all. It shrunk the lowermost information processing piece. That lowermost piece in this case cannot be shrunk in the same manner.

The fact the bottom cannot shrink is causing Moore's law to end as it is. There is no way around these quantum limits without a major change in physics, one that is unlikely to ever happen.

None of the pieces you list can scale the same way area did for Moore's Law to work for transistors.

Moore made his prediction in 1965. Transistor area in 1971 was a 10,000 nm process. Modern process is a ~10 nm process. This 1000 fold reduction in linear size (which is a 1 million reduction in area, 1 billion in volume) cannot within any reason happen to your items.

Money and power and count dropped in Moore's law only and precisely because feature size dropped, and this could only happen because the initial features were macro scale, not quantum scale. This cannot happen to qubits - they're already at the end of the quantum line.

We will likely someday get smaller quantum computers. They won't follow a Moore's Law from where they are now, however, for the same reasons nothing other than transistor based items followed a Moore's Law - the physics precludes it.

I'm not an expert on the D-Wave machine, but I think it's nowhere near meeting the 10^-3 physical gate error rate criterion specified by the paper. They don't even really divide the computation up into gates; it's an annealer.

The difficulty of scaling quantum computers is that maintaining coherence requires making the noise floor exponentially low, because every qubit has a relationship to every other.

So the quantum equivalent of Moore's law is adding a single qubit every 18 months, which roughly doubles its speed.

It has yet to be proven that adding one qubit doubles its speed. Btw, there is no actual hardware implementation of a logic gate with qubits. I think quantum computing is a big load of bullshit taking a lot of funding that should be redirected at things like anti-aging research.

More than 500 qubits is enough for shore's algorithm to destroy everything

That is for perfect mathematical qubits. Real qubits are always noisy, and you need at least 1,500 real qubits to emulate one perfect qubit with reasonable certainty, putting your number in the same ballpark as the 20M noisy qubits from the article.

So they can use shore's algorithm?

*Shor's algorithm

don't forget error rate metric too

I haven't finished reading the paper, but maybe my biggest immediate takeaway is that significantly increasing the number of bits doesn't really help. Using the methods discussed in the paper, 4096 bit integers can be factored in less than 4x the amount of time needed to factor 2048 bit integers. In other words, should this method become practical, cryptographers couldn't solve the problem by using integers we would ordinarily consider much harder to factor.

It's been well-known for a long time that doubling the key bits doesn't help for current public-key cryptography because you end up only requiring (approximately) double the number of qubits and a sub-exponential time increase. After all, Shor's algorithm provides a polynomial-time algorithm to solve the problem, so the worst case is that doubling n will cause a fixed-power-of-two increase in time (it turns out to be 2^2 in this case). It's interesting to get estimates of difficulty given noisy qubits rather than the estimates given of running Shor's algorithm on perfect qubits.

However, symmetric cryptography like AES or ChaCha20 (assuming that the best-case quantum attack is a brute-force search) only gets a square-root speed-up and thus doubling the key bits actually does alleviate the quantum threat.
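The usual security accounting behind that, as a one-liner sketch (the k/2 figure assumes, as noted above, that brute-force key search via Grover is the best quantum attack):

```python
def grover_effective_bits(key_bits: int) -> int:
    """Grover's algorithm searches N = 2^k keys in ~sqrt(N) queries,
    so a k-bit symmetric key offers roughly k/2 bits of quantum security."""
    return key_bits // 2

# Doubling the key length restores the original classical security margin:
assert grover_effective_bits(128) == 64    # AES-128 under Grover
assert grover_effective_bits(256) == 128   # AES-256 under Grover
```

This is why guidance for symmetric ciphers is simply "use 256-bit keys," while public-key systems need entirely new constructions.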

Yes, that is why djb's post-quantum RSA proposal uses public key sizes on the order of one terabyte (i.e. 8 trillion bits) instead of the more common 4096 bits. I suspect the proposal is slightly tongue-in-cheek, though.

See https://cr.yp.to/papers/pqrsa-20170419.pdf

Yeah, that submission was absolutely a joke. It didn't advance to round 2 of the NIST PQCrypto Standardization CFP. Bernstein and Lange intended it as a bit of sardonic humor poking fun at the review panel.

That proposal was meant to be taken as a joke. The other proposals by djb and Lange (or possibly CSIDH -- the closest proposal to a usable post-quantum Diffie-Hellman -- which didn't make the submission deadline) seem far more reasonable.

Yes, the time scales quadratically in the key size while the space scales linearly (times a bit of logarithmic overhead on both). I think of the space as the limiting factor, since an attack that takes months to run is still worrying to cryptographers.
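The scaling in the parent comment can be sketched numerically (ignoring the logarithmic overheads it mentions):

```python
def factoring_cost_ratio(bits_from: int, bits_to: int):
    """Rough scaling per the parent comment: time grows quadratically with
    key size, qubit count (space) grows linearly; log factors ignored."""
    r = bits_to / bits_from
    return r ** 2, r  # (time multiplier, qubit multiplier)

time_mult, qubit_mult = factoring_cost_ratio(2048, 4096)
print(time_mult, qubit_mult)  # doubling the key size: 4x the time, 2x the qubits
```

This matches the earlier observation that 4096-bit integers take under 4x the time of 2048-bit ones, and under 50M qubits vs 20M.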

4x the amount of time, but what x the amount of qubits?

My very vague impression of quantum computing is that making a quantum computer with more qubits is quite hard.

Good question. It looks like the rate at which more qubits are needed is slightly less than that for the time requirements. (So still under 50M qubits total, vs 20M for 2048 bit integers.)

They also say later on in the paper that, given certain constraints, it should be possible to use multiple quantum computers with fewer qubits per computer - at the cost of a slight increase in the total number of qubits.

So it would only take 10,000 D-Waves? That doesn't seem right.

I think D-Wave systems are all annealers, while the paper's about gate-logic.

This paper may be relevant to your interests: https://cr.yp.to/papers/pqrsa-20170419.pdf

TL;DR: yes, just moving from 2048-bit to 4096-bit keys will not suffice. But if you make a composite key out of many 4096-bit keys (such that the key file is now ~1 TB), then you can achieve a reasonable level of security. And yes, I do think the paper is (at least in parts) somewhat tongue-in-cheek!

Surely if you are even considering keys that are 128 GB, then just use a one-time pad, which is completely unbreakable by anything?

One time pads can't do public key cryptography

So RSA should be considered broken?

RSA has been known to be breakable by Shor's algorithm since the 90s, as have Diffie-Hellman and elliptic-curve Diffie-Hellman. The only questions are how practical an attack is going to be, and how much time we have to switch to post-quantum algorithms. NIST has been running a competition for post-quantum cryptosystems since 2017 (approximately 10 years late to the party), and we'll see where we end up.
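For intuition, the number-theoretic core of Shor's algorithm can be demonstrated classically on a toy modulus; the only part a quantum computer accelerates is the period-finding loop below. This is an illustrative sketch, not the paper's construction:

```python
from math import gcd

def shor_classical(n: int, a: int):
    """Classical stand-in for Shor's order-finding step: find the period r of
    a^x mod n by brute force, then derive factors from gcd(a^(r//2) +/- 1, n).
    A real quantum computer finds r via the quantum Fourier transform; the
    brute-force loop here is only feasible for toy n."""
    assert gcd(a, n) == 1
    r = 1
    while pow(a, r, n) != 1:  # brute-force period finding (the quantum step)
        r += 1
    if r % 2 == 1:
        return None           # odd period: retry with another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None           # trivial square root: retry with another a
    return sorted((gcd(x - 1, n), gcd(x + 1, n)))

# Textbook example: for n = 15, a = 7 the period is 4; 7^2 = 49 ≡ 4 (mod 15),
# and gcd(3, 15) = 3, gcd(5, 15) = 5 recover the factors.
print(shor_classical(15, 7))  # [3, 5]
```

Everything after the loop is cheap classical post-processing; the entire quantum resource estimate in the paper is about doing that one modular-exponentiation-plus-period-finding step on a 2048-bit n.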

It doesn't look like there is even a need for quantum-proof algorithms. An Epyc CPU with 32 cores has 20 billion transistors. Even if we could build qubits as tiny as transistors, you won't be able to break RSA with ridiculous key lengths above 2,000,000 bits. In reality, qubits will be bigger than transistors, not all qubits can be active at the same time, and there won't be the same exponential growth in the number of qubits that we have seen in conventional computers.

Relevant presentation on real world cipher reversing with quantum computing:


Slides: https://speakerdeck.com/rlifchitz/ordinateurs-quantiques-et-...

How likely is it that for breaking 2048-bit RSA integers you'll just need 2048 noise-free qubits?

Perhaps this lowered 20M requirement is just a ploy to keep people using 2048 bits, not the 4K required by GNU. It is my understanding that 2048 qubits are already within reach of well-funded state agencies.

So, is there some alternative quantum-resistant cipher ready (i.e. with an open-source implementation, say on GitHub) that we can use today to encrypt our long-term secrets?

AES-256 is believed to be quantum-resistant. The only known quantum attack on AES-256 uses Grover's algorithm, which is a universal approach.

As for asymmetric crypto, the ones we have now are susceptible to quantum attacks. There are a bunch of proposals for quantum-proof algorithms, but nothing officially standardized yet. There is a NIST competition going on right now that is trying to find a new suite of asymmetric cryptographic algorithms [1]. You can find an awesome talk on the topic here [2].

If quantum computers are possible, we should switch to quantum-proof algorithms now rather than later, because then we'd reduce the amount of traffic that can be decrypted at a later point in time. It's quite dangerous right now, considering that basically everything is protected by quantum-non-resistant ciphers.

[1]: https://csrc.nist.gov/Projects/Post-Quantum-Cryptography

[2]: https://www.youtube.com/watch?v=ZCmnQR3_qWg

A one-time pad is still uncrackable.

Take your secret. Write it on a piece of grid paper. Fill EVERY FIELD on another piece of grid paper with random data. Transform every character on the first piece of paper based on the corresponding characters on the second piece of paper and write the result on a third piece of grid paper. Then burn the first piece of paper.

It's not possible to read the 3rd piece of paper without the 2nd now. You also obscure the length by filling the second piece of paper with data.
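The same scheme as a minimal code sketch, using XOR over bytes instead of pencil-and-paper character transforms (`secrets` draws from the OS randomness source; whether that counts as "truly random" for OTP purposes is its own debate):

```python
import secrets

def otp_encrypt(plaintext: bytes):
    """One-time pad: XOR the message with a random key of equal length.
    The key must be kept secret and never reused."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
```

Note the ciphertext is still exactly as long as the message; hiding the length requires padding the plaintext first, as the grid-paper version above does by filling the whole sheet.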

One-time pads are extremely impractical and error-prone. You need:

1. A secret key as long as the plaintext.

2. A consistent source of true randomness and a way of sampling it such that your secret key is truly random.

3. To never reuse a key once it's been used once.

Imagine the ramifications of retrofitting servers to use one-time pads for TLS. Moreover, essentially everything we take for granted in cryptography relies on constructions which use pseudorandom permutations and generators. Even if we resolved all these problems and forged ahead in a brave new world using stream cipher-like constructions based on one-time pads, we'd still have to rethink all of public-key cryptography.

This impracticality is one of the major reasons we moved on from information theoretic security to complexity theoretic security by the mid 20th century.

For what it's worth, quantum computers might give a provably correct source for requirement 2 above (truly random numbers).

The version I saw (requires 2 quantum devices): http://www.henryyuen.net/fall2018/scribe2.pdf

New paper that claims to do it with only one quantum device: https://arxiv.org/pdf/1804.00640.pdf

Sure, of course it's incomprehensibly impractical.

There is a variety of post-quantum public-key cryptography. These cryptosystems are based on a few different intractability assumptions: multivariate polynomials, lattices, error-correcting codes and supersingular isogenies.

Code is available for most of these proposals because NIST is currently running a PQCrypto Standardization CFP[1]. However I strongly recommend against deploying your own post-quantum cryptosystem for a few reasons:

1. As I've cited elsewhere in this thread, this paper notwithstanding, most leading researchers in the field are frankly bearish about the prospects of 2048-bit RSA being broken in the next 20 years or so.[2] This can obviously change, which leads me to the next few points of consideration.

2. Most of the post-quantum cryptosystems are not well-studied, relatively speaking. They are fundamentally unproven compared to classical systems - we just haven't had enough academic scrutiny yet. It's still premature to say which post-quantum cryptosystems will remain secure under academic scrutiny over the next few years. Many were dropped from consideration after Round 1 of the NIST standardization review.

3. Most (perhaps all) of the implementations are immature and not well-supported. The majority of them are proofs of concept for NIST, not production-ready code. Authors themselves will caution you against using them. We don't have a libnacl for post-quantum public-key cryptography right now, which means that you'd be substantially rolling your own interfaces to underlying primitive implementations. It's hard enough maintaining secure cryptography in production when everything has been done to keep you from footgunning yourself - you won't have such guardrails for post-quantum cryptosystems.

4. Unfortunately, all post-quantum cryptosystems are grievously inefficient in either time or spatial performance compared to classical cryptosystems. As a general rule of thumb, lattice and error-correcting code based cryptography tends to be on the faster side with very large key requirements, and isogeny-based cryptography tends to be on the slower side with lower key size requirements. But all are noticeably slower than classical systems across both dimensions.

You should wait until these cryptosystems have been proven out by academic and industrial research. Google[3] began implementing lattice-based cryptography for TLS in Google Chrome in 2016. Adam Langley has a nice writeup[4] which also includes a few performance concerns. He's also written a blog post to talk about the next round of implementations they'll start experimenting with[5].


1. https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Pos...

2. https://www.nap.edu/catalog/25196/quantum-computing-progress...

3. https://security.googleblog.com/2016/07/experimenting-with-p...

4. https://www.imperialviolet.org/2018/04/11/pqconftls.html

5. https://www.imperialviolet.org/2018/12/12/cecpq2.html

So RSA has some 10-20 years to go. Is there a quantum computing-proof encryption algorithm, or is it just a matter of increasing key lengths?

Many old, hard problems are still hard even on quantum computers (SAT, TSP, etc. are not known to be solvable in polynomial time on quantum computers). Factoring can be done in polynomial time on quantum computers. The FFT can be made even faster. But again, some classic hard problems still stand. So don't worry: quantum computers are not magic solutions to all hard problems, just some.

> Is there a quantum computing-proof encryption algorithm

This is still an area of active research. For instance, there are learning-with-errors-based cryptosystems: https://en.wikipedia.org/wiki/Learning_with_errors#Use_in_cr... But such systems are based on the assumption that "this type of problem is probably hard even with quantum computers". I believe we don't have anything more "quantum-proof" than that.

And of course, there are cryptosystems with unconditional security. Such as OTP.

Is this the long fabled quantum computing application that breaks asymmetric cryptography?

It's the continued advancement of the craft: some newer techniques, but an evolution of the attack. Now all one needs is a 20-million-qubit QC.
