TL;DR: Several experts in the field have published an academic research report on its progress and made the following statement:
> Given the current state of quantum computing and recent rates of progress, it is highly unexpected that a quantum computer that can compromise RSA 2048 or comparable discrete logarithm-based public key cryptosystems will be built within the next decade.
This echoes my thoughts on the matter. I'm weakly pessimistic that RSA 4096 will be broken in my lifetime. I'm also weakly pessimistic that RSA 2048 will be broken in the next two decades.
It's a funny place to be in for me, because I'm generally skeptical the majority of my research will be practically useful in any meaningful way in the near term :)
I do some QC research. With the need for error correction, cracking real world encryption will require millions of qubits and that’s unimaginable from where we are. I’d be surprised by 20 years and impressed by 50.
Note however that cracking crypto is unlikely to turn out to be super exciting - really secure systems have moved on and there will be plenty of time to upgrade before QC is widespread.
There are exciting applications of QC that are less than a decade away, eg quantum simulation.
The problem is that if you assume your encrypted traffic is being collected and stored today, and you want it to stay secret indefinitely, and you anticipate quantum computers will break discrete-log-based crypto in 20 years, then you need to deploy a post-quantum cryptosystem today.
This is a problem we face training activists and journalists in a number of countries, especially places like China and Russia. We have to work hard to help people understand that a lot of the encrypted, password-protected material they send now is potentially at risk in 5-10+ years. To a certain extent (in a different context, obviously), the lessons of the Venona project come to mind.
IIRC that's one of the big things the NSA's data center was doing: storing Tor traffic on the assumption that QC could eventually break the public key crypto and reveal either where traffic was going and coming from, or maybe even what was being sent (the traffic itself).
Take what I say with a grain of salt, I'm not a cryptographer.
It's my understanding that the onion routing (what went where) could be cracked, since it uses public key crypto. The data itself may be fine because it uses a symmetric key, but if you sent that symmetric key using public key crypto then you may be burned (IIRC the standard protocol is to exchange a symmetric key using public key crypto, then fall back to symmetric encryption since it's faster).
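The pattern described above (wrap a symmetric key with public key crypto, then encrypt the bulk data symmetrically) can be sketched in Python. This is a deliberately insecure toy, with textbook RSA over tiny primes and a hash-derived keystream standing in for a real cipher, purely to show the protocol shape: a quantum attacker who breaks the RSA step recovers the symmetric key and therefore the payload.

```python
import hashlib
import os

# Tiny textbook RSA key pair (toy only; real deployments use 2048+ bit moduli).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Stand-in for a real symmetric cipher: XOR with a SHA-256
    # keystream generated in counter mode.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# Sender: pick a random symmetric key, wrap it with the recipient's
# RSA public key, then encrypt the message symmetrically.
sym_key = os.urandom(16)
wrapped = [pow(b, e, n) for b in sym_key]   # byte-at-a-time, toy only
ciphertext = keystream_xor(sym_key, b"meet at dawn")

# Recipient: unwrap the symmetric key with the RSA private key,
# then decrypt the payload.
recovered_key = bytes(pow(c, d, n) for c in wrapped)
plaintext = keystream_xor(recovered_key, ciphertext)
assert plaintext == b"meet at dawn"
```

An attacker who records `wrapped` and `ciphertext` today, and later factors `n` with a quantum computer, can recompute `d` and decrypt everything, which is exactly the harvest-now, decrypt-later concern.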
That’s the way basic research works, across the sciences. Working on an interesting problem? It might be useful to someone in twenty years. If you’re lucky.
My TLDR: we don't have any clue how quantum mechanics functions beyond some guesses, so we are in no position to predict anything about its future.
Science might as well believe in God if it's backing any logic to explain QC.
This report focuses so much on gate-based computation and says nothing about measurement-based schemes other than "they have attracted substantial interest". This is a massive omission and frankly undermines the entire report. This is disappointing given the prominence of the authors.
Measurement-based quantum computation is a special case of gate-based computation, as a model. Do you mean that the report does not give enough emphasis to photonic quantum computing experiments? For better or worse, this is largely an accurate summary of the current state of experimental physics approaches; there's been progress but LOQC is still a dark horse. I don't think it undermines the report.
Ironic that news about "quantum computers" which so often overlook other schemes will often be accompanied by an image of a D-Wave chip and sample holder...
In fairness, breaking real crypto on D-Wave's hardware is still a very, very long way off -- IIRC factoring a 16-bit number is plausible, but even that is quite optimistic.
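For scale, a 16-bit semiprime falls to classical trial division in microseconds, which is a rough illustration of the gap between current quantum factoring demonstrations and real RSA moduli (the specific number below is just an example I chose):

```python
def trial_factor(n: int) -> tuple[int, int]:
    """Factor a semiprime by naive trial division -- instant for small n."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

# A 16-bit semiprime, for illustration; real RSA moduli are 2048+ bits.
p, q = trial_factor(56153)
print(p, q)  # -> 233 241
```

Every extra bit roughly doubles the trial-division work, but 2048-bit moduli resist even the best known classical algorithms (GNFS), which is why they remain the benchmark for quantum factoring claims.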
You're right, people are familiar with ideas of optimization and logic gates, which are classical analogues of adiabatic and gate-based quantum computation, respectively. Measurement based computation is not so intuitive, especially because it relies on large entangled states (where you have a lattice or graph of many qubits selectively entangled with their neighbors), whose behavior is complex.
Glad to see this, but it doesn't eliminate the issues surrounding how many less sophisticated encryption algorithms are already in use.
The millennium bug got fixed only because we put money into solving it. That kind of emphasis needs to be put behind addressing the forthcoming encryption crisis.
I think it is expected that post-quantum schemes can and will be brought into use well before QC becomes widely available, so attackers will only have the opportunity to decrypt public key traffic that is many years old. Very short keys can already be attacked using GPU clusters, and it is clear QC will be limited in power for a long time, so traffic encrypted with long RSA keys will be safe for many years.
Should extremely powerful QC become available rapidly, and should the quantum-safe public key schemes prove unworkable for some reason (for example, very high-speed transactions), then Quantum Key Distribution would provide an infrastructure to protect important traffic (financial institutions, markets, defence). The point here is that symmetric keys, which are rather hard to break but fast to use, can be distributed securely and rapidly, and so can be changed very frequently. That means attackers armed with fantastical QC would still have to spend vast resources to have any chance of reading traffic.
I think that the moderate resources being employed today are a good hedge for society dealing with this issue.
> Quantum Key Distribution will provide an infrastructure...
No. Quantum key distribution is a fugazzi. It's the red-headed stepchild of actual cryptographic research in the quantum setting. Using quantum key distribution as the basis for a PKI would be ridiculously brittle and over-engineered. Note that the vast majority of interest and research on it comes not from cryptographers, but from physicists seeking a cryptographic problem to solve.
It's like saying we should abandon the computational-complexity-theoretic model of cryptography and go back to the information-theoretic model. Basically, it's fairy dust sprinkled onto a one-time pad. In modern cryptography the one-time pad is (approximately) never used because pseudorandomness is sufficient. Key distribution is not a problem which requires quantum computers to solve. Orchestrating a key distribution ceremony via quantum entanglement, on machines that must be extraordinarily well calibrated and synchronized, would be inefficient and asinine, to put it bluntly.
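The point about pseudorandomness can be made concrete: a one-time pad needs key material as long as the message, while a computationally secure stream cipher expands a short key into an arbitrarily long keystream. A minimal sketch, using SHAKE-256 as a toy stand-in for a vetted stream cipher (not a design anyone should deploy):

```python
import hashlib
import os

msg = b"market data feed " * 1000   # a long message (17,000 bytes)

# One-time pad: information-theoretically secure, but the key must be
# as long as the message and can never be reused.
otp_key = os.urandom(len(msg))
otp_ct = bytes(m ^ k for m, k in zip(msg, otp_key))

# Computational security: a 32-byte key expanded into a keystream
# (SHAKE-256's variable-length output, as a toy stream cipher).
short_key = os.urandom(32)
stream = hashlib.shake_256(short_key).digest(len(msg))
prg_ct = bytes(m ^ k for m, k in zip(msg, stream))

# Both decrypt by XORing the keystream back in.
assert bytes(c ^ k for c, k in zip(otp_ct, otp_key)) == msg
assert bytes(c ^ k for c, k in zip(prg_ct, stream)) == msg
print(len(otp_key), len(short_key))  # -> 17000 32
```

The second scheme only needs 32 bytes distributed securely, which is why key distribution at one-time-pad scale, QKD's main selling point, solves a problem the computational model already avoids.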
QKD systems detect photons one at a time (with photons emitted via lasers). The idea is that you have an infallible alarm for when the optics have been tapped, which means you can then send lots of symmetric keys and know they can't be intercepted (provided you are only sending them ~50 km). So it's not really about cryptography either, which explains why cryptographers aren't that interested in it.
Also, it's not a fugazzi: there are several working testbeds in the wild.
A system for providing tamper detection during key distribution is pretty clearly germane to cryptography. But cryptography already provides very mature solutions for tamper detection. Like I said, information-theoretically secure tamper detection is over-engineered and unnecessary.
QKD doesn't actually help with initial key distribution and you wouldn't use it for that anyway, so you are arguing against a straw man.
What it does is let you expand your pre-existing shared keys in a provably secure manner. And it might be easy if you use satellites, like the Chinese are doing.
I'm not arguing against a straw man, I'm outlining why it's unnecessary (in very blunt terms). What you've just said does nothing to refute everything I stated.
"Provable security" is both rare and unnecessary in most of cryptography (in the information theoretic sense relevant to this discussion, not the sense of e.g. application security or static analysis). To reiterate: you do not need information theoretic security to defend against satcom intrusions. That's tantamount to using a one time pad, which is ridiculously inefficient. The computational model is just fine. Even the United States government only rarely uses information theoretic security these days, and instead opts for the more efficient computational security provided by e.g. AES.
None of the asymmetric schemes are in widespread use, and the blurbs for all except the lattice-based one mention major deficiencies.
It could be that convenient public/private key schemes like RSA are simply impossible to do in a quantum world. Though to be fair, the same could be true classically: their security hasn't been proven.
> Given the current state of quantum computing and recent rates of progress, it is highly unexpected that a quantum computer that can compromise RSA 2048 or comparable discrete logarithm-based public key cryptosystems will be built within the next decade.
The report can be found here: https://www.nap.edu/catalog/25196/quantum-computing-progress...
A highly skeptical but less academic article is here: https://spectrum.ieee.org/computing/hardware/the-case-agains...