For factoring 2048-bit RSA integers, the technique proposed in the paper would require ~430 million memory qubits (see the table at the top of page 16).
It might not look that way because there's a lot of (relatively) mainstream investment in quantum computing. However, it's pretty common for speculative physics research to be pursued for years without ever coming to fruition, especially when there are promising early results before it's shown that scaling the work reduces to an intractable problem.
Of course, I’m saying it’s reasonable to be more optimistic, not that we have a proof we will certainly be able to scale to enormous machine sizes. But it’s definitely more than “speculative physics”: real machines have been built and demonstrated to exhibit truly measurable quantum effects that allow for programmable computation.
(To be sure, there is hype, there is a lot of cash sloshing around, and there are totally bogus claims some companies are publicly making.)
No they didn't. Not in any meaningful sense.
How exactly can what that machine does be called a "computation"? In what sense is it "programmable"?
The gate-model players are pursuing a much more difficult path, one that may indeed be fruitless for years or decades before they can approach our raw qubit count. We also have examples of nontrivial, paying-customer use cases that become more compelling with each new announcement. Factoring integers is indeed an interesting hard problem which would have a massive (negative?) impact on the world if realized, but beyond Shor's and Grover's algorithms, it's not like gate-model QPUs have a ton of use cases significantly better than what a quantum annealer can accomplish.
I like to think of it this way: we're basically at the point of an ENIAC scale machine, if you liken quantum computing progress to classical computing. Fills up a room, very specific environmental and power requirements, little or no "memory" to speak of, esoteric and hard for anyone without years of training to master. Only a few decades later, the state of the art machine was thousands of times more capable, far cheaper and smaller, more reliable, more accessible in every way. Imagine describing the Internet as we use it today to an ENIAC operator, or a speculative investor considering IBM, Honeywell, etc. - it would sound like an impossible, Asimovesque dream, not something that children would literally be playing with sixty years later.
The only difference is that so far, we don't really seem to have a real exponential Moore's Law effect in quantum computing. Google et al. still have very low numbers of qubits, without any real promise that they'll be able to deliver more of them on any consistent timeframe. At D-Wave we've done better on the scaling front, and we've been trying to keep up with our former founder's "Rose's Law" of qubit scale growth, but fabrication is an incredibly expensive, complicated, competitive endeavour that demands exacting quality control in order to produce processors that are up to spec. There are also other factors beyond the raw qubit count; the bigger advantage in our latest Advantage chip may actually be the higher connectivity between qubits on the graph, rather than their raw number.
Of course, we expect that we'll continue to push the envelope in this regard, and given enough time and investment, some of the early applications we're seeing now may well eventually be integrated into large scale products people use every day.
...would be like bringing mathematics to Mesopotamia.
It could also be possible to use the technology developed for the precise control and measurement of qubits to "rebuild" natural phenomena like the interaction of chemical molecules, something which is currently extremely hard to simulate.
No it won't
We don't know either way. The relationships between BPP, BQP, P, and NP are all open.
Most of the time, the 'weight' flows back and forth between a and b according to certain equations over time. When you measure the system - that is, when the bit interacts with the outside world, hopefully your measuring apparatus - you see a 0 or a 1, with probabilities |a|^2 and |b|^2 respectively.
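To make that concrete, here's a toy sketch (my own code, with made-up amplitudes) of what measuring a single qubit in the state a|0> + b|1> looks like statistically:

    import random

    # Hypothetical amplitudes with |a|^2 + |b|^2 = 1
    a, b = 0.6, 0.8  # P(0) = 0.36, P(1) = 0.64

    def measure(a, b):
        # Collapse: return 0 with probability |a|^2, otherwise 1
        return 0 if random.random() < abs(a) ** 2 else 1

    samples = [measure(a, b) for _ in range(10_000)]
    print(sum(samples) / len(samples))  # ~0.64, the fraction of 1s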
So what you can do is get a whole bunch of these quantum bits (qubits) together, and set things up so that the time-evolution of their quantum state is correlated and probabilistically moves towards something you're interested in. Say you can set things up so the bit array - which, at first, will give you a perfectly random bit string on measurement - becomes more and more likely to give you, say, a prime factor, or the answer to some other question.
So yes, the quantum phenomenon is that the bits of the computer are quantum objects as opposed to classical.
(qubits I've seen explained many times, but setting things up so that qubits are probabilistically correlated is the part I've never understood anyone else to be saying)
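That "setting things up" part is easier to see in a toy state-vector simulation. Here's a hedged sketch of Grover-style amplitude amplification (my own illustration, not from this thread): each iteration flips the sign of the answer's amplitude and reflects everything about the mean, which steadily concentrates probability on the answer.

    import numpy as np

    n = 3                    # 3 qubits -> 8 basis states
    N = 2 ** n
    marked = 5               # hypothetical "answer" state to amplify

    state = np.full(N, 1 / np.sqrt(N))  # uniform superposition: all outcomes equally likely

    for _ in range(2):       # ~(pi/4) * sqrt(N) iterations is optimal
        state[marked] *= -1               # oracle: flip the answer's amplitude
        state = 2 * state.mean() - state  # diffusion: reflect about the mean

    print(np.round(state ** 2, 3))  # measurement probabilities; marked state is now ~0.945

Nothing here requires knowing the answer's amplitude directly - only the ability to flip its sign, which is what a quantum oracle circuit does.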
The math on quantum computers checks out; it's "just" an engineering challenge at this point, and many are doubtful whether these challenges will ever be overcome to build a quantum computer of sufficient complexity.
Essentially, some "unitary evolutions" are complex to implement, requiring a lot of quantum "gates". This causes errors to accumulate, along with a whole lot of other problems, which limits the complexity of the calculations that can currently be performed.
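A crude back-of-the-envelope model (my assumption, not from the comment above) shows why deep circuits fail without error correction: if each gate succeeds with probability 1 - p, a g-gate circuit runs cleanly with probability roughly (1 - p)^g.

    p = 1e-3  # hypothetical per-gate error rate
    for g in (100, 1_000, 10_000, 100_000):
        print(g, "gates:", (1 - p) ** g)  # probability of an error-free run

At 100 gates you still succeed ~90% of the time; at 10,000 gates almost never, which is why error correction (and its qubit overhead) is considered unavoidable.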
I guess you meant to write high power as in high electrical power consumption? Or low power in the sense of low processing power?
Anyway: Performance per Watt is probably pretty bad for current quantum computers ;)
I don't think I understand this metaphor.
We may not have 13k qubit systems today, but we do have qubit systems already. Expecting us to get better at them is pretty reasonable.
Quantum systems don't scale that way. In order to get the quantum speedup you need to be able to maintain the larger quantum state, which gets a lot harder the larger these systems get.
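One way to get a feel for how fast that state grows: the joint state of n qubits has 2^n amplitudes. A toy calculation (my own, assuming 16-byte complex numbers) of what it would take merely to write such a state down classically:

    for n in (10, 30, 50, 300):
        print(n, "qubits ->", 2 ** n * 16, "bytes")
    # 10 qubits: ~16 KB; 30: ~17 GB; 50: ~18 PB; 300: more bytes than atoms in the universe

The machine doesn't store this vector explicitly, of course - but it does have to keep the whole entangled state coherent, and the room for things to go wrong grows with it.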
This is like saying we already have a 7nm process today for silicon, so expecting us to get better is like expecting 1nm, 10 angstrom... or we have a plane that goes 2,000 miles per hour, so expecting 10k mph, 100k mph, 2,000k mph... physics doesn't work like that.
How expensive is quantum memory relative to quantum compute, over the long term? Expectations about the answer to this question strongly affect whether or not you find this paper relevant. And the answer depends on the pieces you build your quantum computer out of, e.g. hypothetical photonic architectures have a bigger ratio than hypothetical superconducting qubit architectures.
It is currently very much an open question whether or not quantum computers will have a memory hierarchy.
After DRAM comes SSD, and after SSD comes disk. The difference in price per GB between SSD and disk is about four times (4x); I looked up those numbers recently. The difference between tape and disk is, again, about 4-10x (from memory).
Also, please consider bandwidth here. 1TB of RAM can hit ten times the bandwidth of a 1TB SATA hard disk drive, if not more. The difference in price is 80-fold, but if you factor in bandwidth requirements, the price difference drops to lower levels (10 drives, more SATA or specialized controllers, etc).
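Roughly, with made-up but plausible numbers (mine, not the parent's): if RAM has 10x the bandwidth, you need ~10 drives to match it, and the effective price gap shrinks accordingly.

    ram_price, hdd_price = 8_000, 100   # hypothetical $/TB, an ~80x gap
    ram_bw, hdd_bw = 2_000, 200         # hypothetical MB/s per device
    drives = ram_bw // hdd_bw           # HDDs needed to match RAM bandwidth
    print(drives, ram_price / (drives * hdd_price))  # 10 drives -> gap drops to ~8x

(That ignores controller costs on the disk side and RAM's latency advantage.)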
If the price/density difference was anywhere near that much you'd see a lot more chips with fat onboard DRAM caches.
Is this also a hypothetical structure, or have they been built and shown to be able to reliably store and retrieve quantum state in such spatial modes? 430 million is a much, much larger number than their headline 13436 qubits...
Also, it does not seem that there's exponential growth in this area: https://www.statista.com/statistics/993634/quantum-computers....
They hit the wall in 2017.
We should be safe for now :)
There are claims of bigger factored numbers, but they exploited special cases (e.g. factors differing by only two bits) and have no hope of being extended to attack cryptography.
A 2019 paper managed to factor 21 on a 16-qubit ibmqx5, but failed to go up to 35:
> the algorithm fails to factor N=35. This is due to the cumulative errors coming from the increasing number of two-qubits gates necessary to implement the more complex MEF needed for this case
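For context, here's a hedged sketch (my own toy code) of the number theory at the heart of Shor's algorithm. The quantum computer's only job is the order-finding step, done here by brute force; everything else is classical pre/post-processing:

    from math import gcd

    def shor_classical_core(N, a):
        g = gcd(a, N)
        if g != 1:
            return g, N // g               # lucky: a already shares a factor
        r, x = 1, a % N
        while x != 1:                      # find the order r of a mod N
            x = (x * a) % N                # (this loop is what the quantum part replaces)
            r += 1
        if r % 2 == 1:
            return None                    # odd order: retry with another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            return None                    # trivial square root: retry
        return gcd(y - 1, N), gcd(y + 1, N)

    print(shor_classical_core(15, 7))      # (3, 5)
    print(shor_classical_core(21, 2))      # (7, 3)

The "compiled" demonstrations mentioned elsewhere in the thread hard-wire the circuit for one specific N and a, which is why they don't generalize.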
If computers that you build are quantum,
Then spies of all factions will want 'em.
Our codes will all fail,
And they'll read our email,
Till we've crypto that's quantum, and daunt 'em.
To read our E-mail, how mean
of the spies and their quantum machine;
Be comforted though,
they do not yet know
how to factorize twelve or fifteen.
Luckily there are asymmetric algorithms which are secure against quantum computers, so we don't have to resort to quantum key exchanges.
When it comes to metered data, a few megabytes to access a new site sounds like normal internet to me. And you could use a trusted proxy/VPN; there are present-day apps that more or less fill that niche already.
A trusted proxy works if ECC is secure (no quantum computers), but if quantum attacks are practical then there would be no way to verify that the trusted proxy is actually what you think it is.
If you wait a small number of years you'll be able to get twice the ram at the same price. So "it would barely fit in ram if nothing else is running" doesn't seem like a significant barrier to me.
> A trusted proxy works if ECC is secure (no quantum computers), but if quantum attacks are practical then there would be no way to verify that the trusted proxy is actually what you think it is.
Your trusted proxy would be using one of the secure multi-megabyte keys, of course. Am I missing something about that arrangement? If signatures would also be enormous then we have a more significant problem to deal with than key size.
Also, my understanding is that almost all of these factorizations utilize a "compiled" version of Shor's algorithm, meaning that you need to know the factors in advance. So it's essentially "confirming" rather than "finding" the factors.
According to the first link in my post, the numbers bigger than 21 were chosen to have specific mathematical properties and attacked with algorithms even less realistic than compiled Shor.
Edit: for anyone interested in learning more about quantum computation, I have a list of resources on my website.
The numbers reported in the press are physical qubits, not logical qubits. You need multiple physical qubits + error correction to create a single logical qubit. The main type of error correction used today is something called "surface codes". With this type of error correction, estimates run to roughly a THOUSAND or more physical qubits for a SINGLE fully error-corrected logical qubit, and millions of physical qubits in total for a machine that could threaten RSA.
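A rough sketch of where those overheads come from, using the standard rule-of-thumb approximations (the numbers are my assumptions): logical error per cycle ~ 0.1 * (p/p_th)^((d+1)/2) for code distance d, with ~2*d^2 physical qubits per logical qubit.

    p, p_th = 1e-3, 1e-2   # hypothetical physical error rate vs. threshold
    target = 1e-15         # logical error rate needed for long computations

    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2             # code distance is odd
    print(d, 2 * d ** 2)   # -> d = 27, ~1458 physical qubits per logical qubit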
We do not have actual quantum computers today and we don't seem to be much closer to having them than we were a decade ago. What we have are really interesting quantum science experiments that get misrepresented by the press (and a handful of companies with a commercial interest in doing so).
Admittedly, it's not a gate-model machine that can run Shor's algorithm, but it is a quantum computer, and with more than double the number of qubits and far higher inter-qubit connectivity than our previous D-Wave 2000Q, it definitely demonstrates tremendous progress.
If you have a moment, you can sign up to use it for free at https://cloud.dwavesys.com - we have an online IDE, Jupyter notebook training material and tons of docs, a community forum, and of course some shiny demos that submit problems to the live QPU if you want to try them out.
> DJB yelling from the back of the room "How much RAM does the NIST benchmarking machine have??" Dustin Moody replying "Dan, we're not benchmarking pqRSA!"
Here's his explanation of the idea:
> When I say Quantum Computing is a bullshit field, I don’t mean everything in the field is bullshit, though to first order, this appears to be approximately true. I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible. I also do not have a mathematical proof that we can or can’t make the artificial bacteria of K. Eric Drexler’s nanotech fantasies. Yet, I know both fields are bullshit. Both fields involve forming new kinds of matter that we haven’t the slightest idea how to construct. Neither field has a sane ‘first step’ to make their large claims true.
> “quantum computing” enthusiasts expect you to overlook the fact that they haven’t a clue as to how to build and manipulate quantum coherent forms of matter necessary to achieve quantum computation. A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor an n-bit number is 72 x n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.
BTW, the term post-quantum encryption is delicious because we never had a quantum encryption era. Like the post-high-speed-rail era in the US ;)
So you can talk ante-quantum and post-quantum in this sense, but it makes little sense to talk about just 'quantum', in exactly the same way that we schedule a lot of things AM and PM but very little for precisely noon.
Even in the key exchange case, I'm personally pretty skeptical of its real-world usefulness, because it's not really solving a problem we actually have: public key cryptography (even the slower post-quantum variants) works just fine, and seems a bit more practical than building quantum infrastructure to send entangled qubits over just to augment a secret that's already been exchanged.
Edit: Cursorily googled and did not find a reference for usage but plenty of papers talking about potential applications. Might be wrong though.
This is the biggest crypto puzzle: find the private key of Satoshi's Bitcoin wallet, with 1 million bitcoins. Over $50B prize for one crypto puzzle.
This would be the AlphaGo moment of quantum computing if you could make that one attack successful, even while paying a huge price (e.g. years of quantum datacenter work).
Approximately one minute after you start this process, someone will figure out that either (1) bitcoins no longer securely belong to anyone or (2) Satoshi thinks selling off all his bitcoin is a good idea.
Approximately two minutes after you start this process, the price of bitcoin will plummet.
Sorry, you will not be taking home $50B today.
Anyone can sell bitcoins and make money. But if you have the power to destroy bitcoin, well then you can short the entire bitcoin market and clean up.
Frankly, I would just walk in the door of a large hedge fund and sell them the key for $5B. They'll do far better than I ever could, and $5B is more money than I can possibly use in my lifetime, while being a tiny cost of business for them.
I'm not sure of the details and my understanding of it is limited. But it seems there is some kind of statistical anomaly in the nonces of early blocks, and some people suspect that he purposely formed this anomaly to leave a clue or encode the private key for the wallet.
As long as you never reuse an address, Bitcoin would continue to be resistant to quantum attacks even if you could derive private keys from public keys: you'd still need to invert a hash function to go from an address to a public key.
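A sketch of why (simplified: real addresses add a version byte and Base58Check, ripemd160 availability depends on your OpenSSL build, and the key below is a made-up placeholder):

    import hashlib

    pubkey = bytes.fromhex("04" + "11" * 64)  # hypothetical uncompressed ECDSA public key
    h160 = hashlib.new("ripemd160", hashlib.sha256(pubkey).digest()).digest()
    print(h160.hex())  # the address commits only to HASH160(pubkey)

Until the first spend reveals the public key, even a discrete-log-breaking attacker is stuck with a preimage attack on the hashes.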
I would guess hypothetical
That means the past is compromised, plus however long it takes implementations to migrate afterwards. I've always wondered just how much of the future is compromised.
I've always thought about encryption this way:
P = some degree of computational power
A = some small unit of P, like a laptop
B = the largest unit of P practically possible under the same laws of physics as A
(data encrypted by A cannot be "cracked" by B in a reasonable amount of time)
edited for format, then again for clarity
We'll just migrate to quantum resistant algorithms like we migrated away from MD5.
Of course nobody knows, but these things are still very far from anything that is running today.
An attacker can record the key exchange. That doesn't use RSA; today it's usually some variant of elliptic curve Diffie-Hellman. But that is just as vulnerable to quantum attacks as RSA.
So you're attacking the key exchange, not the RSA signature.
What you're probably alluding to here is the forward secrecy property of TLS. But that is only true under the assumption that the key exchange is secure. In a quantum attack scenario it is not.
So a store-and-decrypt-later attack is still a very valid concern around TLS and quantum computers.
If you don't use ephemeral keys, then compromising the server (hacking, subpoena, etc.) will allow an attacker to decrypt any session they recorded which used that long-term key.
If you use ephemeral keys, the attacker needs to learn the ephemeral private key instead of the long term private key to decrypt the session. The server throws away the ephemeral private key (plus the symmetric session key) after a short time. So a later compromise of the server will not allow an attacker to decrypt old sessions.
An attacker who can break the underlying cryptography on the other hand can still break the ephemeral keys used for the connection, since they (unavoidably) learn the public key from the handshake. It might increase cost, because they need to break each connection separately, but at that point the security margin is terribly thin.
There is a small benefit, because now an attacker needs to attack each session separately, instead of attacking a single key to break all sessions with a particular server. But that's hardly something you want to rely on.
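For the record, the ephemeral pattern is easy to see in code. A minimal sketch using the Python cryptography package (authentication and transcript hashing omitted; the info label is a placeholder):

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes

    client_eph = X25519PrivateKey.generate()   # fresh per connection
    server_eph = X25519PrivateKey.generate()

    # Each side combines its ephemeral private key with the peer's public key
    shared = client_eph.exchange(server_eph.public_key())
    assert shared == server_eph.exchange(client_eph.public_key())

    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"toy handshake").derive(shared)

    del client_eph, server_eph, shared  # discard; later server compromise reveals nothing

A recorded transcript still contains both public keys, though, which is exactly why a quantum discrete-log attack defeats this.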
Consider that Internet traffic is recorded today and has been for some time. This means that the certificate rotation isn't the entire solution. Algorithm types and sizes along with ephemeral certificates should be considered.
Also, always remember that ECC is believed to be just as weak against quantum computers as RSA. So you'll probably want some fusion of pre- and post-quantum algorithms.
Or, anyway, get your perfect forward secrecy working and don't rely on asymmetric crypto for confidentiality. Yeah, PFS is very hard to get in some use cases, but it's really the best way to solve this problem.
RSA is in fact somewhat notable for being the only widely used asymmetric cryptosystem where the underlying operation is a block-cipher-ish encryption primitive and not a key agreement primitive.
Debatable. It's sort of true of RSA-OAEP. That's not true of RSA-PSS (signatures), nor of RSA-KEM (key encapsulation). Really it's the OAEP that's block-cipher like, and the fundamental RSA operation (modular exponentiation) isn't "encryption" or "signing" or "decryption" or "encapsulation" or anything else cryptographic on its own.
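To illustrate that last point, here's textbook RSA with toy numbers (my own; utterly insecure without padding like OAEP/PSS and real key sizes): the raw primitive is just modular exponentiation, and whether it "encrypts" or "signs" depends entirely on which exponent you use and what you wrap around it.

    p, q, e = 61, 53, 17
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)        # modular inverse (Python 3.8+)

    m = 42
    c = pow(m, e, n)           # "encrypt": m^e mod n
    assert pow(c, d, n) == m   # "decrypt"/"sign" is the same operation with d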
I hope someone gets a grant to work out the engineering difficulties in this.
What is still hypothetical is whether these elements, which have been demonstrated to work in small numbers (<100), will work in large numbers.
Quite sure cryptographers are ahead of the curve; they are a conservative bunch. There are multiple finalists in the NIST post-quantum competition with different quantum-hard mathematical properties. An over-abundance of lattice cryptography being standardised could possibly be a problem in the long term, though.
Asymmetric public key crypto and signatures being broken is the only real threat; there are no theoretical issues with symmetric crypto, hashing, or MACs.
A quantum break already has a multi-billion dollar bounty on it in the form of cryptocurrency wallets, one that would be hard to outbid with wages or suppress with coercion. This is probably the best evidence on Earth that no one actually has this capability.
If a particular address has never sent coins then an attacker has to perform a pre-image attack against SHA-256 and RIPEMD-160 which even for quantum computers would be 2^128 operations (generally believed to be intractable) and wouldn't even require breaking ECDSA.
Since most original coins have never been sent anywhere there isn't a way to directly attack their ECDSA keypairs.
There are, however, plenty of addresses holding (a lot of) coins whose public keys are in the blockchain, making them targets for quantum computing attacks. An attacker could solve the discrete logarithm problem for a key, then forge a signed transaction sending all the coins controlled by that keypair to a new address owned by the attacker, and the transaction would be accepted by everyone.
Note I'm not saying that I actually think someone has achieved this, I just don't think "no one is stealing btc" is a good non-existence test.
There's a standardization process going on for post quantum cryptography at the US NIST. Results expected before the almighty RSA-breaking quantum computer arrives.
There's still a concern about store-and-decrypt-later (i.e. someone can store encrypted communication today and decrypt it once a QC is available), and how relevant that is depends on some unknowns (how many years do you expect your comms to stay secret? how many years till a usable QC is available?).
Note that the best known quantum attacks on ChaCha20 cut the key size in half, so 256-bit ChaCha20 should still be fine, as long as your key agreement protocol is quantum-resistant.