Factoring 2048 RSA integers in 177 days with 13436 qubits and a multimode memory (arxiv.org)
240 points by athul7744 21 days ago | hide | past | favorite | 144 comments



This paper basically explores a hypothetical scenario where scaling quantum memory ends up being cheaper than scaling computational qubits. The title (or abstract) unfortunately does not mention the quantum memory requirements at n=2048 explicitly.

For factoring 2048-bit RSA integers, the technique proposed in the paper would require ~430 million memory qubits (see the table at top of page 16).


So magic computer could be more effective with more magic memory than with more magic processing power, right?


Quantum computers exist today, they're just very low power, low gate counts, and extremely expensive. Don't discount it as magic just because of the claims. Scaling up QC would be like bringing mathematics to Mesopotamia. But it isn't "magic", it's physics.


The research in this field proceeds under the umbrella and framework of physics, yes. But it's not clear that it's possible to scale up to the number of qubits required to do anything nontrivial. It's plausible there are hard engineering limits on error correction which make it asymptotically more difficult with each order of magnitude more qubits involved.

It might not look that way because there's a lot of (relatively) mainstream investment in quantum computing. However it's pretty common for speculative physics research to be pursued for years without ever coming to fruition. Especially when there are promising early results before it's shown that scaling the work reduces to an intractable problem.


I think it’s reasonable to be more optimistic than you put it. Qubit counts and qubit fidelity have been increasing at a remarkable rate. Just five years ago we could barely eke out a handful of qubits, and when we did, they’d be bad. Google’s quantum supremacy result is a testament to that. (At this stage, whether they actually demonstrated the “supreme” part of supremacy is, imho, irrelevant. They’ve demonstrated a much larger, controllable, programmable quantum computer and measured its quality characteristics accurately.)

Of course, I’m saying it’s reasonable to be more optimistic, not that we have a proof we will certainly be able to scale to enormous machine sizes. But it’s definitely more than “speculative physics”: real machines have been built and demonstrated to exhibit truly measurable quantum effects that allow for programmable computation.

(To be sure, there is hype, there is a lot of cash sloshing around, and there are totally bogus claims some companies are publicly making.)


The argument here is that reducing error rate/supporting error correction may be exponentially difficult, a bit like increasing clock frequency is now in classical computers.


> They’ve demonstrated a much larger, controllable, programmable quantum computer

No they didn't. Not in any meaningful sense.

How exactly can what that machine does be called a "computation"? In what sense is it "programmable"?


Admittedly, it's not a gate-model system, but at D-Wave we've been able to scale up to 5,000+ qubits now, and the future is promising for further advancements in qubit count, noise, and other advanced features.

The gate-model systems have shown that they are pursuing a much more difficult path, one that may indeed be fruitless for years or decades before they can approach our raw qubit count. We also have examples of nontrivial, paying-customer use cases that become more compelling with each new announcement. Factoring integers is indeed an interesting hard problem which would have a massive (negative?) impact on the world if realized, but besides Shor's and Grover's algorithms, it's not like gate-model QPUs have a ton of use cases significantly better than what a quantum annealer can accomplish.

I like to think of it this way: we're basically at the point of an ENIAC scale machine, if you liken quantum computing progress to classical computing. Fills up a room, very specific environmental and power requirements, little or no "memory" to speak of, esoteric and hard for anyone without years of training to master. Only a few decades later, the state of the art machine was thousands of times more capable, far cheaper and smaller, more reliable, more accessible in every way. Imagine describing the Internet as we use it today to an ENIAC operator, or a speculative investor considering IBM, Honeywell, etc. - it would sound like an impossible, Asimovesque dream, not something that children would literally be playing with sixty years later.

The only difference is that so far, we don't really seem to have a real exponential Moore's Law effect in quantum computing. Google et al. still have very low numbers of qubits without any real promise that they'll be able to deliver more of them in any consistent timeframe. At D-Wave we've done better on the scaling front, and we've been trying to keep up with our former founder's "Rose's Law" of qubit scale growth, but fabrication is an incredibly expensive, complicated, competitive endeavour that necessitates incredible quality control in order to produce processors that are up to spec. There are also other factors beyond the raw qubit count; the bigger advantage in our latest Advantage chip may actually be the higher connectivity between qubits on the graph, rather than their raw number.

Of course, we expect that we'll continue to push the envelope in this regard, and given enough time and investment, some of the early applications we're seeing now may well eventually be integrated into large scale products people use every day.


Saying "besides Grover's algorithm" strikes me as sort of an "All right, but apart from the sanitation, the medicine, education, wine, public order, irrigation, roads, a fresh water system, and public health, what have the Romans ever done for us?" sort of qualification. Grover's is pretty widely applicable, isn't it?


I thought the Quantum threshold theorem said that it was possible to scale up quantum computers as much as you like? Or am I misinterpreting it?

https://en.wikipedia.org/wiki/Quantum_threshold_theorem


  ...would be like bringing mathematics to Mesopotamia.
Can you expound on this? What sort of breakthroughs are bottlenecked by developments in quantum computing?


The hope is that quantum computers will give us an exponential speed up on some problems, which could (again, hopefully) allow us to solve some NP hard problems. This can be extremely useful for scientific computing (e.g. protein folding) and engineering, where computers have to solve complex NP-hard optimisation problems.

It could also be possible to use the technology developed for the precise control and measurement of qubits to "rebuild" natural phenomena like the interaction of chemical molecules, something which is currently extremely hard to simulate.


>which will allow us to solve some NP hard problems

No it won't


> No it won't

We don't know either way. The relationships between BPP, BQP, P, and NP are all open.


Could you elaborate?


I do not know much about quantum computing. But could you explain what makes these computers quantum? Is it some configuration of the transistors that invokes quantum phenomena?


In a classical computer, every bit of information in the system is in a definite state: 1 or 0. In a quantum system with such definite possible states, what you actually have most of the time is the system in some superposition of the possible states - so in the quantum computer case each bit is usually in a state a·1 + b·0, where a and b are complex numbers such that |a|^2 + |b|^2 = 1.

Most of the time, the 'weight' flows back and forth between a and b according to certain equations over time. When you measure the system - that is, when the bit interacts with the outside world, hopefully your measuring apparatus - you see a 1 or a 0, with probabilities |a|^2 and |b|^2 respectively.

So what you can do is get a whole bunch of these quantum bits - qubits - together, and set things up so that the time-evolution of their quantum state is correlated and probabilistically moves towards something you're interested in. Say you can set things up so the bit array - which, at first, will give you a perfectly random bit string on measurement - becomes more and more likely to give you, say, a prime factor, or the answer to some other question.

So yes, the quantum phenomenon is that the bits of the computer are quantum objects as opposed to classical.
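The picture above can be sketched in a few lines of Python/NumPy (an illustrative toy, not how real hardware is controlled):

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit in the state a*0 + b*1, stored as the amplitude vector [a, b].
state = np.array([3 / 5, 4j / 5])                    # a = 0.6, b = 0.8i
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)   # |a|^2 + |b|^2 = 1

# Measuring yields 0 or 1 with probabilities |a|^2 and |b|^2.
probs = np.abs(state) ** 2                           # ~ [0.36, 0.64]

# Repeat the measurement many times; frequencies approach the probabilities.
outcomes = rng.choice([0, 1], size=100_000, p=probs)
print(probs, outcomes.mean())                        # mean close to 0.64
```

Note the complex amplitude b = 0.8i contributes the same probability as 0.8 would; the phase only matters when amplitudes interfere during the time-evolution step.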


this is the most comprehensible entry-level description of quantum computers I've ever read. thank you.

(qubits I've seen explained many times, but setting things up so that qubits are probabilistically correlated is the part I've never understood anyone else to be saying)


They're built off fundamentally different basic units. Contemporary computers use transistors, which obey classical physics and do logic with voltage thresholds. Quantum computers use a quantum phenomenon -- one easy-to-understand (and fairly easy to construct) quantum computer substrate uses the spin of electrons in superconducting loops. Electron spin is a quantum phenomenon, in that the spin isn't deterministically positive or negative, it's a probability distribution -- initially, equally likely to be positive or negative, but you can't tell what it is until you actually read it. It's not 0.0, it's either -0.5 or 0.5, both with a 50% chance. Equally importantly, you can perform (physical) operations on electrons singly or in pairs in order to manipulate these probabilities. Quantum computing is turning these probability fields and operations into useful computational results.


This is exactly what I wanted to hear! Awesome, thank you!


Here is a long-ish video that explains quantum computers without using pop-science hand-wavy terminology

https://www.youtube.com/watch?v=F_Riqjdh2oM


A quantum computer is defined as a machine that can implement an arbitrary "unitary evolution" (up to arbitrary precision). If you don't know enough math to understand what a unitary evolution is, think of it as a quantum computer doing basically any possible thing that you can do with a given number of qubits. As another comment said, the idea is to (ab)use this to do some operations on qubits such that they end up in a state where measuring it will give you an answer to something you're interested in. It has very little to do with classical programming.

The math on quantum computers checks out, it's "just" an engineering challenge at this point, and many are doubtful whether these challenges will ever be overcome to build a quantum computer of sufficient complexity.

Essentially, some "unitary evolutions" are complex to implement, as in requiring a lot of quantum "gates". This causes an accumulation of error and a whole lot of other problems, which limits the complexity of the calculations that can currently be performed.
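Both points can be illustrated with a toy: a unitary gate acting on a state vector, and a small per-gate error compounding over a deep circuit. The miscalibrated-rotation noise model below is made up purely for illustration, not a model of any real device:

```python
import numpy as np

# Hadamard gate: a 2x2 unitary (U†U = I), the kind of "evolution" a QC applies.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
assert np.allclose(H.conj().T @ H, np.eye(2))

state = H @ np.array([1.0, 0.0])      # |0> -> equal superposition [1/√2, 1/√2]

# A miscalibrated X-rotation: angle π/2 + ε instead of π/2 (made-up noise).
def rx(theta):
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

eps = 0.01
ideal = np.array([1.0 + 0j, 0.0])
noisy = np.array([1.0 + 0j, 0.0])
for _ in range(100):                  # deep circuits amplify small gate errors
    ideal = rx(np.pi / 2) @ ideal
    noisy = rx(np.pi / 2 + eps) @ noisy

fidelity = abs(np.vdot(ideal, noisy)) ** 2
print(fidelity)                       # ~0.77 after only 100 slightly-off gates
```

A 1% over-rotation per gate costs almost a quarter of the state fidelity after just 100 gates, which is why error correction (or very shallow circuits) is unavoidable.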


> they're just very low power,

I guess you meant to write high power as in high electrical power consumption? Or low power in the sense of low processing power? Anyway: Performance per Watt is probably pretty bad for current quantum computers ;)


"Processing power" is colloquially the term most people I know use to describe computers of varying capabilities.



> would be like bringing mathematics to Mesopotamia

I don't think I understand this metaphor.


You know that qubit processors already exist, right? This is like going "yeah right, we can factor large primes if we had a 4.2GHz processor. So, magical processing power, right?" in the 386 era.

We may not have 13k qubit systems today, but we do have qubit systems already. Expecting us to get better at them is pretty reasonable.


Not sure it is reasonable.

Quantum systems don't scale that way. In order to get the quantum speedup you need to be able to maintain the larger quantum state which gets a lot harder the larger these systems get.

This is like saying we already have a 7 nm process for silicon today, so expect us to get better - 1 nm, 1 angstrom... or we have a plane that goes 2,000 miles per hour, so expect 10k mph, 100k mph, 2,000k mph... physics doesn't work like that.


It's an interesting question whether, over the long term, quantum hardware will match the feature of classical hardware where memory is significantly cheaper than compute-capable bits in the CPU. Hard drives are 100x cheaper per bit than RAM which is 100x cheaper (I think?) per bit than a CPU register.

How expensive is quantum memory relative to quantum compute, over the long term? Expectations about the answer to this question strongly affect whether or not you find this paper relevant. And the answer depends on the pieces you build your quantum computer out of, e.g. hypothetical photonic architectures have a bigger ratio than hypothetical superconducting qubit architectures.

It is currently very much an open question whether or not quantum computers will have a memory hierarchy.


You are off by orders of magnitude. The difference in memory density between DRAM and SRAM (register file) is not 100 times, more like 10x. Standalone "registers" - memory elements of pipelines, state machines, etc. - are again not more than ten times less dense than SRAM.

After DRAM comes SSD, and after SSD comes disk. The difference in price per GB between SSD and disk is about four times (4x); I looked those numbers up recently. The difference between tape and disk is, again, about 4-10 times (from memory).


Are you sure? I was just going off a quick search of Amazon, where 1 GB of ram cost ~30$, 1 TB of disk cost ~50$, and a CPU with ~1MB of L2 cache cost ~200$. Based on that I actually thought 100x was a comfortable underestimate, not an overestimate.
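For what it's worth, turning those quoted prices into cost per byte (these are the rough Amazon figures from the comment above, not authoritative numbers, and charging the entire CPU price to its cache overstates the cache cost):

```python
# Cost per byte from the prices quoted above (illustrative, not market data).
ram_per_byte = 30 / 1e9      # ~$30 for 1 GB of RAM
disk_per_byte = 50 / 1e12    # ~$50 for 1 TB of disk
cache_per_byte = 200 / 1e6   # ~$200 CPU with ~1 MB L2, all cost blamed on cache

print(ram_per_byte / disk_per_byte)     # RAM ~600x disk, per byte
print(cache_per_byte / ram_per_byte)    # "cache" ~6700x RAM, per byte
```

Even with the caveat on the CPU figure, each tier comes out well over 100x the one below it by this accounting, which is the point under dispute.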


The problem is that CPU cost is not dominated by the cache size, I think.


But I think that cache is roughly 50% of the marginal production cost of the silicon die, by mm².

https://cdn.wccftech.com/wp-content/uploads/2020/11/AMD-Ryze...

https://images.anandtech.com/doci/16214/Zen3_arch_19.jpg


Cache is not memory, it is more complex - LRU and all that, often with several access ports (quadratic space dependence). SRAM is memory that is on-die and it is much cheaper.


Why don't you compare 1TB of DRAM with 1TB of hard disk?

Also, please consider bandwidth here. 1TB of RAM can max at ten times the bandwidth of 1TB SATA hard disk drive, if not more. The difference in price is 80-fold, but if you factor in bandwidth requirements, the price difference drops to lower levels (10 drives, more SATA or specialized controllers, etc).


https://en.wikipedia.org/wiki/EDRAM

If the price/density difference was anywhere near that much you'd see a lot more chips with fat onboard DRAM caches.


I'm trying to make sense of the 'memory qubits', e.g. 'spatial modes', in the paper. Do you know if they are a sort of... quantum flip-flop?

Is this also a hypothetical structure, or have they been built and shown to be able to reliably store and retrieve quantum state in such spatial modes? 430 million is a much, much larger number than their headline 13436 qubits...


Interesting question that is not related to the feasibility of this idea: would this news item cause some more fluctuations in the prices of digital currencies?


The results of this could; however, this is purely theoretical at the current time.


Just to be clear such a machine has not yet been built. This is only a theoretical paper at the moment.


Do I understand correctly that the largest quantum computer that exists today contains fewer than 100 qubits?

Also, there does not seem to be exponential growth in this area: https://www.statista.com/statistics/993634/quantum-computers....

They hit the wall in 2017.

We should be safe for now :)


Those quantum computers can't execute Shor's algorithm, which is required to attack RSA. AFAIK the best relevant factoring result is still 21=3*7 from 2012.

There are claims of bigger factored numbers, but they exploited special cases (e.g. factors differing by only two bits) and have no hope of being extended to attack cryptography.

https://crypto.stackexchange.com/questions/59795/largest-int...

A 2019 paper managed to factor 21 on a 16-qubit ibmqx5, but failed to go up to 35:

> the algorithm fails to factor N=35. This is due to the cumulative errors coming from the increasing number of two-qubits gates necessary to implement the more complex MEF needed for this case

https://arxiv.org/abs/1903.00768
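For context, the quantum part of Shor's algorithm only finds the period r of a^x mod N; the rest is classical number theory. A sketch where the period is found by brute force instead of by a quantum computer:

```python
from math import gcd

def shor_classical(N, a):
    """Classical skeleton of Shor's algorithm: given the period r of
    a^x mod N (found by brute force here, by a quantum computer in
    polynomial time), recover a nontrivial factorization of N."""
    assert gcd(a, N) == 1
    r = 1
    while pow(a, r, N) != 1:     # period finding: the only quantum step
        r += 1
    if r % 2 == 1:
        return None              # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None              # trivial square root: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(21, 2))     # (7, 3) - the hard part done classically
```

Everything above is cheap on a laptop except the period-finding loop, which is exactly the step that blows up classically for 2048-bit moduli and that the quantum Fourier transform is supposed to make polynomial.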


Jennifer and Peter Shor wrote a limerick that seems relevant:

   If computers that you build are quantum,
   Then spies of all factions will want 'em.
   Our codes will all fail,
   And they'll read our email,
   Till we've crypto that's quantum, and daunt 'em.
And Volker Strassen responded at a conference:

   To read our E-mail, how mean
   of the spies and their quantum machine;
   Be comforted though,
   they do not yet know
   how to factorize twelve or fifteen.
Source: http://www-math.mit.edu/~shor/notapoet.html


> Till we've crypto that's quantum, and daunt 'em.

Luckily there are asymmetric algorithms which are secure against quantum computers, so we don't have to resort to quantum key exchanges.


"Secure against quantum" doesn't really mean much because too little is known to make that claim confidently. AFAIK the term generally refers to algorithms that don't rely on factoring being hard, but instead make some different hardness assumptions that we currently don't have classical or quantum algorithms for.


For practically all computationally secure crypto we use "secure" for "no known attacks faster than we'd like". QCs just extend the set of efficient algorithms.


Indeed, but there is a much longer history of people trying and failing to break the schemes, and we have come to understand the hardness assumptions in classical crypto as probably quite reasonable.


There's a similarly long history of people trying and failing to break code-based schemes like Niederreiter and McEliece with binary Goppa codes. Unlike lattice-based schemes they're considered quite secure, but their key sizes are enormous (like, hundreds of KiB to several MiB in size). So they're impractical for embedded security, or any situation where key transfer bandwidth is an issue (IoT, metered data plans, etc).


Does IoT need to transfer keys?

When it comes to metered data, a few megabytes to access a new site sounds like normal internet to me. And you could use a trusted proxy/VPN; there are present-day apps that more or less fill that niche already.


Transfer is one issue, processing is another. If you're running an IoT device on an STM32F423ZH (chosen because my employer has a cell-enabled IoT device on that MCU) you've got 320k of total RAM. Considering there's overhead for the rest of the application, there's no way to use such keys on such a device.

A trusted proxy works if ECC is secure (no quantum computers), but if quantum attacks are practical then there would be no way to verify that the trusted proxy is actually what you think it is.


> Considering there's overhead for the rest of the application, there's no way to use such keys on such a device.

If you wait a small number of years you'll be able to get twice the ram at the same price. So "it would barely fit in ram if nothing else is running" doesn't seem like a significant barrier to me.

> A trusted proxy works if ECC is secure (no quantum computers), but if quantum attacks are practical then there would be no way to verify that the trusted proxy is actually what you think it is.

Your trusted proxy would be using one of the secure multi-megabyte keys, of course. Am I missing something about that arrangement? If signatures would also be enormous then we have a more significant problem to deal with than key size.


The trusted proxy in your scheme exists to avoid needing to deal with the multi-megabyte keys on a device with only a few kilobytes of RAM. It CAN'T use such keys to secure the connection with the embedded device. If it could, it wouldn't be needed.


>There are claims of bigger factored numbers, but they exploited special cases

Also, my understanding is that almost all of these factorizations utilize a "compiled" version of Shor's algorithm. Meaning that you need to know the factors in advance. So it's essentially "confirming" rather than "finding" the factors.


AFAIK factoring 21 is "compiled" Shor, so even that result simplifies the problem a bit.

According to the first link in my post, the numbers bigger than 21 were chosen to have specific mathematical properties and attacked with algorithms even less realistic than compiled Shor.


IBM has 1000 qubits on their roadmap by 2023 [1]. I'm interested to see how that goes, and what we can learn from it about scaling up these systems into the thousands of qubits.

Edit: for anyone interested in learning more about quantum computation, I have a list of resources on my website [2].

[1] https://www.ibm.com/blogs/research/2020/09/ibm-quantum-roadm...

[2] https://cameronperot.com/resources/#quantum-physics


Since I didn't see it mentioned, I assume those are 1000 noisy qubits, so they'll require error correction. I wonder how many ideal qubits they are equivalent to.


>largest quantum computer that exists today contains less than 100 qubits

The numbers reported in the press are physical qubits, not logical qubits. You need multiple physical qubits + error correction to create a single logical qubit. The main type of error correction studied today is something called "surface codes". With this type of error correction it's estimated that thousands of physical qubits will be required to create a SINGLE fully error-corrected logical qubit - and millions in total for a machine capable of attacking RSA.

https://www.ncbi.nlm.nih.gov/books/NBK538709/

We do not have actual quantum computers today and we don't seem to be much closer to having them than we were a decade ago. What we have are really interesting quantum science experiments that get misrepresented by the press (and a handful of companies with a commercial interest in doing so).
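The physical-to-logical overhead can be sketched with the standard surface-code scaling (a textbook back-of-envelope with assumed numbers, not figures taken from the linked review):

```python
# Back-of-envelope surface-code overhead (standard scaling, assumed numbers):
# a distance-d code uses ~2*d^2 physical qubits per logical qubit and
# suppresses the logical error rate roughly as (p / p_th) ** ((d + 1) / 2).
def distance_needed(p, p_th, target):
    d = 3
    while (p / p_th) ** ((d + 1) / 2) > target:
        d += 2                   # surface-code distances are odd
    return d

d = distance_needed(p=5e-4, p_th=1e-2, target=1e-12)  # p: physical error rate
print(d, 2 * d * d)              # 19 722: hundreds of physical qubits per logical
```

The overhead per logical qubit is polynomial, but it multiplies against the thousands of logical qubits and billions of gates a 2048-bit factoring run needs, which is where the millions-of-physical-qubits machine estimates come from.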


The largest production quantum computer that exists today - as in, the largest you personally can get access to - has 5,436 qubits: the D-Wave Advantage system.

Admittedly, it's not a gate-model machine that can run Shor's Algorithm, but it is a quantum computer, and at more than double the number of qubits plus far higher inter-qubit connectivity than our previous D-Wave 2000Q, it definitely demonstrates tremendous progress.

If you have a moment, you can sign up to use it for free at https://cloud.dwavesys.com - we have an online IDE, Jupyter notebook training material and tons of docs, a community forum, and of course some shiny demos that submit problems to the live QPU if you want to try them out.


And the error rate of 10^-3 seems problematic, as well.


Yeah, I was thinking "Waaait, aren't they at <100 qubits still?"


I would have preferred they title it "How to factor 2048 ..."


So, then I am definitely good using 4096 RSA?


Sure, but ridiculously large key sizes like 4096 bit RSA are not really any more resistant to quantum computing than smaller sizes. This is probably a joke, but the suggestion was made that you might be OK with a 1 TB RSA key size:

* https://www.schneier.com/blog/archives/2017/05/post-quantum_...


That pqRSA paper by DJB is a joke, but it's mathematically correct and an interesting thought experiment - RSA really does become post-quantum with a 1 TB key, because the modulus outgrows what Shor's algorithm could feasibly attack. The paper is actually better than that: it proposed an original algorithm, GEECM, that's faster than Shor's algorithm for numbers with many small factors, then showed that pqRSA is safe from GEECM as well. He even submitted the scheme to the NIST Post-Quantum Competition for review (among his more practical submissions like Classic McEliece). It's just hilarious.

> DJB yelling from the back of the room "How much RAM does the NIST benchmarking machine have??" Dustin Moody replying "Dan, we're not benchmarking pqRSA!"

https://crypto.stackexchange.com/questions/59591/why-is-pqrs...

Here's his explanation of the idea:

https://cr.yp.to/talks/2017.06.27/slides-djb-20170627-pqrsa-...


Given the number of Star Wars references in the various NIST Post-Quantum Competition scheme names (CRYSTALS-KYBER, SABER, NewHope) I'm rather sad he didn't call it "Post Quantum RSA - The Phantom Menace".


I fear it is my obligation to point out this excellent screed by Scott Locklin: "Quantum computing as a field is obvious bullshit". A beautiful excerpt from the article:

When I say Quantum Computing is a bullshit field, I don’t mean everything in the field is bullshit, though to first order, this appears to be approximately true. I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible. I also do not have a mathematical proof that we can or can’t make the artificial bacteria of K. Eric Drexler’s nanotech fantasies. Yet, I know both fields are bullshit. Both fields involve forming new kinds of matter that we haven’t the slightest idea how to construct. Neither field has a sane ‘first step’ to make their large claims true.

.....

“quantum computing” enthusiasts expect you to overlook the fact that they haven’t a clue as to how to build and manipulate quantum coherent forms of matter necessary to achieve quantum computation. A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor a n-bit number is 72 x n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.

[1]: https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...
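Taking the excerpt's quoted 72·n³ gate-count figure at face value:

```python
# The excerpt's quoted gate-count estimate for factoring an n-bit number.
def gates(n):
    return 72 * n ** 3

print(gates(4))      # 4608 - matches the figure quoted for factoring 15
print(gates(2048))   # 618475290624, ~6e11 gates for a 2048-bit modulus
```

Whether 72·n³ is the right constant is debatable (circuit constructions vary), but any cubic gate count puts 2048-bit factoring eleven orders of magnitude beyond the handful of gates demonstrated so far, which is the screed's point.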


Obviously this isn't the strongest argument in the world, but, if the whole field is so bullshit why has anyone bothered putting effort into post-quantum encryption? What's the point if quantum computers will never materialize?


It's theoretically possible for someone to record encrypted traffic today, keep a copy of it for a decade or two until quantum computers are available, and then decrypt it in the future using the quantum computer. If you have data today, that needs to be secure for more than a few decades, you may want to move to post quantum crypto soon to stop that potential leak.


Let's rephrase it in terms of product-market fit: what problem is post-quantum encryption going to solve for people to put effort into it?

BTW, the term post-quantum encryption is delicious because we never had a quantum encryption era. Like the post-high-speed-rail era in the US ;)


The sense in which this term uses 'quantum' is as shorthand for a point in time where attacking cryptosystems using quantum computers becomes feasible.

So you can talk ante-quantum and post-quantum in this sense, but it makes little sense to talk about just 'quantum', in exactly the same way that we schedule a lot of things AM and PM but very little for precisely noon.


except we already use qubits in the real world, and we already use entangled pairs in banking, so it's not really all that bullshit anymore. Quantum computers with a significant number (e.g thousands, but really, millions) of qubits, however, are still quite a ways away. But clearly not in the realm of bullshit anymore.


Is this referencing quantum cryptography being used in banking? Any good sources for more info on this, it sounds really interesting.


I'm no expert in quantum computation but my understanding is that using qubits for quantum computing is totally different to using qubits to exchange secrets as in cryptographic key exchange. The latter is using the property that observing the quantum state changes the state, thus an eavesdropper can't interfere when you send qubits over the wire to exchange a secret. That seems very different to arranging large amounts of qubits in a particular way to do useful calculations.

Even in the key exchange case, I'm personally pretty skeptical of its real-world usefulness because it's not really solving a problem we actually have: public key cryptography (even slower post-quantum variants) work just fine and seem a bit more practical than building some quantum infrastructure to send entangled qubits over to augment a secret that's already exchanged.


That was my initial reaction - quantum entanglement for crypto is not the same as quantum computing, and furthermore a hammer in search of a problem that does not exist (yet).


I would love to learn more, please share any relevant useful links.

Edit: Cursorily googled and did not find a reference for usage but plenty of papers talking about potential applications. Might be wrong though.


The authors hide the fact that this would require millions of memory qubits (which need to be as accurate as the normal qubits), so the title is a bit misleading IMHO.


One day you can start calculating private keys based on public keys.

This is the biggest crypto puzzle: find the private key of Satoshi's Bitcoin wallet with 1 million bitcoins. Over $50B in prize money for one crypto puzzle.

This would be AlphaGo moment of quantum computing if you could make that one attack successful even while paying huge price (e.g. years of quantum datacenter work).


Imagine that you find Satoshi's private key and start trying to sell his million bitcoins.

Approximately one minute after you start this process, someone will figure out that either (1) bitcoins no longer securely belong to anyone or (2) Satoshi thinks selling off all his bitcoin is a good idea.

Approximately two minutes after you start this process, the price of bitcoin will plummet.

Sorry, you will not be taking home $50B today.


You're not thinking big enough.

Anyone can sell bitcoins and make money. But if you have the power to destroy bitcoin, well then you can short the entire bitcoin market and clean up.

Frankly, I would just walk in the door of a large hedge fund and sell them the key for $5B. They'll do far better with it than I ever could, and $5B is more money than I can possibly use in my lifetime, while being a tiny cost of business for them.


Following the progress of the computations, even a whiff of this getting close (say, a year away) would cause Bitcoin to suffer runs based purely on the theory possibly becoming reality.


There is a theory that the key to Satoshi's wallet is hidden somewhere in the beginning of the blockchain. So it might not be too wild if the coins started moving.


How could it possibly still be secure, then? Also, I've never heard of that theory and it sounds interesting. Source?


https://bitcointalk.org/index.php?topic=5313377.0

I'm not sure of the details and my understanding of it is limited. But it seems there is some kind of statistical anomaly in the nonces of early blocks, and some people suspect that he purposely formed this anomaly to leave a clue or encode the private key for the wallet.


Assuming nothing has been spent previously from the wallet, you would have to break SHA256 before you could even find the public key you needed to factor. Bitcoin addresses are based on hashes of the public key and SHA256 appears quantum resistant.


IIRC public keys are only revealed on bitcoin once you authorize an outgoing transaction. There are large bitcoin addresses that have never revealed their corresponding public key, such as https://www.blockchain.com/btc/address/37XuVSEpWW4trkfmvWzeg...

As long as you never reuse an address, bitcoin would continue to be resistant to quantum attacks even if you could derive private keys from public keys; you'd still need to invert a hash function to go from address to public key.
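A sketch of why an unspent address hides the public key. The key bytes below are made up, and real Bitcoin applies RIPEMD160 on top of SHA256 (omitted here because hashlib's ripemd160 isn't always available); the point is only that the chain stores a hash, not the key:

```python
import hashlib

# A legacy Bitcoin address commits to RIPEMD160(SHA256(pubkey)), not to the
# public key itself. Until you spend from the address, an attacker with
# Shor's algorithm has no public key to run it against - they'd need a hash
# preimage first. Only the SHA256 stage is shown below.
pubkey = bytes.fromhex("02" + "11" * 32)   # made-up compressed public key
digest = hashlib.sha256(pubkey).digest()
print(digest.hex())                        # all the chain reveals pre-spend
```

Spending reveals the public key in the transaction's signature script, which is why "never reuse an address" is the quantum-resistance caveat: each key is exposed only in the window between broadcast and confirmation.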


Cashing out and actually getting $50B may be as much of a challenge as finding the private key


Hypothetically, you could publicly announce that you'll sell the stash at the rate of 1% per year, to demonstrate faith in the continued growth of bitcoin in order to finance <humanitarian project> and reassure the market that it won't massively crash.


And you could even sign the announcement with one of Satoshi's wallet keys.


Why, do people actively monitor that wallet for activity? I'm assuming if any BTC at all moves out of that wallet it'll cause a massive correction to BTC price as people can no longer assume that those bitcoins are off the market.


I'm sure that enough geeks have all kinds of triggers to spot that kind of activity.


I thought bitcoin was based off EC crypto, not RSA.


EC crypto is just as vulnerable to Shor's algorithm as RSA is.


ECC is based on totally different principles than the RSA math discussed here.


Besides the number of qubits, is "multimode memory" real or hypothetical?


From the abstract: We suggest realizing such an architecture using a microwave interface between a processor made with superconducting qubits and a multiplexed memory using the principle of photon echo in solids doped with rare-earth ions.

I would guess hypothetical


This reminds me to listen to MC Frontalot's Secrets from the Future again. If you haven't heard it yet, you're in for a treat! https://youtu.be/FUPstXCqyus


That's just 6.56 qubits per factored RSA integer - or alternately, 11.57 days per factored RSA integer. Quite impressive either way!


That's not what they mean. The title should read "2048-bit RSA integers".


Layman here. I understand that if said theoretical computer existed, data encrypted and stored using today's standards would for the most part be compromised, outside of further obfuscation, which popular opinion seems to believe only helps so much.

That means the past is compromised, plus some window afterwards depending on implementation. I've always wondered just how much the future is compromised.

I've always thought about encryption this way:

  P = some degree of computational power
  
  A = some small unit of P, like a laptop
  B = the largest unit of P practically
    possible under the same laws of physics as A

  (data encrypted by A cannot be "cracked" by B in a reasonable amount of time)
So in my head, as long as a normal civilian can access qubit technology (admittedly questionable), encryption still works by increasing the number of rounds. What am I missing?

edited for format, then again for clarity


Quantum computers will have civilian access. They already do in fact. And the issue is that they change _the complexity_ for some algorithms so adding rounds isn't going to help.

We'll just migrate to quantum resistant algorithms like we migrated away from MD5.


Quantum computers of this scale are probably 5-15 years out. Basically this is a warning that if you have secrets that should still be kept secret over that timeframe, you should not be using RSA today.


I'd be willing to bet a lot that it's not 5 years, and am still quite confident it's not 15.

Of course nobody knows, but these things are still very far from anything that is running today.


At this rate, maybe the first commercial fusion plants will have asymmetric post-quantum encryption securing their management portal.


no no no, you need high performance enterprise grade http on port 80 listening on all interfaces for a power plant portal


Built with Java(TM) EnterpriseServletContainerGeneratorSingletBean Technology. Has been continuously running since GWB was in office, not a single patch applied!


But even if I agree that it's almost impossible in 5 years, it's better to be cautious and assume it could be broken in 5. And there's also the fact that it could be broken by some government long before anyone finds out, I guess.


Even if it were, 177 days to crack your key with the most advanced technology in the world still means you have some very powerful enemies. Practically, you don't need to think about it.


I.e. you can solve it by rotating your keys/certificates every 90 days, like the expiry term given by Let's Encrypt.


Rotating keys works for authentication, but not for confidentiality, since nothing stops an attacker from recording the ciphertext and decrypting past messages.


In common practice, RSA keys/certificates are used for TLS and similar protocols, where obtaining the key does not allow decrypting past recorded communications: those are encrypted with ephemeral session keys that cannot be determined from observing the handshake even if you have the long-term keys. Obtaining a private key would allow you to MITM future sessions and decrypt them, but it would not break the confidentiality of past messages. Things like PGP-encrypted email files would be vulnerable, but that niche is extremely rare compared to other uses of RSA and similar encryption.


That's not how this works.

An attacker can record the key exchange. That is not using RSA, today it's usually some variation of elliptic curve diffie hellman. But that is just as vulnerable to quantum attacks as RSA. So you're attacking the key exchange, not the RSA signature.

What you're probably alluding to here is the forward secrecy property of TLS. But that is only true under the assumption that the key exchange is secure. In a quantum attack scenario it is not.

So a store-and-decrypt-later attack is still a very valid concern around TLS and quantum computers.


Interesting, I had assumed that the whole purpose of Diffie-Hellman and the like is to ensure that the ephemeral keys generated during the process can't be recovered from recorded traffic, even if you later find out the private keys used by the parties. Or is it the case that it's secure against an attacker who merely learns those private keys, while efficient discrete-log attacks (e.g. Shor's algorithm) can break the whole process?


The server holds the private key associated with the certificate for a long time.

If you don't use ephemeral keys, that means that compromising the server (hacking, subpoena, etc.) will allow an attacker to decrypt any session they recorded which used that long-term key.

If you use ephemeral keys, the attacker needs to learn the ephemeral private key instead of the long term private key to decrypt the session. The server throws away the ephemeral private key (plus the symmetric session key) after a short time. So a later compromise of the server will not allow an attacker to decrypt old sessions.

An attacker who can break the underlying cryptography on the other hand can still break the ephemeral keys used for the connection, since they (unavoidably) learn the public key from the handshake. It might increase cost, because they need to break each connection separately, but at that point the security margin is terribly thin.
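The ephemeral-key idea above can be sketched in a few lines. Below is a toy finite-field Diffie-Hellman exchange, purely illustrative (the prime is far too small for real use; actual TLS uses X25519 or 2048-bit+ MODP groups, and the function name here is made up):

```python
import secrets

# Toy finite-field Diffie-Hellman. P is the Mersenne prime 2^127 - 1 --
# far too small for security, chosen only to keep the demo readable.
P = 2**127 - 1
G = 5

def ephemeral_exchange():
    # Each side draws a fresh private value per session and discards it after.
    a = secrets.randbelow(P - 2) + 1
    b = secrets.randbelow(P - 2) + 1
    A, B = pow(G, a, P), pow(G, b, P)   # public values visible on the wire
    # Both sides compute the same shared secret: (G^b)^a == (G^a)^b mod P.
    return pow(B, a, P), pow(A, b, P)

s1_client, s1_server = ephemeral_exchange()
s2_client, s2_server = ephemeral_exchange()
assert s1_client == s1_server and s2_client == s2_server
# Sessions are independent: a later compromise of one session's secret
# tells an attacker nothing about another session's secret.
assert s1_client != s2_client
```

A quantum attacker, of course, recovers `a` from the public value `A` directly via Shor's algorithm, which is exactly why forward secrecy doesn't help here.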


Diffie-Hellman is on the table to be cracked by quantum computers too, just as much as RSA.


But those ephemeral session keys can be determined from observing the handshake, because the handshake uses asymmetric cryptographic algorithms that are vulnerable in exactly the same way as RSA.


Do these attacks take the same amount of time for one round of encryption vs multiple?


RSA is rarely actually used for doing the message encryption. Instead it is used for negotiating other keys. Perfect-forward-secrecy enables you to get message confidentiality even if your key is leaked after the message is sent.


Diffie-Hellman over finite fields or elliptic curves is just as vulnerable to Shor's algorithm as RSA. I believe ECC might even need fewer qubits to break at the same classical security level, since the numbers are smaller.

There is a small benefit, because now an attacker needs to attack each session separately, instead of attacking a single key to break all sessions with a particular server. But that's hardly something you want to rely on.


Perfect forward secrecy relies on the underlying cryptographic primitives not being broken, which is no longer true with a QC.


Unfortunately, this doesn't solve the complete issue.

Consider that Internet traffic is being recorded today and has been for some time. This means certificate rotation isn't the entire solution: algorithm types and key sizes, along with ephemeral key exchange, should be considered.


Also probably you should think about all the other security low-hanging fruit first.


They would have a 50% chance of cracking it within 90 days, I would have thought.


> you should not be using RSA today

Also, always remember that ECC is believed to be just as weak against quantum computers as RSA. So you'll probably want some hybrid of pre- and post-quantum algorithms.

Or, anyway, get your perfect forward secrecy working and don't rely on asymmetric crypto for confidentiality. Yeah, PFS is very hard to get in some use cases, but it's really the best way to solve this problem.


PFS still involves asymmetric cryptography. And in fact in essentially every modern cryptography stack (ie. (EC)DLP-based) the "PFS primitive" (Diffie-Hellman, ie. scalar multiplication/exponentiation) is the one thing that everything else is derived from.

RSA is in fact somewhat notable for being the only widely used asymmetric cryptosystem where the underlying operation is block cipher-ish encryption primitive and not key agreement primitive.


> RSA is in fact somewhat notable for being the only widely used asymmetric cryptosystem where the underlying operation is block cipher-ish encryption primitive and not key agreement primitive.

Debatable. It's sort of true of RSA-OAEP. That's not true of RSA-PSS (signatures), nor of RSA-KEM (key encapsulation). Really it's the OAEP that's block-cipher like, and the fundamental RSA operation (modular exponentiation) isn't "encryption" or "signing" or "decryption" or "encapsulation" or anything else cryptographic on its own.
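To make the "modular exponentiation isn't inherently encryption or signing on its own" point concrete, here is textbook RSA with toy primes. No padding, not remotely secure, just the raw operation applied in both directions:

```python
# Textbook RSA with toy primes -- the core operation is plain modular
# exponentiation; real use needs large primes plus OAEP/PSS padding.
p, q = 61, 53
n = p * q                      # modulus, 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

m = 65                         # a "message" < n
c = pow(m, e, n)               # raw RSA "encryption"
assert pow(c, d, n) == m       # raw RSA "decryption" recovers m
# The same modexp with the exponents swapped yields a raw "signature":
s = pow(m, d, n)
assert pow(s, e, n) == m       # signature "verification"
```

Whether this primitive becomes encryption, signing, or key encapsulation is entirely a matter of the padding and protocol wrapped around it.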


Forward secrecy relies on asymmetric crypto, it merely uses short lived/one-time keys instead of long lived keys. The most popular algorithm for ephemeral key exchange is (Elliptic curve) Diffie-Hellman which is just as weak against quantum attacks as RSA. So adding a post-quantum algorithm is just as important for PFS as for plain RSA.


Yes, that's how pqcrypto is done today. Run two key exchanges, one we're pretty sure is solid, and one that might be post-quantum-secure but has some risk of actually not being secure at all, and use a robust combiner on their outputs.
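A minimal sketch of such a combiner (the function name is mine; real hybrid schemes such as the TLS hybrid key-exchange drafts differ in detail): hash both shared secrets together, so the derived key stays secret as long as either input secret does.

```python
import hashlib, hmac

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes,
                           transcript: bytes) -> bytes:
    # Concatenate-and-MAC combiner: secure if *either* input is secure.
    # Keying with the handshake transcript binds the output to this session.
    return hmac.new(transcript, classical_ss + pq_ss, hashlib.sha256).digest()

key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32, b"handshake-transcript")
assert len(key) == 32   # SHA-256 output, used to key the symmetric cipher
```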


Five years is incredibly close for this type of computer to be built!


It's an interesting theory but like most items in quantum computing it is purely theoretical. Not sure how much it would cost to build.

I hope someone gets a grant to work out the engineering difficulties in this.


I would say most items in quantum computing are not theoretical. They've been demonstrated. Moreover, quantum physics is by far the most accurate theory of physics we have.

What is still hypothesized is whether these elements, which have been demonstrated to work in small numbers (<100), will work in large numbers.


Until someone successfully demonstrates an error-corrected qubit, it's probably fair to still call a lot of work in the quantum computing realm theoretical.


Could anyone take a stab at the cost of such a computer, if it were possible to build today? Like, I know there are computers of <100 qubits, but how much does one cost?


Side question: if quantum computers fulfill their promise, won't they break encryption as we know it? Are we ready for that kind of upheaval?


> Are we ready for that kind of upheaval?

Quite sure cryptographers are ahead of the curve; they are a conservative bunch. There are multiple finalists in the NIST post-quantum competition with different quantum-hard mathematical properties. An over-abundance of lattice cryptography being standardised could possibly be a problem in the long term, though.

Asymmetric public-key crypto and signatures being broken is the only real threat; there are no theoretical issues with symmetric crypto, hashing, or MACs.

A quantum break already has a multi-billion-dollar bounty on it in the form of cryptocurrency wallets, a bounty that would be hard to match with wages or suppress with coercion. This is probably the best evidence on Earth that no one actually has this capability.


I think Grover's algorithm could be used to find hash preimages with a quadratic speedup. But that should not be an issue, as the speedup can be compensated for by doubling the hash size.
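The arithmetic is simple enough to sketch: Grover turns an n-bit brute-force search of 2^n candidates into roughly 2^(n/2) quantum operations, so doubling the size restores the original security margin. A toy illustration (the function name is mine):

```python
# Grover's algorithm searches N items in ~sqrt(N) steps, so an n-bit
# brute-force target offers roughly n/2 bits of quantum security.
def quantum_security_bits(n_bits: int) -> int:
    return n_bits // 2

assert quantum_security_bits(128) == 64    # marginal against Grover
assert quantum_security_bits(256) == 128   # doubling restores the margin
```

This is why symmetric primitives and hashes survive by sizing up, while RSA/ECC fall to Shor's exponential-to-polynomial speedup that no key-size increase can outrun.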


Thanks. I don't understand your last paragraph. Can you provide a bit more context? Just a starting point for an Internet search :)


The bitcoin ledger is a bit of a special case. Coins in a transaction are sent to output addresses, which are usually derived from a public key by hashing it with SHA-256 and then RIPEMD-160. Using coins as an input to a transaction (spending them) requires exposing the ECDSA public key originally used to create the address.

If a particular address has never sent coins, an attacker has to perform a pre-image attack against SHA-256 and RIPEMD-160, which even for quantum computers costs on the order of 2^128 operations (generally believed to be intractable), before breaking ECDSA even becomes relevant.

Since most original coins have never been sent anywhere, there isn't a way to directly attack their ECDSA keypairs.

There are, however, plenty of addresses holding (a lot of) coins whose public keys are in the blockchain, making them targets for quantum computing attacks. An attacker could solve the discrete logarithm problem for a key, forge a signature on a transaction sending all the coins controlled by that keypair to a new address owned by the attacker, and the transaction would be accepted by everyone.
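A sketch of the "address commits to a hash of the key" point, simplified to SHA-256 only (the real chain applies RIPEMD-160 after SHA-256; it's omitted here because that algorithm is unavailable in some OpenSSL builds). The key bytes are made up:

```python
import hashlib

# An unspent Bitcoin address reveals only a hash of the public key,
# never the key itself, so Shor's algorithm has nothing to work on.
def address_payload(pubkey: bytes) -> bytes:
    # Simplified: real P2PKH also applies RIPEMD-160 to this digest.
    return hashlib.sha256(pubkey).digest()

pubkey = bytes.fromhex("02" + "11" * 32)   # made-up compressed ECDSA key
addr = address_payload(pubkey)
# An attacker seeing only `addr` must invert SHA-256 to even learn
# which public key they would need to attack.
assert len(addr) == 32
```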


I think they're saying that we know nobody's using quantum computing to break crypto yet, because nobody's been draining bitcoin wallets.


Maybe, but if a state actor has achieved this ability, I doubt they would make its existence public knowledge by stealing a few billion US$ in btc. That's chump change compared to the value of being able to surreptitiously crack the cryptographically secure messages of everybody in the world.

Note I'm not saying that I actually think someone has achieved this, I just don't think "no one is stealing btc" is a good non-existence test.


Pretty much, and it's not just bitcoin: basically every cryptocurrency wallet is based on elliptic curve cryptography, and any address that has been used publicly is vulnerable to anyone on Earth with enough coherent qubits to crunch the numbers and steal its contents.


Blockchain ledgers are open and public, and you could think of the contents of any wallet as a bounty for cracking that wallet's private key. Some of the wallets involved in early bitcoin transactions would make perfect targets: they haven't moved in years and are presumed lost, and the BTC in them is worth billions.



Yes and kinda yes.

There's a standardization process going on for post-quantum cryptography at the US NIST. Results are expected before the almighty RSA-breaking quantum computer arrives.

There's still a concern about store-and-decrypt-later (i.e. someone can store encrypted communication today and decrypt it once a QC is available), and how relevant that is depends on some unknowns (how many years do you expect your comms to stay secret? how many years till a usable QC is available?).


Note, for instance, that Google ran an experiment where Chrome connecting to Google hosts used both conventional elliptic-curve Diffie-Hellman over Curve25519 and the post-quantum "New Hope" RLWE scheme. A hash of the results of the two key agreements was used to key ChaCha20 encryption of the stream. A store-and-decrypt attack in this case would have to break both conventional Curve25519 and experimental New Hope.

Note also that the best known quantum attacks on ChaCha20 only cut the effective key size in half, so 256-bit ChaCha20 should still be fine, as long as your key-agreement protocol is quantum-resistant.



