Why Quantum Computers Might Not Break Cryptography (quantamagazine.org)
52 points by markhkim 191 days ago | 23 comments



I am worried about the following passage from the paper <https://cr.yp.to/papers/pqrsa-20170419.pdf>:

"Our batch prime-generation algorithm suggests that, to help reduce energy consumption and protect the environment, all users of RSA — including users of traditional pre-quantum RSA — should delegate their key-generation computations to NIST or anohter trusted third party. This speed improvement would also allow users to generate new RSA keys and erase old RSA keys more frequently, limiting the damage of key theft."

If you told me this was a parody of NSA disinfo, I'd believe it. But apparently it's a serious paper by djb and Heninger. What happened? Did they finally crack djb, maybe after tying him to the Appelbaum mess? I had hopes for him because "Making sure crypto remains insecure" talked about stuff the TLAs certainly didn't want in the spotlight, but this is incredibly disappointing. The first time I read this passage I laughed for five minutes straight because it was so ridiculous.


I read the paper, and in context it's not so clearly incriminating. You've left out the critical sentence that immediately follows:

"However, all trusted-third-party protocols raise security questions (see, e.g., [19] and [24]), and there are significant costs to all known techniques to securely distribute or delegate RSA computations. The challenge here is to show that secure multi-user RSA key generation can be carried out more efficiently than one-user-at-a-time RSA key generation."

The point is that post-quantum RSA relies on massive key sizes that make generating a single key extremely expensive compared to today's state of the art, and that it's possible to speed this up by generating many keys at the same time. The comment about delegating to NIST was probably meant as a joke, not a serious proposal.
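To make the amortization concrete, here's a rough, self-contained Python sketch of batch trial division with a shared remainder tree. This is illustrative only; the paper's actual batch algorithm (batch smoothness detection with product trees) is more sophisticated, and the parameters here are arbitrary:

    import math
    import random

    def sieve_primes(bound):
        # Sieve of Eratosthenes.
        flags = bytearray([1]) * (bound + 1)
        flags[0:2] = b"\x00\x00"
        for p in range(2, int(bound ** 0.5) + 1):
            if flags[p]:
                flags[p * p :: p] = bytearray(len(range(p * p, bound + 1, p)))
        return [i for i, f in enumerate(flags) if f]

    def product_tree(xs):
        # Levels of pairwise products, leaves first, root last.
        tree = [list(xs)]
        while len(tree[-1]) > 1:
            level = tree[-1]
            tree.append([math.prod(level[i:i + 2]) for i in range(0, len(level), 2)])
        return tree

    def remainder_tree(n, xs):
        # Compute n mod x for every x in xs, sharing the big reductions.
        tree = product_tree(xs)
        rems = [n % tree[-1][0]]
        for level in reversed(tree[:-1]):
            rems = [rems[i // 2] % x for i, x in enumerate(level)]
        return rems

    def is_probable_prime(n, rounds=40):
        # Miller-Rabin primality test.
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):
            x = pow(random.randrange(2, n - 1), d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def batch_primes(nbits, count):
        # Generate `count` primes of `nbits` bits each, sieving a whole
        # batch of candidates against all small primes in one pass.
        small = math.prod(sieve_primes(100000)[1:])  # skip 2; candidates are odd
        found = []
        while len(found) < count:
            cands = [random.getrandbits(nbits) | (1 << (nbits - 1)) | 1
                     for _ in range(4 * count)]
            rems = remainder_tree(small, cands)
            found += [c for c, r in zip(cands, rems)
                      if math.gcd(r, c) == 1 and is_probable_prime(c)]
        return found[:count]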


I find their assertion somewhat disingenuous: in security, things are often done (and are known to be, and are deliberately, done) in rather inefficient ways; for example, algorithms that run in fixed time despite obvious case-specific optimisations.

We'd probably be far better off coming up with mechanisms to generate strongly probabilistically unique pseudo-random domains from which we could (on a per-cryptographic-context basis) generate strong pseudo-primes.

This could most likely be done in a reasonably efficient manner, without resorting to the suggested government backdoor.
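As a toy illustration of the per-context domain idea (a minimal sketch; the SHA-256 counter-mode expansion is an ad-hoc construction made up here for illustration, not an existing scheme, and the context would of course have to include secret randomness):

    import hashlib
    import random

    def candidate_from_context(context: bytes, counter: int, nbits: int) -> int:
        # Expand (context, counter) into an odd nbits-bit candidate with
        # the top bit set, using SHA-256 in counter mode as an ad-hoc KDF.
        stream, block = b"", 0
        while 8 * len(stream) < nbits:
            stream += hashlib.sha256(
                context + counter.to_bytes(8, "big") + block.to_bytes(4, "big")
            ).digest()
            block += 1
        cand = int.from_bytes(stream, "big") >> (8 * len(stream) - nbits)
        return cand | (1 << (nbits - 1)) | 1

    def is_probable_prime(n: int, rounds: int = 40) -> bool:
        # Miller-Rabin primality test.
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):
            x = pow(random.randrange(2, n - 1), d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def context_prime(context: bytes, nbits: int = 2048) -> int:
        # Walk the deterministic per-context domain until a probable
        # prime turns up.
        counter = 0
        while True:
            cand = candidate_from_context(context, counter, nbits)
            if is_probable_prime(cand):
                return cand
            counter += 1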


Pretty sure it's not serious.


There are post-quantum public key algorithms which are much more efficient than terabit-size RSA.

https://en.wikipedia.org/wiki/Post-quantum_cryptography


This seems pretty academic: even if RSA can be kept alive at some enormous key size, the reason we use RSA and not lattices or isogenies is that RSA is more practical.

That equation flips if quantum computing becomes a real threat, and the numbers in the paper don't appear to change that at all: the key sizes theorized here are, for instance, far bigger than the keys we use in RLWE schemes.

As others here have noted: the paper we're talking about is not entirely serious.


TL;DR: RSA keys can be generated large enough (1TB was the example in the article) that factoring them would still take an inordinate amount of time, in spite of the potential speed advantages available to a quantum computer.


Are 1TB keys still secure?

I seem to recall that some schemes have issues with keys being too large because of things like prime scarcity.
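For what it's worth, primes don't actually get scarce: by the prime number theorem, roughly one in k·ln 2 of the k-bit numbers is prime, so even enormous primes are plentiful (just expensive to find and test). And IIRC the paper sidesteps huge primes anyway by building the terabyte modulus as a product of many 4096-bit primes. A quick sanity check:

    import math

    # Prime number theorem: the density of primes near 2^k is about
    # 1 / (k * ln 2), so k-bit primes stay plentiful even for huge k.
    for k in (2048, 4096, 10 ** 6):
        print(f"near 2^{k}: about 1 in {round(k * math.log(2)):,} integers is prime")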


That is an interesting argument... Today we assume RSA is "safe" because we can estimate the maximum computational power of an attacker and use a key size that makes cracking the key an unfavorable avenue of attack.

He's merely suggesting the same thing: use a really large key (terabit size), and since quantum computers are quite exotic, cracking it will remain an unfavorable avenue of attack.
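Back-of-the-envelope, assuming Shor's circuit costs on the order of n^2·log n qubit operations for an n-bit modulus (a common rough estimate; the paper works with a more careful cost model):

    import math

    n = 8 * 2 ** 40              # a 1-terabyte modulus is 2^43 bits
    ops = n ** 2 * math.log2(n)  # assumed Shor cost: n^2 * log2(n)
    print(f"about 2^{math.log2(ops):.0f} qubit operations")  # ~2^91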


We can't even get people off SHA1 to SHA2 "because of lower performance." Somehow I doubt the "1 terabit RSA key" would go over well with them.

I think there's a much better chance we move off to still slower, but not as slow, quantum-resistant crypto algorithms, than just much larger RSA keys.


  >> We can't even get people off SHA1 to SHA2 "because of lower performance." Somehow I doubt the "1 terrabit RSA key" would go over well with them.
So apparently we are going to need quantum processors to generate encryption that is resistant to quantum processors.


GPG even disabled the generation of RSA keys larger than 4096 bits.


> As part of the attack analysis, this paper introduces a new quantum factorization algorithm that is often much faster than Shor’s algorithm and much faster than pre-quantum factorization algorithms.

This seems like the real point of this paper, no? The rest seems like a joke.


So... what is the argument exactly? That quantum computers won't be fast enough to break a key that's larger than most people's hard drives and would be utterly useless in practice?


The argument is that it is possible to use RSA that is post-quantum safe in the real world.

Not all research needs to produce a result that is ready to use in everyone's home.


If I had a quantum computer that was capable of breaking RSA, I wouldn't tell anyone. The whole point is to be able to spy on people, and you wouldn't be able to do that if everyone knew that RSA was broken.

I don't know if quantum computers exist, but I'm sure once they do, the people who build them will keep them secret.


A high-end desktop is better at simulating quantum computers than any quantum computer we've ever built [1]. The best quantum computers achieve about 10 qubits. Furthermore, quantum operations are slower than classical ones: a classical computer can do a 64-bit operation in ~0.25ns (depending on clock speed), while a quantum computer takes about 5ns. Quantum computers also don't scale like classical ones: two classical computers can solve twice as many problems, or problems twice as hard, while two quantum computers can only solve twice as many problems.

The hard part is hardware, and you need a minimum number of qubits before the machine is of any use against RSA. The logical-qubit count scales linearly with the key length (roughly 2N qubits for an N-bit modulus), and once you add error-correction overhead you're looking at physical qubit counts in the hundreds of thousands if not millions, while the state of the art is around 10.

[1] D-Wave is approaching this equivalency, for certain definitions of quantum computers.
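For scale, here's a rough logical-qubit count, assuming the well-known 2N+3 circuit construction for Shor's algorithm (error correction then multiplies the physical count by orders of magnitude):

    # Logical qubits to run Shor against an N-bit RSA modulus,
    # assuming a 2N+3 circuit construction.
    for n in (1024, 2048, 4096):
        print(f"RSA-{n}: about {2 * n + 3:,} logical qubits")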


> I don't know if quantum computers exist, but I'm sure once they do, the people who build them will keep them secret.

They not only exist but have already successfully factored small numbers (15 = 3 × 5 is the classic demonstration) [0]. The question is whether an agency with a big enough budget, enough secrecy, and highly qualified personnel has already scaled up the reliability and capacity of existing technology.

[0]: https://en.wikipedia.org/wiki/Shor's_algorithm


> I don't know if quantum computers exist, but I'm sure once they do, the people who build them will keep them secret.

Probably right. UNLESS it's a private company that's losing bids from government agencies which would otherwise pay for silence. In that case, you could assume two quantum computer designs exist and the loser is now looking for new markets/customers.


I feel like the paper by DJB, which this article is based on, was some sort of satire.


It absolutely is - one of the keywords is "Make RSA Great Again", ffs.


My question is: how practical would a 1TB RSA key be for average users? I assume the size of the ciphertext is somewhat dependent on the key size.

Storage space and network bandwidth are not free.
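Right: a raw RSA ciphertext or signature is an integer modulo the public modulus, so each output is as large as the key itself.

    def rsa_output_bytes(modulus_bits: int) -> int:
        # Raw RSA ciphertexts and signatures are integers mod n,
        # so each one is as large as the modulus.
        return (modulus_bits + 7) // 8

    print(rsa_output_bytes(2048))         # 256 bytes
    print(rsa_output_bytes(8 * 2 ** 40))  # ~1 TB per ciphertext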


Who gives a damn about a terabyte-size key? What a pointless article.




