"Our batch prime-generation algorithm suggests that, to help reduce energy consumption and protect the environment, all users of RSA — including users of traditional pre-quantum RSA — should delegate their key-generation computations to NIST or anohter trusted third party. This speed improvement would also allow users to generate new RSA keys and erase old RSA keys more frequently, limiting the damage of key theft."
If you told me this was a parody of NSA disinfo, I'd believe it. But apparently, it's a serious paper by djb and Heninger. What happened? Did they finally crack djb, maybe after tying him to the Appelbaum mess? I had hopes for him because "Keeping crypto insecure" was talking about stuff TLAs certainly didn't want to see in the spotlight, but this is incredibly disappointing. When I read this passage for the first time I actually laughed for five minutes straight because it was so ridiculous.
"However, all trusted-third-party protocols raise security questions (see, e.g.,  and ), and there are significant costs to all known techniques to securely distribute or delegate RSA computations. The challenge here is to show that secure multi-user RSA key generation can be
carried out more efficiently than one-user-at-a-time RSA key generation."
The point is that post-quantum RSA relies on massive key sizes that make generating a single key extremely expensive compared to today's state of the art, and that it's possible to speed this up by generating many keys at the same time. The comment about delegating to NIST was probably meant as a joke, not as any serious recommendation.
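The amortization intuition behind batch key generation can be sketched roughly like this: precompute something expensive once (here, a product of small primes), then reuse it across every candidate in the batch. This is only an illustration of the idea — the paper's actual algorithm uses product/remainder trees, and the function names below are made up for this sketch.

```python
import math
import random

def small_prime_product(limit: int) -> int:
    """Product of the odd primes below `limit`, computed once per batch."""
    product = 1
    for p in range(3, limit, 2):
        if all(p % q for q in range(3, int(p ** 0.5) + 1, 2)):
            product *= p
    return product

def survivors(candidates, prime_product):
    """Drop candidates with a small prime factor: a single gcd against
    the precomputed product replaces many separate trial divisions."""
    return [n for n in candidates if math.gcd(n, prime_product) == 1]

# One shared precomputation serves an entire batch of odd candidates.
rng = random.Random(1)
batch = [rng.getrandbits(256) | (1 << 255) | 1 for _ in range(64)]
P = small_prime_product(10_000)
print(len(survivors(batch, P)))  # candidates with no factor below 10_000
```

The gcd trick only removes candidates with small factors; survivors would still need a full probabilistic primality test, but the shared sieving work is where the per-key cost drops.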
We'd probably be far better off coming up with mechanisms to generate strongly, probabilistically unique pseudo-random domains from which we could (on a per-cryptographic-context basis) generate strong pseudoprimes.
This could most likely be done in a reasonably efficient manner, without resorting to the government backdoor suggested.
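A minimal sketch of what that might look like, assuming a hash-seeded PRNG as the per-context "domain" and Miller-Rabin as the pseudoprime test. The name `context_prime` and the whole construction are hypothetical illustrations of the comment's idea; a real system would use a proper KDF and CSPRNG, not `random.Random`.

```python
import hashlib
import random

def is_probable_prime(n: int, rng, rounds: int = 40) -> bool:
    """Miller-Rabin probabilistic primality test."""
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def context_prime(context: bytes, bits: int = 512) -> int:
    """Derive a probable prime from a per-context pseudo-random domain.

    The domain is seeded by hashing the context label, so each
    cryptographic context searches its own region of candidates.
    (Illustrative sketch only, not a vetted construction.)
    """
    rng = random.Random(hashlib.sha256(context).digest())
    while True:
        # Odd candidate of the requested size, drawn from the domain.
        candidate = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate, rng):
            return candidate
```

Because the domain is derived deterministically from the context label, the same context always yields the same prime, while distinct contexts land in (almost certainly) disjoint candidate regions.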
That equation flips if quantum computing becomes a real threat, and the numbers in the paper don't appear to change that at all: the key sizes theorized here are, for instance, far bigger than the keys we use in RLWE schemes.
As others here have noted: the paper we're talking about is not entirely serious.
I seem to recall that some schemes have issues with keys being too large because of things like prime scarcity.
He's merely suggesting the same thing: use a really large key (terabit size), and since quantum computers are quite exotic, it will be an unfavorable avenue of attack.
I think there's a much better chance we move off to still slower, but not as slow, quantum-resistant crypto algorithms, than just much larger RSA keys.
>> We can't even get people off SHA1 to SHA2 "because of lower performance." Somehow I doubt the "1 terabit RSA key" would go over well with them.
This seems like the real point of this paper, no? The rest seems like a joke.
Not every research needs to produce a result that is ready-to-use in everyone's home.
I don't know if quantum computers exist, but I'm sure once they do, the people who build them will keep them secret.
The hard part is hardware, and you need a minimum number of qubits for it to be of any use in breaking RSA. That number is on the order of N^2, where N is the number of bits, so you need qubits on the order of hundreds of thousands if not millions, while the state of the art is around 10.
D-Wave is approaching this equivalence, for certain definitions of quantum computer.
They not only exist but have already successfully factored small numbers. The question is whether an agency with a big enough budget, enough secrecy, and highly qualified personnel hasn't already scaled up the reliability and capacity of existing technology.
Probably right. UNLESS it's a private company and it's losing bids from government agencies that would otherwise pay for silence. And in this case, you could assume two quantum computer designs exist and the loser is now looking for new markets/customers.
The storage space and network bandwidth are not free.