Phrasing like "NIST reveals" doesn't quite convey the huge actual shit they're in and their failure to make progress on this. It might even suggest that having such a huge number of candidates (26) is in any way positive (instead of an admission of their incompetence wrt narrowing the number down to a manageable few).
Seriously watch DJB/TL's video for some background on what is actually happening.
Anyone know what "recent events" refers to?
The problem is the key size: 512 KB for normal use, 8.5 MB against quantum computers. On the other hand, 1 MB is not that much if you really want the security.
They're even trying to replace RSA, at ~2 kb (2048 bits) per key.
Part of it is also the human factor. Crypto is way easier if your keys are short enough to write down. You don't really want to be moving keys around on flash drives; it's hard to securely erase files unless the drives themselves are encrypted.
If we learned anything from the past it's that crypto works and can be deployed if it integrates without users noticing much and if the computational and speed overhead is small (or negative, which in the HTTPS case it often is).
McEliece ciphertexts are unusually small for post-quantum cryptography. The small ciphertext size can be much more important for total traffic than the large key size.
McEliece is relatively fast and has really fast implementations in hardware, so the speed is less of a problem. It does just binary vector operations. Encapsulation is just one pass.
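To make "just binary vector operations" concrete, here's a minimal toy sketch. This is NOT real McEliece (the matrix below is plain random, not a disguised Goppa code, and the sizes are tiny); it only illustrates why encapsulation is one cheap pass of XORs, and why the ciphertext (one n-bit vector) is so much smaller than the public key (a k-by-n bit matrix):

```python
import random

# Toy parameters; real Classic McEliece uses roughly k~5000, n~7000,
# which is why public keys run to hundreds of KB while ciphertexts
# stay under a kilobyte.
k, n, t = 32, 64, 4

random.seed(1)
G = [random.getrandbits(n) for _ in range(k)]   # public matrix, rows as n-bit ints
m = random.getrandbits(k)                        # k-bit message

# c = m*G + e over GF(2): XOR together the rows of G selected by the
# bits of m (one pass over the matrix)...
c = 0
for i in range(k):
    if (m >> i) & 1:
        c ^= G[i]
# ...then flip t random bit positions (the error vector e).
for pos in random.sample(range(n), t):
    c ^= 1 << pos

pubkey_bits = k * n   # the big thing you store/distribute
ciphertext_bits = n   # the small thing you actually send
print(pubkey_bits, ciphertext_bits)
```

At real parameter sizes the same asymmetry holds: the key is megabyte-scale, the per-message traffic is not.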
There are plenty of middlebox devices on the Internet that will drop your traffic if you send any HTTP over them that they don't understand. So you're kinda "locked in", unable to deploy new protocol features. Encryption provides a way out of this.
The Caesar cipher worked until it didn't, the Enigma was secure until it was broken, and DES was good until it wasn't.
Physics still has some backdoors (notably reversible computing which, as far as I am aware, has no fundamental limit), but the bigger problem is that complexity theory has not advanced to a point where we can make meaningful use of this. (The most I have ever seen is computing the lower bound on energy to enumerate all keys.)
This is probably a solvable problem, but it is going to be harder than P vs NP.
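The "lower bound on energy to enumerate all keys" mentioned above is usually a Landauer-limit estimate. A hedged back-of-envelope version (the numbers are standard physical constants; the framing as an irreversible bit-flipping counter is the usual simplification, and reversible computing sidesteps this bound entirely, as noted above):

```python
import math

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2)
# joules. An irreversible counter stepping through every key pays at
# least one bit-flip's worth of energy per key.
k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # room temperature, kelvin
per_flip = k_B * T * math.log(2)   # ~2.9e-21 J

e128 = per_flip * 2**128   # ~1e18 J: enormous, but physically attainable
e256 = per_flip * 2**256   # ~3e56 J: dwarfs the Sun's total lifetime output (~1e44 J)
print(e128, e256)
```

Which is roughly why 128-bit brute force is "merely" absurd while 256-bit is physically ruled out for irreversible computation.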
Edit: I think I was right, as the mathematical term "perfect secrecy" from the theorem is not particularly practical (lack of perfect secrecy does not imply the existence of a way to break the cipher). http://www.cs.miami.edu/home/burt/learning/Csc609.011/Perfec...
It is reasonable to think that we could or have already built a symmetric cipher that may never be broken.
Recall the history of ECC. ECC was proposed in the mid-1980s. Unlike RSA at its introduction, we already had a solid understanding of its security properties at that time. However, it took 15 years to actually implement and deploy the NIST curves on the Internet. And then another 5 years to deploy a more robust version, Curve25519.
Also, there are enough reasons to believe that the NSA and other major agencies around the world have a storage capacity of several hundred exabytes. Signal traffic, Tor traffic, and OpenPGP traffic today is already being recorded and kept indefinitely, waiting for a usable quantum computer to decrypt it in the future.
BTW, here's a fun fact: although cryptographers are still trying, there is no known exploitable structure in most symmetric encryption algorithms (such as ChaCha20 or AES) or hash functions. The only known quantum attack is the generic one using Grover's algorithm, which at most halves the effective key length. It means, if you are afraid that the NSA may decrypt your messages in the future, you can always use 256-bit symmetric encryption with pre-shared keys. For example, WireGuard supports using pre-shared keys to encrypt ECC handshakes as a poor man's version of PQC.
If I understood the WireGuard paper correctly (https://www.wireguard.com/papers/wireguard.pdf), the pre-shared key is not used to encrypt the ECC handshake, or to encrypt anything at all; it's just mixed together with the output from the ECC handshake while deriving the data keys.
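A minimal sketch of that "mix, don't encrypt" idea. This is NOT WireGuard's actual construction (WireGuard chains HMAC-BLAKE2s per the Noise protocol spec, and the label string below is made up); it just shows the shape: the ECDH output and the PSK are both fed into an HKDF-style extraction, so the session key stays secret as long as EITHER input does. A quantum attacker who later recovers the ECDH secret still lacks the PSK:

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract step (RFC 5869 style): HMAC the input keying
    material under a salt/chaining value."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

ecdh_secret = os.urandom(32)   # stand-in for the X25519 shared secret
psk = os.urandom(32)           # the pre-shared symmetric key

# "handshake-chain" is a hypothetical label, not a WireGuard constant.
chain = hkdf_extract(b"handshake-chain", ecdh_secret)
# The PSK is mixed into the derivation; it never encrypts anything.
session_key = hkdf_extract(chain, psk)
print(session_key.hex())
```

The design point: mixing via a KDF means the PSK upgrades security without adding a second cipher layer or changing the wire format.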
Recently, Merkle-tree signatures have indeed made significant progress. XMSS is already standardized, has an RFC, and my Linux distro is already shipping OpenSSH with XMSS authentication enabled. But I don't think it's something practical that you want people to start using immediately.
First, the threat to digital signatures created by a quantum computer is not as serious as the threat to public-key encryption. If the NSA has intercepted my ECDH handshake, they can recover my session key in the future. But if, by 2045, the NSA has managed to forge my signing key, they cannot use it to log into my server or create false OpenPGP-signed statements: either the server has been taken offline for a long time, or at least the key has already been retired. Unless it's a digital contract of paramount importance (excluding blockchains, do we really have digital contracts with a validity of more than 20 years?), I don't think it needs post-quantum capabilities today.
Second, XMSS and many other Merkle-tree-based systems have a serious limitation: statefulness. If the private key is not updated correctly after each signature and a state is ever reused, the private key is exposed for everyone to forge. It means starting to use XMSS today creates more hazards than not using it.
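Here's a toy illustration of why state reuse is fatal, using a plain Lamport one-time signature over 8-bit messages. This is far simpler than XMSS's actual WOTS+ leaves, but the failure mode is the same: each leaf is a one-time key, and signing twice with the same leaf (reused state) leaks enough preimages to forge:

```python
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()
BITS = 8  # toy message size; real schemes sign a 256-bit digest

# One-time key: a secret preimage pair per message bit; the public key
# is their hashes.
sk = [(os.urandom(32), os.urandom(32)) for _ in range(BITS)]
pk = [(H(a), H(b)) for a, b in sk]

def sign(msg: int):
    # Reveal one secret preimage per message bit.
    return [sk[i][(msg >> i) & 1] for i in range(BITS)]

def verify(msg: int, sig) -> bool:
    return all(H(sig[i]) == pk[i][(msg >> i) & 1] for i in range(BITS))

# Signer reuses the one-time key for two different messages (the XMSS
# "state reuse" accident):
sig1 = sign(0b00001111)
sig2 = sign(0b11110000)

# The attacker now holds a preimage for BOTH values of every bit
# position, so they can forge a signature on any message they like:
known = [{(0b00001111 >> i) & 1: sig1[i],
          (0b11110000 >> i) & 1: sig2[i]} for i in range(BITS)]
forged_msg = 0b10100101
forged_sig = [known[i][(forged_msg >> i) & 1] for i in range(BITS)]
print(verify(forged_msg, forged_sig))  # → True: the forgery verifies
```

With a single signature per key this scheme is fine; one duplicated counter value and it collapses, which is exactly the operational hazard of stateful hash-based signatures.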
I do see some reasons to standardize XMSS: (1) Unlike public-key encryption schemes, the security properties of Merkle-tree-based signature schemes are well known, with strong guarantees. (2) Standardizing it early lets people become familiar with the standardization process, easing future rounds of PQC standardization. (3) It allows people to implement and deploy it in experimental setups.
But I'd really like to see the standardization of a stateless hash-based signature scheme, like djb's SPHINCS-256.
I'm curious about your takes on this topic.
You as well. I bookmarked your rebuttal to dismissing NIST by default. I had one but yours is way better. I'll just drop that link from now on. :)
"XMSS is already standardized, has an RFC, and my Linux distro is already shipping OpenSSH with XMSS authentication enabled. "
I've been away from crypto for a while. I had no idea this all happened. Thanks for the tip!
"the threats to digital signature created by a quantum computer is not as serious as public-key encryption."
For public-key encryption, I was recommending using several schemes together, such as today's strongest plus NTRU and/or McEliece: at the least, one that's strong in the classical model plus one that's strong in the quantum model. This is for people concerned about governments, large corporations, and organized crime (including cartels), the groups most likely to be able to afford such codebreaking capabilities and willing to use them on their opponents. Damage can happen even a long time later; that's why the U.S. government defaults to around 40 years before declassification of TS data, to water down the effects of release.
For signatures, a lot has changed since I last looked at it. The Certificate Transparency schemes might be combined with Merkle schemes in interesting ways. There's also lots of roll-outs on Git/Github which is versioned. People are using smart contracts and blockchains. Then, there's lots of legacy tech with the CA's and DNS system. I think any new ideas about how to develop these alternative techs and get them adopted should be designed to fit within what's going on. I'd probably have to think deeply about all that before I could give you fresh answers on signatures. I'll definitely write it up on these threads if I ever do. Right now, I'm still more focused on getting medium-to-high assurance of design/implementation more cost-effective.
Currently I’m not convinced that we’ve built a decent general quantum computer with more than five qubits. Long way to go.
Your estimate of q=8192 is for logical qubits. The actual physical qubits will realistically go to millions, unless some really fundamental breakthroughs are made.
I don’t think this is true. It is estimated that breaking 2048-bit RSA only requires around 4,000 logical qubits (though the number of physical qubits would be much higher).
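Connecting the two estimates above is rough, assumption-heavy arithmetic; the per-logical-qubit overhead below is a ballpark for surface-code error correction and depends heavily on physical gate error rates and code distance, not a firm number:

```python
# ~4,000 logical qubits for RSA-2048, times an assumed error-correction
# overhead of ~1,000 physical qubits per logical qubit (ballpark only):
logical = 4000
physical_per_logical = 1000  # assumption, varies by orders of magnitude
physical = logical * physical_per_logical
print(physical)  # → 4000000, i.e. "millions of physical qubits"
```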
And you also need to remember that the NSA seems to have known about "differential cryptanalysis" before everybody else, and actually made DES more resistant to it.
"Trust, but verify" should always be the rule.
NIST's predecessor (the NBS) submitted the DES candidate to the NSA for feedback, and the NSA proposed S-box changes without explaining why.
Turns out those changes made DES more resilient to differential cryptanalysis.
Once upon a time, NIST arranged a public, open competition for a symmetric encryption algorithm, and received submissions from multiple teams of cryptographers, such as Bruce Schneier's (Twofish), Ron Rivest's (RC6), and IBM's (MARS). One scheme was submitted by two cryptographers from Belgium, Vincent Rijmen and Joan Daemen. It survived multiple rounds of review by cryptographers inside and outside NIST, and finally Rijndael became AES. Although it suffers from minor imperfections, such as timing hazards and the related-key hazard in AES-256, it's universally recognized as the solid standard for symmetric encryption.
Another time, NIST arranged another public competition and again received submissions from multiple teams of cryptographers. The scheme proposed by the Keccak team ultimately became SHA-3. It doesn't use the classic Merkle–Damgård construction, is immune to length-extension attacks, and is arguably more secure than SHA-2, which was designed by the NSA. (And the community-run Password Hashing Competition, though not a NIST effort, gave us Argon2 the same way; it's currently considered one of the best password hashing schemes against GPU attacks.)
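The length-extension immunity mentioned above has a concrete practical payoff. With a Merkle–Damgård hash like SHA-256, the naive keyed construction H(key || msg) is broken: the digest IS the hash's internal state, so an attacker who sees the MAC can append to msg without knowing the key, which is largely why HMAC exists. With SHA-3's sponge, the naive construction is considered sound (NIST's KMAC builds on this). A sketch of the two idioms (the stdlib can't easily demonstrate the actual extension attack, so only the safe constructions are shown):

```python
import hashlib
import hmac

key, msg = b"secret-key", b"amount=100"

# With SHA-2 you must wrap the keyed hash in HMAC to block
# length-extension forgeries:
mac_sha2 = hmac.new(key, msg, hashlib.sha256).hexdigest()

# With SHA-3, the plain prefix-keyed hash is already safe,
# no HMAC wrapper needed:
mac_sha3 = hashlib.sha3_256(key + msg).hexdigest()

print(mac_sha2[:16], mac_sha3[:16])
```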
And yet another time, NIST didn't arrange any competition; it gave us Dual_EC, and the parameters for the NIST curves. But their problems were quickly identified by the public as well. When the NIST curves were first announced, they received immediate criticism on Usenet. The Dual_EC case was more obvious: even before it was formally published in 2007, the possibility of an asymmetric backdoor had already been shown by researchers, yet, despite the criticism from the public, NIST still standardized it. The documents from Snowden simply confirmed the widespread suspicion.
But when it comes to open competitions, NIST primarily provides a public platform for researchers, so the U.S. government gets to know what is secure by receiving complete, free cryptanalysis from the best cryptographers. Unless you believe in a conspiratorial view of history in which all the prominent cryptographers around the world are under the secret control of the NSA, there is no reason to believe the results of these open competitions, conducted under the scrutiny of public review, are compromised. Yes, it's possible that NIST will manipulate the final result during the standardization process, but if that happens, you are free to use the non-standard version.
For example, the standard SHA-3 has different parameters from the Keccak team's original submission. NIST said it was a tradeoff between acceptable security and performance. Although there is zero evidence of wrongdoing, if you have reason to believe the standard parameters are backdoored, feel free to use the original Keccak function, or you can just use BLAKE.
My point is: yes, the final standard may be backdoored, yes, the NSA may manipulate the industry to adopt problematic standards, and yes, sometimes there are reasons to distrust standards and their commercial products. But no, there is no reason to believe that the studies and findings from a public competition itself are harmful or backdoored. The raw research during the competition is valid and valuable.
And there is even less reason to worry about the PQC competition. Currently, our understanding of post-quantum cryptography is rather limited. Some schemes are well understood and enjoy strong confidence, such as classic McEliece or hash-based signatures, but their performance is low and key sizes are huge, making them unsuitable for many applications, so people are searching for more efficient alternatives. Other algorithms are proposed and broken within months: around 20 submissions from the first round have already been broken.
I think even NIST cannot be certain about the security currently... so for now it's still in the research stage. We can start worrying about backdoors in 2022, but probably not now.
I think it may be a viable alternative in the future. If everyone's having problems with NIST, perhaps we can have more non-governmental competitions organized by academia and industry, just like the Password Hashing Competition. I believe the CAESAR competition is also non-governmental, and it has produced great results as well.
The NIST is a known danger - they have earned their reflexive blind distrust. I wouldn't even trust their relative rankings.
Yes, there is a difference. It's always possible that the telephone traffic is being intercepted by some entities, like Room 641A.
On the other hand, a proper random number generator or public-key encryption algorithm shouldn't even have the theoretical possibility of containing a backdoor to start with. When cryptographers have published papers giving strong evidence that an algorithm is designed in a way that could theoretically contain a backdoor, you know something is seriously wrong.
> they have earned their reflexive blind distrust.
Of course, after Snowden presented his evidence, everybody now understands that everything that is designed to be backdoorable will be backdoored. If NIST proposes a new standard, it will definitely be taken with a grain of salt.
I think I've made my point clear:
> yes, the final standard may be backdoored, yes, the NSA may manipulate the industry to adopt problematic standards, and yes, sometimes there are reasons to distrust standards and their commercial products. But no, there is no reason to believe that the studies and findings from a public competition itself are harmful or backdoored. The raw research during the competition is valid and valuable.
The vast majority of research takes place OUTSIDE of NIST; until the late stages, NIST is merely a forum for everyone, including the NIST people, to submit their comments and attacks. Once we have reached the final round, NIST will pick a winner and standardize it. And perhaps add a backdoor.
The point is, if something looks fishy, you don't have to use the one NIST standardized. You can simply choose another finalist, or even create your own, based on the knowledge of secure constructions revealed during the competition. Normally, at this point, the competition has already given enough information about what is secure and what is not.
To summarize, NIST standards may be bad, but the knowledge obtained during the research for the competition is always good.
In 2022, if NIST starts standardizing something, then your comment makes perfect sense. But it's largely off-topic at the current moment.