NIST Reveals 26 Algorithms Advancing to the Post-Quantum Crypto ‘Semifinals’ (nist.gov)
116 points by pseudolus on Feb 13, 2019 | 55 comments



This talk by djb and Tanja Lange is a good overview of the NIST candidates and the current state of pqcrypto: https://www.youtube.com/watch?v=ZCmnQR3_qWg


This is a must-watch. It puts the self-congratulatory, hyperbolic, and optimistic wording of the article headline into much-needed perspective.


I’m sorry, what about this headline is “hyperbolic” or “self-congratulatory”? There are 26 algorithms, and they are advancing to another round of consideration. Or are you referring to the claim of “post-quantum”?


I was not referring to PQ but to the headline and content of the article, which is written as though they achieved something and now deserve a pat on the back ...

phrasing like "NIST reveals" doesn't quite reveal the huge actual shit they're in and their failure to make progress on this. It might even suggest that having such a huge number (26) is in anyway positive (instead of admitting to their incompetence wrt narrowing the number to a manageable few).

Seriously, watch DJB/TL's video for some background on what is actually happening.


The names of the 26 candidates can be found at:

https://csrc.nist.gov/news/2019/pqc-standardization-process-...


"For the 2nd round candidates, NIST will allow the submission teams the option of providing updated specifications and implementations (i.e. “tweaks”). The deadline for these tweaks will be March 15, 2019. We originally planned that submission teams would have more time, however recent events out of our control have altered the timeline."

Anyone know what "recent events" refers to?



So the original plan was to announce this earlier then?


I believe they were supposed to be announced at RWC in early January. https://quantumcomputingreport.com/news/nist-to-announce-rou...


The federal shutdown affected the timeline as NIST employees were not allowed to work in January.


Classic McEliece is 40 years old and still kicking.

The problem is the key size: about 512 kbit for normal use, and about 8.5 Mbit against quantum computers. On the other hand, 1 MB is not that much if you really want the security.


That really is a problem for anything bandwidth constrained, even today.

They're even trying to replace RSA, and that's only ~2 kbit per key.

Part of it is also the human factor. Crypto is way easier if your keys are short enough to write down. You don't really want to be moving keys around on flash drives; it's hard to securely erase files unless the drives themselves are encrypted.


I think any such algo is a non-starter and not very useful.

If we learned anything from the past, it's that crypto works and can be deployed if it integrates without users noticing much and if the computational and speed overhead is small (or negative, as it often is in the HTTPS case).


Depends on the application and use.

McEliece ciphertexts are unusually small for post-quantum cryptography. The small ciphertext size can be much more important for total traffic than the large key size.

McEliece is relatively fast and has really fast implementations in hardware, so speed is less of a problem. It uses just binary vector operations, and encapsulation is a single pass.
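
To make "binary vector operations" concrete, here's a rough Python sketch of Niederreiter-style encapsulation, the form Classic McEliece uses. A random matrix stands in for the real public key, so this is illustrative only, not secure, and not constant-time:

    import numpy as np

    # Parameter sizes in the spirit of mceliece348864 (n = 3488, k = 2720, t = 64).
    n, k, t = 3488, 2720, 64
    H = np.random.randint(0, 2, size=(n - k, n))    # stand-in for the public parity-check matrix

    # Encapsulation: choose a secret weight-t error vector e and publish its syndrome.
    e = np.zeros(n, dtype=np.int64)
    e[np.random.choice(n, size=t, replace=False)] = 1
    ciphertext = (H @ e) % 2                        # one pass of GF(2) matrix-vector arithmetic

    print(len(ciphertext))   # 768 bits of syndrome -- tiny next to the megabit-scale key

The real scheme also hashes e into the session key and uses a hidden, structured Goppa code so that only the key holder can decode the syndrome back to e.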


Why is the computational and speed overhead sometimes negative for HTTPS?


There are performance features that are impractical to deploy unencrypted, so browsers only enable them on HTTPS. (Notably: HTTP/2 and Brotli.)

There are plenty of middlebox devices on the Internet that will drop your traffic if you send any HTTP over them that they don't understand. So you're kind of "locked in" to not deploying new protocol features. Encryption provides a way out of this.
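
You can see this directly, for example with the third-party httpx library (an assumption on my part; pip install "httpx[http2]"): the protocol you actually get is whatever ALPN negotiates inside the TLS handshake.

    import httpx

    # Whether you actually see "HTTP/2" depends on the server; example.com is just a placeholder.
    with httpx.Client(http2=True) as client:
        print(client.get("https://example.com/").http_version)  # often "HTTP/2", negotiated via ALPN
        print(client.get("http://example.com/").http_version)   # plain HTTP stays "HTTP/1.1"

Over plaintext there is no ALPN step, so a middlebox only ever sees HTTP/1.1 and nothing new can be slipped past it.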


The scary thing about this is that none of today's HTTPS-encrypted traffic is quantum-proof, meaning that if someone puts it into long-term storage, they'll be able to decrypt the contents at a later point in time, once quantum computers become available. By then the contents might still be valuable, or even have risen in worth.


No encryption is intended to be permanent. That's an all-but-impossible feat. Encryption exists to protect data for a meaningful length of time. A threat model extending far into the future is naïve.

The Caesar cipher worked until it didn't, the Enigma was secure until it was broken, and DES was good until it wasn't.


I think the parent is trying to highlight an attack vector that most people are unaware of. Most people are happy to see the green padlock and do not know that their browsing habits could be cracked in 30 years, when they run for Congress.


Yeah, that's the point I wanted to make. You probably don't want someone to reveal private sexting images you sent thirty years ago, before you ever thought about running for governor or similar. Content like that still has value decades down the line, but it's only protected by encryption that might become offline-attackable at some point in time. This affects browsing habits as well as end-to-end stuff like Snapchat, WhatsApp, Signal, ...


The Caesar cipher was based on obfuscation, not on computational complexity. It is not unreasonable to expect humanity to soon have enough knowledge to make unbreakable ciphers. We might very well already have such symmetric ciphers. If you count one-time pads, then we definitely do.


Shannon's theorem on perfect secrecy says that it is only attainable by ciphers whose keys are at least as long as the message, as with a one-time pad. We can get a lot smarter, but that theorem will stay.
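
For reference, the standard statement (Shannon, 1949), sketched in LaTeX:

    \text{perfect secrecy:}\quad \Pr[M=m \mid C=c] = \Pr[M=m] \quad \forall m, c
    \text{Shannon's bound:}\quad \text{perfect secrecy} \implies H(K) \ge H(M)

So a key materially shorter than the message cannot give perfect secrecy, no matter how clever the cipher.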


Physics gives us some tools to meaningfully talk about "unbreakable" without perfect secrecy. In particular, we can currently reason about "unbreakable on any classical computer operating at an ambient temperature no less than the cosmic microwave background and consuming no more energy than exists in the universe".

Physics still has some backdoors (notably reversible computing, which, as far as I am aware, has no fundamental limit), but the bigger problem is that complexity theory has not advanced to a point where we can make meaningful use of this. (The most I have ever seen is computing the lower bound on the energy needed to enumerate all keys.)

This is probably a solvable problem, but it is going to be harder than P vs. NP.
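
For what it's worth, that key-enumeration lower bound is the classic Landauer-limit back-of-the-envelope. Rough numbers, assuming one irreversible bit operation per key tried at the CMB temperature of about 2.7 K:

    E_{\text{bit}} = k_B T \ln 2 \approx (1.38\times10^{-23})(2.7)(0.69) \approx 2.6\times10^{-23}\ \text{J}
    E_{256} \approx 2^{256} \cdot 2.6\times10^{-23}\ \text{J} \approx 3\times10^{54}\ \text{J} \gg M_\odot c^2 \approx 1.8\times10^{47}\ \text{J}

i.e. merely counting through a 256-bit keyspace irreversibly costs far more than the Sun's entire mass-energy, which is exactly why reversible computing is the loophole.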


I was left with the impression that if "good" random number generators exist, then this is not a problem: you use your small key as a seed to make the big key. And we do not have a reason yet to doubt the existence of such RNGs. Did I get this wrong?

Edit: I think I was right, as the mathematical term "perfect secrecy" from the theorem is not particularly practical (lack of perfect secrecy does not imply the existence of a way to break the cipher). http://www.cs.miami.edu/home/burt/learning/Csc609.011/Perfec...
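
That "small seed, big key" idea is essentially what a stream cipher is. A minimal sketch in Python, using SHAKE-256 as the seed expander (illustrative only; there is no nonce handling, so never reuse a seed for two messages, and this is not a vetted construction):

    import hashlib

    def keystream(seed: bytes, length: int) -> bytes:
        # Expand a short secret seed into an arbitrarily long pseudorandom keystream.
        return hashlib.shake_256(seed).digest(length)

    def xor_with_keystream(seed: bytes, data: bytes) -> bytes:
        ks = keystream(seed, len(data))
        return bytes(a ^ b for a, b in zip(data, ks))

    ct = xor_with_keystream(b"a 32-byte secret seed...........", b"attack at dawn")
    pt = xor_with_keystream(b"a 32-byte secret seed...........", ct)   # XOR again to decrypt
    print(pt)   # b'attack at dawn'

If the expander is computationally indistinguishable from random, this is secure in the computational sense, but it is not Shannon-perfect: a 256-bit seed can only ever select 2^256 of the possible keystreams.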


We have no particularly good reason to believe in the existence of a good PRNG. Worse, we have no good reason to trust that any particular PRNG is good.


Good ol' symmetric crypto is not easily broken by quantum computers.

It is reasonable to think that we could or have already built a symmetric cipher that may never be broken.


Their effective key size is halved. Not a big deal, but if quantum computers reach near parity with classical ones, we will need to increase our key sizes.
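
Roughly: Grover's algorithm searches an n-bit keyspace in about the square root of the classical effort,

    \text{Grover queries} \approx \sqrt{2^{n}} = 2^{n/2}, \qquad \text{AES-128: } 2^{128} \to 2^{64}, \quad \text{AES-256: } 2^{256} \to 2^{128}

so doubling the key length restores the original security margin.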


Sure, that's what they were intended for. But it's not being used that way; it's used as if the encryption were permanent.


Unbreakable encryption is pretty easy. Just use a one-time pad.


No. Encryption with one-time pads is easy except for safely sharing the key, which means that in most situations they are completely impractical. Real-world encryption is a systems problem, not just a math problem.
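
The math side really is that easy; it's everything around it that isn't. A toy Python sketch (the pad must be truly random, as long as the message, used exactly once, and delivered to the other side securely, which is the impractical part):

    import os

    def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
        pad = os.urandom(len(message))                       # truly random, same length as the message
        ciphertext = bytes(m ^ p for m, p in zip(message, pad))
        return ciphertext, pad                               # the pad must reach the recipient out of band

    ct, pad = otp_encrypt(b"meet at midnight")
    pt = bytes(c ^ p for c, p in zip(ct, pad))               # recipient XORs with the same pad
    print(pt)   # b'meet at midnight'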


It is indeed prudent to work on post-quantum cryptography now, but no, there's absolutely no quantum computer factoring on the horizon. Any viable path to it, by itself, would be a breakthrough.


True, but as we've seen with RSA et al, algorithms once chosen persist for a long time. It would take time for winning algorithms to trickle down to the libraries, and then time for them to be included in the distros. So, better to get going on this stuff now!


> but no, there's absolutely no quantum computer factoring on the horizon

Recall the history of ECC. Elliptic-curve cryptography was proposed in the mid-1980s, and the NIST curves were standardized in the late 1990s. Unlike with early RSA, we already had a fairly solid understanding of its security properties by then. Even so, it took about 15 years to actually implement and deploy the NIST curves on the Internet, and roughly another 5 years to deploy a more robust alternative, Curve25519.

Also, there are enough reasons to believe that the NSA and other major agencies around the world have a storage capacity of several hundred exabytes. Signal, Tor, and OpenPGP traffic today is already being recorded and kept indefinitely, waiting for a usable quantum computer to decrypt it in the future.

BTW, here's a fun fact: although cryptographers are still trying, there is no known exploitable structure in most symmetric encryption algorithms (such as ChaCha20 or AES) or hash functions. The only known quantum attack is the generic one using Grover's algorithm. That means, if you are afraid the NSA may decrypt your messages in the future, you can always use 256-bit symmetric encryption with pre-shared keys. For example, WireGuard supports using pre-shared keys to encrypt ECC handshakes as a poor man's version of PQC.


> For example, WireGuard supports using pre-shared keys to encrypt ECC handshakes as a poor man's version of PQC.

If I understood the WireGuard paper correctly (https://www.wireguard.com/papers/wireguard.pdf), the pre-shared key is not used to encrypt the ECC handshake, or to encrypt anything at all; it's just mixed together with the output from the ECC handshake while deriving the data keys.
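
Right. A simplified sketch of that kind of mixing (HKDF-style, in Python; this is an illustration of the idea, not the actual Noise key schedule from the paper, and the labels are placeholders):

    import hashlib, hmac

    def mix(chaining_key: bytes, material: bytes) -> bytes:
        # One HKDF-extract step: fold new key material into the running chaining key.
        return hmac.new(chaining_key, material, hashlib.sha256).digest()

    def derive_data_key(ecdh_output: bytes, psk: bytes) -> bytes:
        ck = mix(b"initial-chaining-key", ecdh_output)   # placeholder label, not WireGuard's constant
        ck = mix(ck, psk)                                # pre-shared key mixed in, not used as a cipher key
        return mix(ck, b"data-key")                      # final derivation step (simplified)

Because the 256-bit psk is folded into the derivation, an attacker who later breaks the ECDH with a quantum computer still doesn't learn the data keys unless they also have the psk.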


That's why I pushed Merkle Trees back in the day when the topic came up. There's been work extending them in recent years, too.


Glad to see you again on HN; thanks for your earlier criticism of the security mitigation techniques.

Recently Merkle Trees have indeed made significant progress. XMSS is already standardized, has an RFC, and my Linux distro is already shipping OpenSSH with XMSS authentication enabled. But I don't think it's something practical that you want people to start using immediately.

First, the threat a quantum computer poses to digital signatures is not as serious as the threat to public-key encryption. If the NSA has intercepted my ECDH handshake, they can recover my session key in the future. But if, by 2045, the NSA has managed to recover my signing key, they cannot use it to log into my server or create false OpenPGP-signed statements: either the server has long been taken offline, or at least the key has already been retired. Unless it's a digital contract of paramount importance (excluding blockchains, do we really have digital contracts with a validity of more than 20 years?), I don't think it needs post-quantum capabilities today.

Second, XMSS and many other Merkle-tree-based systems have a serious limitation: statefulness. If the private key is not updated correctly after each signature and a state is ever reused, the key is exposed and anyone can forge signatures. That means starting to use XMSS today can create more hazards than not using it.
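
To picture the hazard, here's a toy stateful signer (a hypothetical sketch, not XMSS itself): every leaf index is a one-time key, and the counter has to be persisted before a signature ever leaves the machine. Restoring from a backup or running two replicas can silently reuse an index, which in a real scheme can let an attacker forge signatures.

    class StatefulSigner:
        def __init__(self, num_one_time_keys: int):
            self.next_index = 0
            self.max_index = num_one_time_keys

        def sign(self, message: bytes) -> tuple[int, bytes]:
            if self.next_index >= self.max_index:
                raise RuntimeError("key exhausted")
            index = self.next_index
            self.next_index += 1   # MUST be durably persisted before the signature is released
            signature = b"<one-time signature over message using leaf %d>" % index   # placeholder
            return index, signature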

I do see some reasons to standardize XMSS: (1) Unlike public-key encryption schemes, the security properties of Merkle-tree-based signature schemes are well known, with strong guarantees. (2) Standardizing it early lets people become familiar with the standardization process and eases future PQC standardization. (3) It allows people to implement and deploy it in experimental setups.

But I would really like to see the standardization of a stateless hash-based signature scheme, like djb's SPHINCS-256.

I'm curious about your takes on this topic.


"Glad to see you again on HN"

You as well. I bookmarked your rebuttal to dismissing NIST by default. I had one but yours is way better. I'll just drop that link from now on. :)

"XMSS is already standardized, has an RFC, and my Linux distro is already shipping OpenSSH with XMSS authentication enabled. "

I've been away from crypto for a while. I had no idea this all happened. Thanks for the tip!

"the threats to digital signature created by a quantum computer is not as serious as public-key encryption."

For public-key encryption, I was recommending using several schemes together, such as today's strongest plus NTRU and/or McEliece. At the least, one that's strong in the classical model plus one that's strong in the quantum model. This is for people concerned about governments, large corporations, and organized crime (including cartels). These are the groups most likely to be able to afford, and be willing to use, such codebreaking capabilities on their opponents. Damage can happen even a long time later. That's why the U.S. government defaults to around 40 years before declassification of TS data, to water down the effects of release.
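
The combining step itself is cheap. A hedged sketch in Python (the two shared secrets below are placeholders; in practice one would come from something like X25519 and the other from an NTRU or McEliece KEM, and you'd use a proper KDF with length framing):

    import hashlib

    def combine_shared_secrets(ss_classical: bytes, ss_postquantum: bytes) -> bytes:
        # The derived key stays secret as long as EITHER input does, so an attacker
        # must break the classical scheme AND the post-quantum scheme.
        return hashlib.sha3_256(b"hybrid-kdf|" + ss_classical + b"|" + ss_postquantum).digest()

    session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32)   # placeholder secrets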

For signatures, a lot has changed since I last looked at it. The Certificate Transparency schemes might be combined with Merkle schemes in interesting ways. There are also lots of roll-outs on Git/GitHub, which is versioned. People are using smart contracts and blockchains. Then there's lots of legacy tech with the CAs and the DNS system. I think any new ideas about how to develop these alternative techs and get them adopted should be designed to fit within what's going on. I'd probably have to think deeply about all that before I could give you fresh answers on signatures. I'll definitely write it up on these threads if I ever do. Right now, I'm still more focused on making medium-to-high assurance of design/implementation more cost-effective.


These competitions also take time. The SHA-3 competition was announced in 2006, and the final standard came out in 2015. By comparison, the PQC process was announced in 2016, and NIST expects the final results to be out in 2022-2024. We're likely a decade away from PQC being moderately widespread in implementations.


Fair point, but breaking crypto will require quantum computers with millions of qubits.

Currently I’m not convinced that we’ve built a decent general quantum computer with more than five. Long way to go.


Where does that estimate come from? My (basic) understanding is that, in order to factor a composite N, Shor's algorithm requires a number of qubits (call it q) such that N^2 <= 2^q <= 2N^2. Right now RSA is at most 4096 bits, so q = 8192 could be enough.


For building a quantum computer, a distinction is made between logical qubits and physical qubits. The physical qubits correspond 1-to-1 with the underlying physical medium, i.e. each electron spin makes one physical qubit. The physical qubits have a lot of noise, so they can't be used directly for computation. Instead, you do error correction across many (three to millions or more) physical qubits to make one logical qubit that is effectively noise-free. These logical qubits are then used to do the actual computation.

Your estimate of q = 8192 is for logical qubits. The number of physical qubits will realistically run into the millions, unless some really fundamental breakthroughs are made.
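
Rough arithmetic, assuming the commonly cited surface-code overhead of roughly 10^3 to 10^4 physical qubits per logical qubit (the exact figure depends on gate error rates and code distance):

    8192 \text{ logical} \times 10^{3}\text{--}10^{4} \text{ physical per logical} \approx 10^{7} \text{ physical qubits}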


If you'd like a specific, authoritative citation confirming what the other responder is saying, read the National Academies 2019 report on QC.


> breaking crypto will require quantum computers with millions of qubits

I don’t think this is true. It is estimated that breaking 2048-bit RSA only requires about 4,000 logical qubits (though the number of physical qubits would be much higher).

Ref: https://security.stackexchange.com/questions/87345/how-many-...


It is true; the parent is talking about physical qubits. By the most recent 2019 estimate (specifically the National Academies report, which was on HN recently), we'll need to increase our current qubit counts by a factor of at least 10^5. That equates to millions of qubits, because we've currently achieved tens of qubits.


Reminder that NIST should not be trusted, given their history with regard to standards around vulnerable crypto, as well as the potential for manipulation by "higher level" .gov organizations:

https://projectbullrun.org/dual-ec/vulnerability.html


> Reminder that NIST should not be trusted, given their history with regard to standards around vulnerable crypto, as well as the potential for manipulation by "higher level" .gov organizations:

And you also need to remember that NIST seems to have known about "differential cryptanalysis" before everybody else and actually made DES more resistant to it.

"Trust, but verify" should always be the rule.


NIST didn't do that. The NSA did.

NIST's predecessor submitted the DES design to the NSA for feedback, and the NSA proposed S-box changes without explaining why.

It turns out those changes made DES more resistant to differential cryptanalysis.


Or, "Verify so you can trust".


As my favorite RFC 761 says: "be conservative in what you do, be liberal in what you accept from others."


Blindly distrusting the NIST is as harmful as blindly trusting the NIST.

Once upon a time, NIST arranged a public, open competition for a symmetric encryption algorithm and received submissions from multiple teams of cryptographers, including Bruce Schneier's (Twofish), Ron Rivest's (RC6), and IBM's (MARS). One scheme was submitted by two cryptographers from Belgium, Vincent Rijmen and Joan Daemen. It survived multiple rounds of review by cryptographers inside and outside NIST, and Rijndael became AES. Although it suffers from minor imperfections, such as timing hazards and a related-key issue in AES-256, it's universally recognized as the solid standard for symmetric encryption.

Another time, NIST arranged another public competition and received several submissions from multiple teams of cryptographers. The scheme proposed by the Keccak team ultimately became SHA-3. It doesn't use the classic Merkle–Damgård construction, it's immune to length-extension attacks, and it's arguably more secure than SHA-2, which was designed by the NSA. Yet another time, NIST arranged another public, open competition, and it gave us Argon2, currently considered one of the best password hashing schemes against GPU attacks.

And yet another time, NIST didn't arrange any competition; it gave us Dual_EC, and the parameters for the NIST curves. But their problems were quickly identified by the public as well. When the NIST curves were first announced, they received immediate criticism on Usenet. The Dual_EC case was more obvious: even before it was formally published in 2007, the possibility of an asymmetric backdoor had already been shown by researchers, yet, despite the public criticism, NIST still standardized it. The Snowden documents simply confirmed the widespread suspicion.

But when it comes to open competitions, NIST primarily provides a public platform for researchers, so the U.S. government can learn what is secure by receiving complete, free cryptanalysis from the best cryptographers. Unless you believe in a conspiratorial view of history in which all the prominent cryptographers around the world are under the secret control of the NSA, there is no reason to believe the results of these open competitions, under the scrutiny of public review, are compromised. Yes, it's possible that NIST will manipulate the final result during the standardization process, but if that happens, you are free to use the non-standard version.

For example, the standard SHA-3 has parameters different from the Keccak team's original submission. NIST said it was a tradeoff between acceptable security and performance. Although there is zero evidence of wrongdoing, if you have reason to believe the standard parameters are backdoored, feel free to use the original Keccak function, or you can just use BLAKE.

My point is: yes, the final standard may be backdoored; yes, the NSA may manipulate the industry into adopting problematic standards; and yes, sometimes there are reasons to distrust standards and their commercial products. But no, there is no reason to believe that the studies and findings from the public competition itself are harmful or backdoored. The raw research produced during the competition is valid and valuable.

And there is even less reason to worry about the PQC competition. Currently, our understanding of post-quantum cryptography is rather limited. Some schemes are well understood and people have strong confidence in them, such as classic McEliece or hash-based signatures, but their performance is low and key sizes are huge, making them unsuitable for many applications, so people are searching for more efficient alternatives. Other algorithms are proposed and broken within months; around 20 submissions from the first round have already been broken.

I think even NIST cannot be certain about the security right now... So it's still in the research stage. We can start worrying about backdoors in 2022, but probably not now.


Good points; just a small correction: the Password Hashing Competition was arranged not by NIST but by Jean-Philippe Aumasson (though we had a cryptographer from NIST on the panel).


Thanks for your correction.

I think it may be a viable alternative in the future. If everyone's having problems with NIST, perhaps we can have more non-governmental competitions organized by academia and industry, just like the Password Hashing Competition. I believe the CAESAR competition is also non-governmental, and it has produced great results as well.


Simply confirmed the widespread suspicion? That is an interesting historical retcon. It brings to mind room 641A being "widespread suspicion". It is the difference between theoretical and practical.

The NIST is a known danger - they have earned their reflexive blind distrust. I wouldn't even trust their relative rankings.


> It is the difference between theoretical and practical.

Yes, there is a difference. It's always possible that the telephone traffic is being intercepted by some entities, like Room 641A.

On the other hand, a proper random number generator or public-key encryption algorithm shouldn't even allow the theoretical possibility of a backdoor to begin with. When cryptographers publish papers giving strong evidence that an algorithm is designed so that it could theoretically contain a backdoor, you know something is seriously wrong.

> they have earned their reflexive blind distrust.

Of course, after Snowden presented his evidence, everybody now understands that anything designed so that it can be backdoored will be backdoored. If NIST proposes a new standard, it will definitely be taken with a grain of salt.

I think I've made my point clear:

> Yes, the final standard may be backdoored; yes, the NSA may manipulate the industry into adopting problematic standards; and yes, sometimes there are reasons to distrust standards and their commercial products. But no, there is no reason to believe that the studies and findings from the public competition itself are harmful or backdoored. The raw research produced during the competition is valid and valuable.

The vast majority of the research takes place OUTSIDE of NIST. Until the late stages, NIST is merely a forum for everyone, including NIST's own people, to submit their comments and attacks. Once we reach the final round, NIST will pick a winner and standardize it. And perhaps add a backdoor.

The point is, if something looks fishy, you don't have to use the one NIST standardized. You can simply choose another finalist, or even create your own based on the knowledge of secure constructions revealed during the competition. Normally, by that point, the competition has already produced enough information about what is secure and what is not.

To summarize, NIST standards may be bad, but the knowledge obtained during the research for the competition is always good.

In 2022, if NIST starts standardizing something, your comment will make perfect sense. But it's largely off-topic at the current moment.




