
NIST Reveals 26 Algorithms Advancing to the Post-Quantum Crypto ‘Semifinals’ - pseudolus
https://www.nist.gov/news-events/news/2019/01/nist-reveals-26-algorithms-advancing-post-quantum-crypto-semifinals
======
sigil
This talk by djb and Tanja Lange is a good overview of the NIST candidates and
the current state of pqcrypto:
[https://www.youtube.com/watch?v=ZCmnQR3_qWg](https://www.youtube.com/watch?v=ZCmnQR3_qWg)

~~~
DyslexicAtheist
this is a must-watch. It puts the self-congratulatory, hyperbolic, and
optimistic wording of the article headline into much-needed perspective.

~~~
plaidfuji
I’m sorry, what about this headline is “hyperbolic” or “self-congratulatory”?
There are 26 algorithms, and they are advancing to another round of
consideration. Or are you referring to the claim of “post-quantum”?

~~~
DyslexicAtheist
I was not referring to PQ but to the headline and content of the article which
is written like they achieved something and now deserve a pat on their
shoulder ...

Phrasing like "NIST _reveals_" doesn't quite reveal the huge actual shit
they're in and their failure to make progress on this. It might even suggest
that having such a huge number (26) is in any way positive (instead of
admitting to their incompetence w.r.t. narrowing the number to a manageable few).

Seriously watch DJB/TL's video for some background on what is actually
happening.

------
pseudolus
The names of the 26 candidates can be found at:

[https://csrc.nist.gov/news/2019/pqc-standardization-process-2nd-round-candidates](https://csrc.nist.gov/news/2019/pqc-standardization-process-2nd-round-candidates)

~~~
archgoon
"For the 2nd round candidates, NIST will allow the submission teams the option
of providing updated specifications and implementations (i.e. “tweaks”). The
deadline for these tweaks will be March 15, 2019. We originally planned that
submission teams would have more time, however recent events out of our
control have altered the timeline."

Anyone know what "recent events" refers to?

~~~
zodiac
Maybe the federal shutdown?

[https://www.secureworldexpo.com/industry-news/nist-government-shutdown](https://www.secureworldexpo.com/industry-news/nist-government-shutdown)

~~~
archgoon
So the original plan was to announce this earlier then?

~~~
iancarroll
I believe they were supposed to be announced at RWC in early January.
[https://quantumcomputingreport.com/news/nist-to-announce-round-2-pqc-candidates-on-january-10-2019/](https://quantumcomputingreport.com/news/nist-to-announce-round-2-pqc-candidates-on-january-10-2019/)

------
nabla9
Classic McEliece is 40 years old and still kicking.

The problem is the key size: 512 kb for normal use, 8.5 Mb against quantum
computers. On the other hand, 1 MB is not that much if you really want the
security.

~~~
hannob
I think any such algo is a non-starter and not very useful.

If we learned anything from the past, it's that crypto works and gets deployed
when it integrates without users noticing much and when the computational and
speed overhead is small (or negative, as it often is in the HTTPS case).

~~~
graedus
Why is the computational and speed overhead sometimes negative for HTTPS?

~~~
hannob
There are performance features that are impractical to deploy unencrypted, so
browsers only enable them on HTTPS. (Notably: HTTP/2 and Brotli.)

There are plenty of middlebox devices on the Internet that will drop your
traffic if you send anything over HTTP that they don't understand, so you're
kinda "locked into" not deploying new protocol features. Encryption provides a
way out of this.

------
est31
The scary thing about this is that none of today's HTTPS-encrypted traffic is
quantum-proof, meaning that anyone who puts it into long-term storage will be
able to decrypt the contents at a later point in time, if quantum computers
become available. By then the contents might still be valuable, or might even
have risen in worth.

~~~
Fej
No encryption is intended to be permanent. That's an all-but-impossible feat.
Encryption exists to protect data for a meaningful length of time. A threat
model extending far into the future is naïve.

The Caesar cipher worked until it didn't, the Enigma was secure until it was
broken, and DES was good until it wasn't.

~~~
pradn
I think the parent is trying to highlight an attack vector that most people
are unaware of. Most people are happy to see the green lock icon and do not
know that it's possible for their browsing habits to be cracked in 30 years,
when they run for Congress.

~~~
est31
Yeah, that's the point I wanted to make. You probably don't want someone
revealing the private sexting images you sent thirty years ago, before you
even thought about running for governor or similar. Content like that still
has value decades down the line, but it's only protected with encryption that
might become offline-attackable at some point in time. This affects browsing
habits as well as end-to-end stuff like Snapchat, WhatsApp, Signal, ...

------
sanxiyn
It is indeed prudent to work on post-quantum cryptography now, but no, there's
absolutely no quantum computer factoring on the horizon. Any viable path to
it, by itself, would be a breakthrough.

~~~
CaliforniaKarl
True, but as we've seen with RSA et al, algorithms once chosen persist for a
long time. It would take time for winning algorithms to trickle down to the
libraries, and then time for them to be included in the distros. So, better to
get going on this stuff now!

~~~
segfaultbuserr
> but no, there's absolutely no quantum computer factoring on the horizon

Recall the history of ECC. ECC was proposed in the late 1990s. Unlike RSA, we
already had a solid understanding of its security properties at that time.
Yet it still took 15 years to actually implement and deploy the NIST curves on
the Internet, and another 5 years after that to deploy a more robust version,
Curve25519.

Also, there are enough reasons to believe that the NSA and other major
agencies around the world have a storage capacity of several hundreds of
exabytes. Signal traffic, Tor traffic or OpenPGP traffic today is already
being recorded and kept indefinitely, waiting for a usable quantum computer to
decrypt them in the future.

BTW, here's a fun fact: although cryptographers are still trying, there is no
known exploitable structure in most symmetric encryption algorithms (such as
ChaCha20 or AES) or hash functions. The only known quantum attack is the
generic one using Grover's algorithm, which at most halves the effective key
length. That means if you are afraid the NSA may decrypt your messages in the
future, you can always use 256-bit symmetric encryption with pre-shared keys.
For example, WireGuard supports using pre-shared keys on top of its ECC
handshake as a poor man's version of PQC.
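The two ideas above can be sketched with nothing but the standard library.
This is a toy illustration, not WireGuard's actual construction: the function
names, the HMAC-as-KDF combiner, and the stand-in random secrets are all made
up for the example.

```python
import hashlib
import hmac
import secrets

def grover_effective_bits(key_bits: int) -> int:
    """Grover's algorithm gives a quadratic brute-force speedup,
    roughly halving the effective security of a symmetric key."""
    return key_bits // 2

# 128-bit keys drop to ~64-bit quantum security; 256-bit keys keep ~128.
print(grover_effective_bits(128))  # 64
print(grover_effective_bits(256))  # 128

def derive_session_key(ecdh_secret: bytes, pre_shared_key: bytes) -> bytes:
    """Mix a 256-bit pre-shared key into the session key (HMAC as a KDF).
    Even if the ECDH secret is recovered by a future quantum computer,
    the session key stays secret as long as the PSK does."""
    return hmac.new(pre_shared_key, ecdh_secret, hashlib.sha256).digest()

psk = secrets.token_bytes(32)          # distributed out of band
ecdh_secret = secrets.token_bytes(32)  # stand-in for a real ECDH output
session_key = derive_session_key(ecdh_secret, psk)
assert len(session_key) == 32
```

The point of the combiner is that an attacker must break both the ECDH
exchange and the out-of-band PSK to recover the session key.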

~~~
nickpsecurity
That's why I pushed Merkle Trees back in the day when the topic came up.
There's been work extending them in recent years, too.
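The core Merkle-tree trick behind these schemes (compressing many one-time
public keys into a single root, then proving membership with a short
authentication path) can be sketched in a few lines. This is a toy
illustration with made-up leaf values, not any standardized construction:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a complete binary Merkle tree (leaf count a power of two)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Sibling hashes needed to recompute the root from leaf `index`."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])  # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf, index, path, root):
    node = h(leaf)
    for sibling in path:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

# Four one-time public keys compressed into a single root; a signature
# would ship one leaf plus its (log n)-sized authentication path.
keys = [b"otk0", b"otk1", b"otk2", b"otk3"]
root = merkle_root(keys)
assert verify(keys[2], 2, auth_path(keys, 2), root)
```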

~~~
segfaultbuserr
Glad to see you again on HN, and thanks for your earlier criticism of the
security mitigation techniques.

Merkle tree-based signatures have indeed made significant progress recently.
XMSS is already standardized, has an RFC, and my Linux distro already ships
OpenSSH with XMSS authentication enabled. But I don't think it's something
practical that you'd want people to start using immediately.

First, the threat a quantum computer poses to digital signatures is not as
serious as the threat to public-key encryption. If the NSA has intercepted my
ECDH handshake, they can recover my session key in the future. But if, by 2045,
the NSA has managed to forge my signing key, they cannot use it to log into my
server or to create false OpenPGP-signed statements: either the server has
long been taken offline, or at least the key has already been retired. Unless
it's a digital contract of paramount importance (excluding blockchains, do we
really have digital contracts with a validity of more than 20 years?), I don't
think it needs post-quantum capabilities today.

Second, XMSS and many other Merkle tree-based systems have a serious
limitation: statefulness. If the private key is not updated correctly after
each signature and a state is ever reused, the one-time keys involved are
exposed and anyone can forge signatures. That means starting to use XMSS today
can create more hazards than not using it.
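To see why state reuse is fatal, here is a toy Lamport one-time signature
(deliberately tiny, insecure parameters, purely for illustration). Signing two
different messages with the same key reveals secret preimages at every bit
position where the digests differ, which is exactly the raw material an
attacker needs to start forging:

```python
import hashlib
import secrets

BITS = 16  # toy parameter; real schemes sign a full 256-bit digest

def keygen():
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = int.from_bytes(hashlib.sha256(msg).digest(), "big")
    return [(digest >> i) & 1 for i in range(BITS)]

def sign(sk, msg: bytes):
    # Reveal one secret preimage per message bit.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig):
    return all(hashlib.sha256(sig[i]).digest() == pk[i][bit]
               for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig1 = sign(sk, b"message one")   # fine: one key, one signature
sig2 = sign(sk, b"message two")   # state reuse: same key signed twice

# The attacker now holds preimages for BOTH values of every bit position
# where the two digests differ, and can mix-and-match the two signatures
# to cover fresh messages at those positions.
revealed = sum(1 for b1, b2 in zip(msg_bits(b"message one"),
                                   msg_bits(b"message two")) if b1 != b2)
print(f"{revealed} of {BITS} secret-key pairs fully exposed")
```

XMSS avoids this only by tracking which one-time key was last used, which is
precisely the state that must never be rolled back or cloned.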

I do see some reasons to standardize XMSS: (1) unlike public-key encryption
schemes, the security properties of Merkle tree-based signature schemes are
well understood, with strong guarantees; (2) standardizing it early lets
people become familiar with the standardization process, easing the future
standardization of more PQC; and (3) it allows people to implement and deploy
it in experimental setups.

But I'd really like to see the standardization of a stateless hash-based
signature scheme, like djb's SPHINCS-256.

I'm curious about your takes on this topic.

~~~
nickpsecurity
"Glad to see you again on HN"

You as well. I bookmarked your rebuttal to dismissing NIST by default. I had
one but yours is way better. I'll just drop that link from now on. :)

"XMSS is already standardized, has an RFC, and my Linux distro is already
shipping OpenSSH with XMSS authentication enabled. "

I've been away from crypto for a while. I had no idea this all happened.
Thanks for the tip!

"the threats to digital signature created by a quantum computer is not as
serious as public-key encryption."

For public-key encryption, I was recommending using several schemes together,
such as today's strongest plus NTRU and/or McEliece: at the least, one that's
strong in the classical model plus one that's strong in the quantum model.
This is for people concerned about governments, large corporations, and
organized crime (including cartels). These are the groups most likely to be
able to afford, and willing to use, such codebreaking capabilities against
their opponents. Damage can happen even a long time later. That's why the U.S.
government defaults to around 40 years before declassification of TS data, to
water down the effects of release.

For signatures, a lot has changed since I last looked at it. The Certificate
Transparency schemes might be combined with Merkle schemes in interesting
ways. There are also lots of rollouts on Git/GitHub, which is versioned.
People are using smart contracts and blockchains. Then there's lots of legacy
tech in the CAs and the DNS system. I think any new ideas about how to develop
these alternative techs and get them adopted should be designed to fit within
what's already going on. I'd have to think deeply about all that before I
could give you fresh answers on signatures; I'll definitely write it up in
these threads if I ever do. Right now, I'm still more focused on making
medium-to-high assurance of design/implementation more cost-effective.

------
maximente
reminder that NIST should not be trusted, given their history with regard to
standards around vulnerable crypto as well as the potential for manipulation
by "higher level" .gov organizations:

[https://projectbullrun.org/dual-ec/vulnerability.html](https://projectbullrun.org/dual-ec/vulnerability.html)

~~~
bsder
> reminder that NIST should not be trusted given their history with regards to
> standards around vulnerable crypto as well as potential for manipulation due
> to "higher level" .gov organizations:

And, you also need to remember that NIST seems to have known about
"differential cryptanalysis" before everybody else and made DES actually
_more_ resistant to it.

"Trust, but verify" should _always_ be the rule.

~~~
admax88q
NIST didn't do that. The NSA did.

NIST (then NBS) submitted the DES candidate to the NSA for feedback, and the
NSA proposed S-box changes without explaining why.

Turns out those changes made DES more resistant to differential cryptanalysis.

------
dlgeek
Dupe:
[https://news.ycombinator.com/item?id=19037286](https://news.ycombinator.com/item?id=19037286)

