
NIST’s Post-Quantum Cryptography Program Enters ‘Selection Round’ - xoa
https://www.nist.gov/news-events/news/2020/07/nists-post-quantum-cryptography-program-enters-selection-round
======
dependenttypes
Here is the documentation:
[https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf](https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf)

All 15 of the following are moving to the 3rd round.

Third-Round Finalists:

Public-Key Encryption/KEMs

\- Classic McEliece (Code)

\- CRYSTALS-KYBER (Lattice)

\- NTRU (Lattice)

\- SABER (Lattice)

Digital Signatures

\- CRYSTALS-DILITHIUM (Lattice)

\- FALCON (Lattice)

\- Rainbow (Multivariate)

Alternate Candidates:

Public-Key Encryption/KEMs

\- BIKE (Code)

\- FrodoKEM (Lattice)

\- HQC (Code)

\- NTRU Prime (Lattice)

\- SIKE (Supersingular Elliptic Curve Isogeny)

Digital Signatures:

\- GeMSS (Multivariate)

\- Picnic (Zero-knowledge)

\- SPHINCS+ (Hash)

All of Bernstein's submissions (except the joke one) are on either the
finalist or the alternate list.

A note regarding the alternatives:

> The alternate candidates are regarded as potential candidates for future
> standardization, most likely after another round of evaluation. Some of the
> alternate candidates have worse performance than the finalists but might be
> selected for standardization based on NIST’s high confidence in their
> security. Others have acceptable performance but require additional analysis
> or other work to inspire sufficient confidence in their security for NIST to
> standardize. In addition, some alternate candidates were selected based
> either on NIST’s desire for diversity in future post-quantum security
> standards or on their potential for further improvement.

I am quite a big fan of SPHINCS+, Picnic (these two reduce their security to
that of their underlying hash functions), and Classic McEliece myself. SIKE
is interesting too, but as far as I know it needs to be used interactively.
Rainbow is also interesting.

~~~
hackcasual
The problem with classic mceliece is the size of the public keys, which is
measured in megabytes

~~~
sgillen
Well, at least people are used to downloading megabytes of junk when they
browse the internet these days anyway...

~~~
hackcasual
We also want crypto systems that work on embedded/constrained environments

~~~
dependenttypes
And that's fine. These systems can use NTRU or whatever else. These systems
being weak is not a reason to drag everyone else down.

------
nabla9
The 40-year-old Classic McEliece made it into the third round. It's well
proven but not suited to all uses due to its large public key size.

> Classic McEliece has a stable specification—the only significant change in
> the second round is the addition of additional parameter sets. As such, NIST
> selected Classic McEliece as a finalist and believes it could be ready for
> standardization (should NIST choose to select it) at the end of the third
> round

~~~
badrabbit
I have heard that too, but is it really a big deal to have 65KB+ keys these
days?

~~~
timerol
As someone who works on embedded devices, yes. The whole system can have 256KB
of flash or less. 1 or 2KB for keys hurts, but is manageable. 65KB means that
much of the IoT will not be upgraded to post-quantum security.

~~~
dependenttypes
IoT devices should be behind private networks rather than being directly
accessible to the internet anyway. Said devices can use a "weaker" algorithm
like NTRU.

~~~
bsder
> IoT devices should be behind private networks rather than being directly
> accessible to the internet anyway.

Hogwash. You just reduced my reliability by a dramatic amount. Not only does
my IoT device have to be functional, but it now has to have a _gateway_ that
is functional _at the same time_ in order to be useful. Uh, yeah, no.

And, besides, a private network is only private until one of the devices gets
compromised. Then it's not private anymore.

Better to make your device capable of living on the real, hostile Internet.

~~~
dependenttypes
> Not only does my IoT device have to be functional, but it now has to have a
> gateway that is functional at the same time in order to be useful

Consider it differently. Your IoT stove, IoT coffee maker, IoT washing
machine, IoT lights, etc do not need to be secure if they are behind a secure
gateway.

> And, besides, a private network is only private until one of the devices
> gets compromised. Then it's not private anymore.

The gateway makes sure that nobody but you will be able to access (and thus
compromise) your IoT freezer.

> Better to make your device capable of living on the real, hostile Internet.

Sure, but considering the number of IoT botnets out there, I do not think
that this is going to happen.

------
dependenttypes
It should be noted that Bernstein was complaining about the lack of
transparency and NSA involvement in
[https://twitter.com/hashbreaker/status/1285922808392908800](https://twitter.com/hashbreaker/status/1285922808392908800)

~~~
john_alan
Of course, NIST compliance is a circus.

I use it as a list of what not to use.

Once a senior cryptographer in Citi's R&D lab was grilling me on my use of
25519 without NIST's blessing. I asked him what he would use; he said,
"cryptographically speaking, 25519". Ok.jpg.

~~~
dependenttypes
> I use it as a list of what not to use.

Bernstein's own submissions (he's the one who made 25519) passed to round 3,
though. Do you also avoid AES/SHA3/BLAKE because they were NIST finalists?

------
BearsAreCool
One thing that bothers me about most of these algorithms is how conceptually
difficult they are for me to understand. I don't have that strong of a
background in cryptography but there is something special about how with RSA
you can do it at a small scale by hand. With very limited notes I can explain
to a group of high schoolers almost exactly how this vital part of encryption
functions and even vulnerabilities in it (small exponent values, etc). While
rolling your own crypto is bad, with a bit of work I'd expect almost any
programmer to be able to implement RSA without too much difficulty.
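The by-hand scale described above really is tiny. A sketch in Python, using
the classic textbook primes p=61, q=53 (teaching only; no padding, so this
is insecure in practice, exactly as the "rolling your own crypto is bad"
caveat warns):

```python
# Textbook RSA at hand-calculable scale -- teaching only, insecure in
# practice (toy primes, no padding).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent via modular inverse: 2753

m = 65                     # "message", encoded as a number < n
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
```

Every step here is a single modular exponentiation, which is why it works as
a pencil-and-paper exercise.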

I'm not sure how important being able to explain encryption algorithms in
detail to high school students is, but I'll definitely be sad when RSA and
other non-quantum-resistant algorithms join the likes of the Enigma machine
and exist for demonstration purposes only. I'm hopeful that in time less
crypto-minded people (like me) will better understand module learning with
errors or lattice-based cryptography, and maybe we'll get better at
explaining it. It would be a shame if even more of cryptography turned into
an unintelligible black box to eke out performance gains.

~~~
SAI_Peregrinus
RSA isn't that easy to understand. It's a trap: it seems simple on the surface
because the core mathematical operation is easy, but the actual complexity
(padding) often gets omitted.

RSA-KEM (key encapsulation mode) is easy. There's no padding, the only thing
is you can't encrypt a message, only a _random_ number < the modulus. Then you
run that number through a Key Derivation Function (KDF) and use it with an
Authenticated Encryption with Associated Data (AEAD) cipher. That's just simple
enough I'd teach it to high-school students.
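That flow -- encrypt a random number, derive a key from it, hand the key to
an AEAD -- can be sketched in a few lines of Python. Toy modulus and SHA-256
standing in as the KDF are my own simplifications; a real RSA-KEM uses a
~2048-bit modulus and a standardized KDF:

```python
import hashlib
import secrets

# Toy RSA-KEM sketch -- illustration only. Real deployments use large
# moduli and a standardized KDF; SHA-256 stands in for the KDF here.
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

# Encapsulate: pick a random r < n, send only r^e mod n.
r = secrets.randbelow(n - 2) + 2
ciphertext = pow(r, e, n)

# The sender derives the symmetric key from r; that key then feeds an
# AEAD cipher (e.g. AES-GCM) for the actual message.
key_sender = hashlib.sha256(r.to_bytes(2, "big")).digest()

# Decapsulate: the receiver recovers r with the private key and derives
# the same symmetric key.
r_recv = pow(ciphertext, d, n)
key_receiver = hashlib.sha256(r_recv.to_bytes(2, "big")).digest()
assert key_sender == key_receiver
```

Note there is no padding anywhere: because r is random rather than a chosen
message, the bare RSA operation is enough.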

RSA encryption is hard. It requires OAEP padding, which is by no means simple.
I'd not try to teach it to high-school students.

RSA signing is hard. It requires PSS padding, which is by no means simple. I'd
also not try to teach it to high-school students.

Every other use of RSA is insecure, often in very subtle and hard to
understand ways. Using RSA as a teaching tool about public-key cryptography
does a disservice to students, since they (like you) tend to think that RSA
alone is useful for cryptography.

And if you can treat an operation like the padding as a "black box" and ignore
understanding, then you can understand code-based cryptography the same way:
treat the code as a black box, and the math is very simple.

~~~
tialaramex
I sympathize but don't entirely agree.

The old version 1.5 signature padding remains widespread. Unlike PSS it
doesn't have a security proof reducing to RSA, but it does have decades of
successful use in practice in one of the harshest environments (the Web,
where clients merrily run code written by a potential adversary).

"Prepend this fixed data to your hash" which is the central idea of v1.5
padding is definitely something you could teach to high school students. You
can even show them why it's necessary pretty easily for a small exponent.

Making people _check_ padding is again not too hard for high school students,
and I think "Do all the things on the checklist. _All_ of them" is a
worthwhile lesson not just in cryptography.
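The "prepend this fixed data" structure is concrete enough to show directly.
A Python sketch of the EMSA-PKCS1-v1_5 encoding for SHA-256 with a 2048-bit
modulus (the signing operation itself is omitted):

```python
import hashlib

# PKCS#1 v1.5 signature padding: a fixed header is prepended to the hash
# so the padded block fills the whole modulus. The DER prefix below is the
# standard DigestInfo identifying SHA-256.
SHA256_DIGESTINFO = bytes.fromhex("3031300d060960864801650304020105000420")

def pkcs1_v15_pad(msg: bytes, mod_len: int = 256) -> bytes:
    digest = hashlib.sha256(msg).digest()
    t = SHA256_DIGESTINFO + digest
    ps = b"\xff" * (mod_len - len(t) - 3)   # filler, at least 8 bytes
    return b"\x00\x01" + ps + b"\x00" + t   # 0x00 0x01 FF..FF 0x00 || t

padded = pkcs1_v15_pad(b"hello")
assert len(padded) == 256
assert padded.startswith(b"\x00\x01\xff")
```

A verifier must recompute this exact block and compare all of it against the
decrypted signature; skipping that comparison is the classic checklist
failure.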

~~~
SAI_Peregrinus
It _is_ possible to implement 1.5 padding securely. But I'd not say it's easy,
since I'm not aware of any implementation not written by an experienced
cryptographer/cryptographic programmer that's gotten it right on the first try
(no padding oracles or other vulnerabilities). It's conceptually simple, but
still has plenty of footguns. The problem with RSA is all the footguns!
Avoiding them reduces the conceptual simplicity, to the point where you might
as well teach a slightly more complex scheme, like EdDSA or McEliece (with
black-box code).

~~~
tialaramex
The most common mistake in implementations of RSA signature verification seems
to be just plain not validating the padding at all. That does not require an
"experienced cryptographer"; it requires actually doing everything on the
checklist, a reflex that's worthwhile in many pursuits.

Historically sometimes people would say "Well nobody would make an error like
that in a modern elliptic curve scheme" and then Microsoft turns out to have
shipped a very popular operating system named "Windows" which didn't do curve
validation - so much for that belief.

Padding oracles aren't a thing for signature verification. You can work this
out for yourself, everybody can do signature verification using the public
key, so if there was a way to do it "badly" that somehow gives away the
private key, you'd do it that way yourself and skip all the effort.

What _does_ exist and might have confused you is an oracle for RSA PKCS#1 v1.5
_decryption_. In this case the party doing decryption knows the private key
and so it makes sense that if this is done poorly they can leak vital
information. It doesn't seem fair to call this a flaw in the signature scheme.

~~~
dependenttypes
> Historically sometimes people would say "Well nobody would make an error
> like that in a modern elliptic curve scheme" and then Microsoft turns out to
> have shipped a very popular operating system named "Windows" which didn't do
> curve validation - so much for that belief.

I am pretty sure that CryptoAPI does not support (or at least did not at the
time) any modern elliptic curve signature scheme, and by modern I am referring
to things like ed25519.

------
dependenttypes
DJB seems to have a disagreement regarding some statements that NIST made
[https://twitter.com/hashbreaker/status/1288039119805747200](https://twitter.com/hashbreaker/status/1288039119805747200)

"Misrepresentations of security proofs are starting to cause serious damage.
The embarrassingly wrong idea that there's a proof that the "security of
NewHope is never better than that of KYBER" is the centerpiece of NIST
removing NewHope from #NISTPQC. See
[https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf](https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf)

Known hybrid attacks are faster against Kyber than against NewHope. (Kyber
missed this because its security analysis was oversimplified; I'm writing a
paper on this.) More to the point, it's clear that there isn't and won't be a
proof that Kyber is at least as secure as NewHope.

Cryptographers who condone exaggerations of what has been proven share
responsibility for any resulting security failures. We cannot simply ignore
the increased influence of dangerously oversimplified "provable security"
claims upon standards and deployment. This is not a game."

------
upofadown
This all seems premature. If someone manages to invent a computer that works
on atomic effects then there will be a lot of development of new algorithms
based on whatever that computer can do.

There is a very good chance that regular classical computing would be greatly
sped up by such an invention. Just the ability to do things at atomic scale
would be a huge breakthrough.

~~~
api
It's entirely possible that QC will arrive but will be more of a special
purpose accelerator for certain algorithms (think of the way custom ASICs can
accelerate specific operations) rather than a general purpose thing. Some form
of general purpose QC may happen eventually but that might take quite a bit
longer, and it may take longer still for it to be commonplace.

In the meantime there will be a strong need for classically computable
cryptographic algorithms that are strong against attack by any known quantum
computable algorithm.

Symmetric encryption (AES, ChaCha) and hashing (SHA, Blake2) are _mostly_ safe
as long as at least 256 bit keys are used and the algorithm doesn't have
weaknesses that QC might open wider. Asymmetric encryption based on
conventional Diffie-Hellman, RSA, or elliptic curve methods is absolutely not
safe. There are known quantum algorithms that _on paper_ can crack them in
reasonable amounts of time if you have the right type and size of quantum
computer, and at this point we have reason to believe that such a quantum
computer can _probably_ be built.
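The 256-bit recommendation comes from Grover's algorithm, which gives a
quadratic speedup on brute-force key search. The arithmetic is a one-liner:

```python
# Grover's algorithm searches N possibilities in roughly sqrt(N) steps, so
# an n-bit symmetric key retains roughly n/2 bits of security against a
# quantum attacker (ignoring the poor parallelizability of Grover search).
def grover_security_bits(key_bits: int) -> int:
    return key_bits // 2

assert grover_security_bits(128) == 64   # AES-128: roughly 64-bit security
assert grover_security_bits(256) == 128  # AES-256: still comfortable
```

That halving is why 128-bit keys are considered borderline post-quantum
while 256-bit keys are considered fine.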

Post-quantum cryptography efforts are about developing asymmetric encryption
(key exchange and signatures) that can be computed on a classical computer but
for which there are no known quantum attacks.

The biggest danger by far is that somebody figures something out and builds
such a quantum computer in secret, allowing them to snoop on all kinds of
secret communication and steal vast amounts of money before anyone figures out
something funny is going on. Unlike an atomic bomb or a new kind of aircraft
or satellite, such a quantum computer could be built and operated without
creating physical effects that anyone could see at a distance. Unless somebody
talks it could remain secret for quite a while.

~~~
The_rationalist
_we have reason to believe that such a quantum computer can probably be
built_

What are those reasons? Quantum computers could just as well be nonsensical
bullshit that has been funded since the 1940s.

~~~
api
There are a few QC skeptics with various arguments.

If you were the head of NIST (or the NSA), would you be willing to bet the
entire security of your civilian and military communications infrastructure
that these few skeptics are right? There were atomic bomb skeptics too, and it
took Einstein to convince the US government to ignore them and move forward.

Seems silly to make such a high stakes bet against the scientific consensus,
especially if classically computable algorithms that are both classically _and
quantum_ strong can be found and deployed and if doing so is not that
expensive.

Reminds me of Asimov's saying (paraphrasing): "When a distinguished but
elderly scientist says something is possible, they are probably right. When a
distinguished but elderly scientist says it is not, there's a non-trivial
chance they're wrong." In this case quite a few of distinguished elderly
physicists are saying you probably _can_ build a (useful non-toy) quantum
computer. If history is any guide, they're much more likely to be right than
wrong.

Edit: it's not a bad idea to develop new algorithms anyway just to have them
around. We don't think the trap door functions behind current asymmetric
crypto are classically reversible (in any practical amount of compute time),
but there is no mathematical proof of this. It's a strong conjecture that's
held up so far, but it's still a conjecture.

~~~
The_rationalist
Yes, my heuristically rational denial of the possibility of useful quantum
computing does not imply that we should not prepare even for the lowest risks,
especially when they're an existential threat. That being said, I do not
understand the need for this NIST competition. I had the belief that current
SHA256/512 _is_ quantum proof. Is that wrong? Why?

~~~
dependenttypes
> That being said, I do not understand the need for this NIST competition. I
> had the belief that current SHA256/512 is quantum proof. Is that wrong? Why?

SHA2, SHA3, and AES256 are all quantum proof, yes. This competition is about
asymmetric cryptography, though; it replaces non-quantum-proof algorithms such
as RSA, DSA, ECC, etc.

------
HashThis
Good

