
NIST Post-Quantum Cryptography Round 1 Submissions - sohkamyung
https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Round-1-Submissions
======
Nokinside
If you can tolerate ~1 MB public keys, Classic McEliece is probably the best
conservative and secure post-quantum option right now, before the others have
been analyzed more. It's old and proven.

>9 Advantages and limitations (2.B.6)

>The central advantage of this submission is security. See the design
rationale.

>Regarding efficiency, the use of random-looking linear codes with no visible
structure forces public-key sizes to be on the scale of a megabyte for
quantitatively high security: the public key is a full (generator/parity-
check) matrix. Key-generation software is also not very fast. Applications
must continue using each public key for long enough to handle the costs of
generating and distributing the key.

>There are, however, some compensating efficiency advantages. Encapsulation
and decapsulation are reasonably fast in software, and impressively fast in
hardware, due to the simple nature of the objects (binary vectors) and
operations (such as binary matrix-vector multiplications). Key generation is
also quite fast in hardware. The hardware speeds of key generation and
decoding are already demonstrated by our FPGA implementation. Encapsulation
takes only a single pass over a public key, allowing large public keys to be
streamed through small coprocessors and small devices.

>Furthermore, the ciphertexts are unusually small for post-quantum
cryptography: under 256 bytes for our suggested high-security parameter sets.
This allows ciphertexts to fit comfortably inside single network packets. The
small ciphertext size can be much more important for total traffic than the
large key size, depending on the ratio between how often keys are sent and how
often ciphertexts are sent. System parameters can be adjusted for even smaller
ciphertexts.
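
The "simple nature of the objects" mentioned above can be illustrated with a toy sketch (not actual McEliece parameters or code): multiplying a binary matrix by a binary vector over GF(2), where multiplication is AND and addition is XOR. Real parameter sets use matrices on the order of a megabyte; the 4x8 matrix here is purely illustrative.

```python
def gf2_matvec(rows, v):
    """Multiply a binary matrix (rows packed as ints) by a binary vector (an int) over GF(2)."""
    out = 0
    for i, row in enumerate(rows):
        # GF(2) dot product: AND the bits, then XOR-fold via popcount parity.
        bit = bin(row & v).count("1") & 1
        out |= bit << i
    return out

rows = [0b10110010, 0b01101001, 0b11100101, 0b00011110]  # toy 4x8 "public matrix"
v = 0b10101010                                            # toy binary vector
print(format(gf2_matvec(rows, v), "04b"))                 # -> 0001
```

Because each row is consumed once and never revisited, a computation like this can stream a large matrix through a small device, which is the point made about encapsulation above.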

~~~
hackcasual
Can't really see standardization settling on something that requires
~1,000-10,000x the overhead of its competitors. 1 MB public keys would rule
out things like IoT devices and blockchain-type usages, not to mention local
caching potentially taking gigabytes.

Hybrid secret-sharing schemes do a lot to mitigate both future quantum risk
and the present risk of broken cipher systems.

------
bascule
One of them has already fallen victim to a key recovery attack:

[https://twitter.com/yx7__/status/945283780851400704](https://twitter.com/yx7__/status/945283780851400704)

Round2 (LWE-based) and SIKE (isogeny-based) are the particularly interesting
ones to me. Both support comparatively small keys (~1 kB), with Round2
seemingly winning on performance, though it is also patented.

~~~
mort96
I never understood how or why patenting math is a thing.

~~~
garmaine
It is patenting the application of math.

~~~
mort96
How do you differentiate between math and the "application of math"?

~~~
garmaine
You can't patent an S-box transform. You can patent using a specific S-box
transform to encrypt / decrypt data. That's the justification at least.

~~~
betterunix2
More like you can patent math when it is applied via a computer. So you would
not be violating the patent by evaluating the S-box with pen and paper.

It makes no particular sense, but the theory behind math patents being out of
bounds is that math is a fact. Software patents, including crypto patents, do
not "feel" like facts to most judges, even if they technically are (see:
Curry-Howard Correspondence).

------
exabrial
I like to ask dumb questions... So here is one: Are forward secrecy guarantees
made by TLS valid in a post-quantum world?

Specifically, say I connect via TLS to a server that uses FS and exchange some
information that will be valuable in 150 years. If the entire conversation is
recorded at the wire, does quantum computing break this promise? (or have I
misunderstood FS completely?)

~~~
modalduality
In general, forward secrecy involves generating a new key for each
conversation with Diffie-Hellman or a similar key exchange.

According to
[https://crypto.stackexchange.com/questions/5610/diffie-hellman-is-post-quantum-secured](https://crypto.stackexchange.com/questions/5610/diffie-hellman-is-post-quantum-secured),
Diffie-Hellman is not post-quantum secure, but
[https://en.wikipedia.org/wiki/Supersingular_isogeny_key_exchange](https://en.wikipedia.org/wiki/Supersingular_isogeny_key_exchange)
is a DH alternative that is.
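
The per-conversation key generation can be sketched with a minimal finite-field Diffie-Hellman exchange (toy-sized group, not a safe real-world choice): each side picks a fresh ephemeral private key, and that freshness is what gives TLS its forward secrecy. The discrete log this relies on is exactly what Shor's algorithm breaks.

```python
import secrets

p = 2**64 - 59        # a prime, far too small for real-world use
g = 5                 # toy generator

a = secrets.randbelow(p - 2) + 1   # Alice's ephemeral private key
b = secrets.randbelow(p - 2) + 1   # Bob's ephemeral private key

A = pow(g, a, p)                   # public values sent over the wire
B = pow(g, b, p)

shared_alice = pow(B, a, p)        # g^(ab) mod p
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # both sides derive the same secret
```

Discarding `a` and `b` after the session is what makes later key compromise useless against a recorded transcript -- unless, as discussed above, the attacker can simply solve the discrete log.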

~~~
jgrahamc
Which is partly why we wrote and open sourced a SIDH implementation in TLS
1.3: [https://blog.cloudflare.com/sidh-go/](https://blog.cloudflare.com/sidh-go/)

~~~
newman314
Do you have any performance numbers for SIDH vs ECDHE?

~~~
grittygrease
Cloudflare has a Go implementation of SIDH with p751:
[https://blog.cloudflare.com/sidh-go/](https://blog.cloudflare.com/sidh-go/)

Here’s an overview of the performance from a patch by Armando Faz Hernandez:
[https://github.com/cloudflare/p751sidh/pull/2](https://github.com/cloudflare/p751sidh/pull/2)

------
late2part
I'm seriously considering moving to pre-shared keys. Vernor Vinge's books (_A
Deepness In The Sky_, others) talk about how much of interstellar shipping is
the secure transportation of random pre-shared keys.

Although I'm unfamiliar w/ anything more sophisticated than a password manager
like 1Password and S/Key - I'd like to see if I could have a go with this for
my personal kit.

Can anyone recommend keywords/papers that talk about pre-shared key
authentication at automated scale?

~~~
CiPHPerCoder
PSK isn't really a good solution, mostly because of scaling problems.

Today, all you need is a 32-byte ECDH private key, and a corresponding 32-byte
public key. To talk to any of the billions of devices on the Internet
securely, all you need is their 32-byte public key, and some means to verify
that it is authentic (i.e. certificate authorities + certificate
transparency). With this information, you can calculate a shared secret (or
use a sealing API given only their public key, if you really want).

Total local storage: PKI overhead plus your own keys, under a few megabytes.

Doing PSK at scale means storing billions of 32-byte binary strings. One for
each participant you intend to communicate with. They must also do this for
every possible device they wish to communicate with, otherwise others can just
eavesdrop on their conversations if keys are reused.

You've turned an O(n) space problem into an O(n^2) space problem.

If your n is small, you might still be tempted to make the move, but it will
quickly become unmanageable as n grows.
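
The scaling argument above can be made concrete with back-of-envelope arithmetic (numbers illustrative): with public-key crypto, the system as a whole stores one keypair per participant, O(n), while pairwise PSKs need a distinct secret per pair, O(n^2).

```python
KEY_BYTES = 32  # e.g. an X25519 key or a 256-bit PSK

def public_key_storage(n):
    # One keypair per participant, system-wide; peers' public keys
    # can be fetched and verified on demand via PKI.
    return n * KEY_BYTES

def psk_storage(n):
    # One distinct secret per unordered pair of participants,
    # since reusing a key lets third parties eavesdrop.
    return n * (n - 1) // 2 * KEY_BYTES

for n in (1_000, 1_000_000):
    print(f"n={n:>9,}: public-key {public_key_storage(n):>18,} B, "
          f"PSK {psk_storage(n):>22,} B")
```

At a million participants the pairwise PSK figure is already around 16 TB of secrets system-wide, which is the unmanageability being described.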

Furthermore, if a device loses its key, you must:

1. Generate/share a new keypair with n devices (which at Internet scale is
billions), and

2. Do so in a way that doesn't open the door for backdoors, which probably
requires digital signatures, and therefore you're not actually getting rid of
public-key cryptography.

I think at Internet scale, we're still years away from having reliable post-
quantum cryptography to even recommend, let alone adopt, but that's going to
be the way to go. Something something baby and bathwater.

~~~
colmmacc
Passwords are PSKs and are the overwhelming default authentication mechanism
for Internet scale systems.

Storing, distributing, and synchronizing billions of small blobs is actually a
solved problem and not that big a deal. I wouldn't be so quick to write it
off, because it has a real, working, easily understood invalidation and
revocation story, which PKI does not.

A system that combines asymmetric crypto public-private key pairs with the
distributed-whitelist properties of simple passwords, by treating the public
keys as PSKs, has a lot of compelling benefits.

~~~
CiPHPerCoder
> Passwords are PSKs and are the overwhelming default authentication mechanism
> for Internet scale systems.

Not for HTTPS, which is what we're talking about.

For WPA2-PSK, okay, you have a small n, it might work.

For user authentication against a local database, you're not encrypting;
you're validating against a password hash and then elevating the user's
privileges for the duration of their session. That's far removed from what we
mean when we say PSK (pre-shared keys), where _encryption is implied_.

~~~
late2part
To be fair, that’s what you’re talking about. I am mostly talking about SSH
access.

------
Klathmon
Does NIST require submissions to be free (as in open software)?

Or is this going to be a bunch of advertising for proprietary patented
algorithms?

~~~
IncRnd
This process is meant to produce PQC (post-quantum crypto) algorithms driven
by the industry. In practice, as long as quantum computers don't both work and
become scalable in the interim, you have another 4 years before the
recommendations become a worry.

~~~
mulmen
That's not really true though is it? What about any of my data that is
encrypted today with a non-PQC algorithm and has been intercepted? If quantum
computers become functional and scalable while that data is still of value to
me then I need to worry.

~~~
IncRnd
Yes, it's true. I spoke about future pqc recommendations from NIST at a
scheduled time. I never said there was nothing to worry about, but that NIST's
future pqc recommendations don't need to be worried about today.

You are making a reference to something other than the pqc future
recommendations from NIST. Of course, if the agility is low, requiring a large
ramp up time, a company can turn to the CNSA values today.

As you point out, working scalability is the issue. Even when there is a
computer that can run Grover or Shor over something, the number of qubits
will be the issue.

------
yalogin
Are there any blogs that talk about what’s required in a crypto algorithm to
withstand quantum computing? Is straightforwardly increasing the key size
enough, or are new paradigms being explored? Are these algorithms symmetric or
asymmetric?

~~~
betterunix2
PQ crypto has to be secure against a _scalable_ quantum computer, i.e. one
that can be enlarged to attack arbitrarily large key sizes. Building a quantum
computer with enough qubits to attack RSA-2048 is not a problem if there is no
known way to increase the number of qubits to attack RSA-3072 -- but if you
can keep adding more qubits, then we need a post-quantum algorithm.

The interest in post-quantum crypto is mainly in public-key systems, because
for symmetric key systems you can essentially double the key size to achieve
the same security level. In other words, AES256 is as secure against a quantum
computer as AES128 is against a classical computer. On the other hand, quantum
algorithms break RSA and DLOG (including ECDLOG), so the research has been on
new approaches to public key crypto that _seem_ hard for quantum computers.
For example, the Learning With Errors (LWE) problem:

[https://en.wikipedia.org/wiki/Learning_with_errors](https://en.wikipedia.org/wiki/Learning_with_errors)
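
A toy LWE instance can make the idea concrete (parameters far too small for security, purely illustrative): the public data is a random matrix A and a vector b = A·s + e mod q, and recovering the secret s from (A, b) is believed hard even for quantum computers once the small noise e is present.

```python
import numpy as np

rng = np.random.default_rng(42)
q, n, m = 97, 4, 8                      # toy modulus and dimensions

s = rng.integers(0, q, size=n)          # secret vector
A = rng.integers(0, q, size=(m, n))     # uniformly random public matrix
e = rng.integers(-2, 3, size=m)         # small centered noise
b = (A @ s + e) % q                     # published LWE samples

# Only someone who already knows s can strip A @ s and see that the
# residual is small noise; without s, (A, b) looks close to uniform.
residual = (b - A @ s) % q
print(residual)
```

Without the noise `e`, recovering `s` would be simple Gaussian elimination; the noise is exactly what turns the linear system into a hard lattice problem.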

~~~
yalogin
Interesting. But a 512-bit EC key theoretically has the same strength as a
256-bit AES key. How is AES able to withstand it when EC and RSA can’t?

~~~
CiPHPerCoder
AES's security reduction in the quantum scenario comes from a sqrt(n)
reduction in search time. It's an optimized brute force against the key.

ECC gets broken because there is a quantum algorithm that solves the Elliptic
Curve Discrete Logarithm Problem much faster than sqrt(n).
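
The arithmetic behind that comparison can be sketched in a few lines: Grover's search over a k-bit keyspace takes about 2^(k/2) steps (the sqrt(n) speed-up), so a symmetric key keeps roughly half its bits of security, whereas Shor breaks ECDLP outright rather than merely speeding up search.

```python
def classical_security_bits(key_bits):
    # Brute force tries ~2^k keys.
    return key_bits

def grover_security_bits(key_bits):
    # Grover's algorithm needs only ~2^(k/2) quantum evaluations.
    return key_bits // 2

# AES-256 under Grover is comparable to AES-128 under classical attack.
assert grover_security_bits(256) == classical_security_bits(128)
print(grover_security_bits(128), grover_security_bits(256))  # -> 64 128
```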

~~~
yalogin
Thanks. Looks like I have a wrong understanding of quantum computing. I
thought QC increases computing power by a lot, thereby making it easier or
practical to compute the discrete log. So are you saying that the QC paradigm
lends itself to a different class of algorithms?

~~~
CiPHPerCoder
Yes, look up Shor's Algorithm and Grover's Algorithm.

------
algorithmsRcool
This seems to have generated a lot of collaboration.

DJB is listed on 5 separate submissions.

Philippe Gaborit is on 8!

------
maxharris
If I were you, I wouldn't be so quick to trust any cryptography standard set
by a government agency. They have an obvious incentive to ensure that _any_
standard that is eventually set be open to attack by whatever means the NSA
has (or will have in the near future).

So pay careful attention to things they _reject_ , or things they modify!

I have no insider knowledge here. I'm just applying common sense...

------
Sami_Lehtinen
Dupe:
[https://news.ycombinator.com/item?id=15981884](https://news.ycombinator.com/item?id=15981884)

~~~
dang
I guess we'll leave this one up since that one didn't attract discussion while
this one has.

------
AgentME
I sure hope we can get a post-quantum version of Dual_EC_DRBG from the
trustworthy people at NIST. /s

Not that I especially think any of these algorithms are going to be backdoored
(though I certainly hope they get extra scrutiny just in case). I'm mostly
just disappointed that there aren't any real consequences for a group like
NIST for rubber-stamping backdoored things.

