
PQProteus – Quantum-resistance for the prekey exchange of Proteus - neongreen
https://github.com/wireapp/pqproteus
======
throwawaymath
There's not a lot of information given here. The blog post is better [1].

Long story short, they decided to augment Wire with experimental post-quantum
security using the NewHope key exchange scheme.

You can read more about NewHope here [2]. It's a lattice-based cryptosystem
using the Ring-Learning With Errors (R-LWE) problem. It's also the same post-
quantum key exchange scheme Google experimented with in Google Chrome [3].
R-LWE is pretty common in state-of-the-art lattice-based cryptosystems (there
are several such schemes, including NewHope, in Round 1 of the NIST PQCrypto
CFP [4]).

Among the mathematical "tribes" of post-quantum cryptography, lattice-based
(and code-based) problems are particularly good for speed. On the other hand,
their key sizes are significantly larger (this phenomenon is somewhat inverted
in supersingular isogenies, which offer fantastic key sizes but much slower
key exchange). For those interested in learning more about the learning with
errors problem (and its ring-augmented cousin), the first few pages of the
NewHope specification (and most lattice-based specs from NIST PQCrypto) are a
good brief [5]. And while it's not related to NewHope specifically, Peikert's
survey on lattice-based cryptography is relatively recent and accessible [6].
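To give a flavor of the idea underlying (R-)LWE key exchange without the ring machinery: both parties publish noisy linear images of small secrets, end up with approximately equal shared values, and a reconciliation step turns those into identical bits. Here's a toy, wildly insecure sketch with made-up parameters (q = 97, n = 8; real NewHope works over a polynomial ring with n = 1024 and includes reconciliation):

```python
# Toy (insecure!) sketch of the approximate-agreement idea behind
# LWE-based key exchange such as NewHope. Parameters are illustrative
# only; real schemes use ring polynomials and a reconciliation step.
import random

random.seed(1)
q, n = 97, 8  # tiny toy modulus and dimension

def small_vec(n):
    # Secrets and errors are "small": entries in {-1, 0, 1}.
    return [random.choice([-1, 0, 1]) for _ in range(n)]

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) % q for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v)) % q

A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
At = [list(col) for col in zip(*A)]  # transpose of A

# Alice publishes b = A*s + e; Bob publishes b2 = A^T*s2 + e2.
s, e = small_vec(n), small_vec(n)
s2, e2 = small_vec(n), small_vec(n)
b = [(x + y) % q for x, y in zip(mat_vec(A, s), e)]
b2 = [(x + y) % q for x, y in zip(mat_vec(At, s2), e2)]

# Both sides compute roughly s2^T * A * s; they differ only by
# the cross terms e2.s - e.s2, which are small by construction.
k_alice = dot(b2, s)  # = s2^T*A*s + e2.s  (mod q)
k_bob = dot(b, s2)    # = s2^T*A*s + e.s2  (mod q)

diff = (k_alice - k_bob) % q
diff = min(diff, q - diff)  # centered difference
print(diff)  # bounded by 2n = 16, far below q/2
assert diff <= 2 * n
```

The agreement error is exactly e2·s − e·s2, which is small because both secrets and errors are small; that bound is what makes a reconciliation step (rounding to a shared bit) possible.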

I'm interested in what impact this will have on latency in Wire. In the
context of the Google Chrome TLS experiment, the median connection latency
increased by 1ms, the slowest 5% by 20ms, and the slowest 1% by 150ms [7].
The increased connection latency was attributed to the larger message sizes,
which I find interesting given that we (generally) consider key size and
operation speed separately.

______________

1. https://blog.wire.com/blog/post-quantum-resistance-wire

2. https://newhopecrypto.org

3. https://security.googleblog.com/2016/07/experimenting-with-post-quantum.html?m=1

4. https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Round-1-Submissions

5. https://newhopecrypto.org/data/NewHope_2017_12_21.pdf

6. https://web.eecs.umich.edu/~cpeikert/pubs/lattice-survey.pdf

7. https://www.imperialviolet.org/2016/11/28/cecpq1.html

------
jarfil
"fewer than 50 quantum bits"

D-Wave's 2000Q, released in 2017, is supposed to have 2048 qubits.

~~~
laser
That's marketing-speak from D-Wave, though: its machines are quantum
annealers, not general gate-model computers. 72 gate-model qubits (Google's
Bristlecone) [1] is state-of-the-art.

[1] https://www.technologyreview.com/s/610274/google-thinks-its-close-to-quantum-supremacy-heres-what-that-really-means/

~~~
mirimir
That's a great resource. Thanks.

The noise issue does seem problematic. Especially once you're beyond what's
doable classically. I suppose that you could just check for reproducibility.
But maybe there could be systematic problems that wouldn't show up doing that.

But for cracking encryption, the end result seems pretty clear. You either
get sensible plaintext, or you don't. And you can try multiple times.
Systematic effects that consistently yield incorrect plaintext seem unlikely.

Or am I just confused?

~~~
hobls
It’s not always obvious whether you’ve gotten sensible plaintext, especially
if the plaintext is a binary file format you weren’t expecting, or if the
plaintext has been encrypted multiple times.
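To make this concrete: a naive "does it look like text?" check, such as the hypothetical printable-byte-ratio heuristic sketched below (not from the thread), accepts English but wrongly rejects a correct decryption whose plaintext is binary, compressed, or itself ciphertext:

```python
# Hedged sketch: a naive "sensible plaintext" heuristic based on the
# fraction of printable ASCII bytes, and where it breaks down.
def looks_like_text(data: bytes, threshold: float = 0.95) -> bool:
    # Count printable ASCII plus tab/newline/carriage return.
    printable = sum(32 <= b < 127 or b in (9, 10, 13) for b in data)
    return printable / len(data) >= threshold

assert looks_like_text(b"Attack at dawn.")    # English text: passes
assert not looks_like_text(bytes(range(256))) # uniform binary: fails
# A gzip-compressed or doubly-encrypted plaintext is statistically close
# to random bytes, so this heuristic would reject a correct decryption.
```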

