
Post-quantum RSA - colinprince
https://eprint.iacr.org/2017/351
======
swordswinger12
THIS PAPER IS:

a nice, thought-provoking paper that is nevertheless _purely speculative_ ,
and also more than a little tongue-in-cheek in parts. (Note that one of the
keywords is "Make RSA Great Again".) It's basically pointing out that there is
a small gap between the costs of legitimately using RSA and factoring via
Shor's algorithm. With truly massive keys, this gap can be exploited to
provide concrete security guarantees for RSA even in the presence of fast
quantum computers.

THIS PAPER IS NOT:

a revolutionary new quantum factoring algorithm. They show how to apply
quantum Grover search to a particular step of a known factorization algorithm
based on elliptic curves.

QUANTUM COMPUTING DOES:

break public-key cryptosystems based on the hardness of factoring and finding
discrete logarithms. Additionally, it gives a quadratic speedup in generic
search problems (via Grover search) which effectively halves the key size of
symmetric encryption schemes and yields faster algorithms for things like
finding collisions in hash functions.
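The "halves the key size" claim is just query-count arithmetic; here is a back-of-the-envelope sketch in Python (not a quantum simulation; it only compares expected operation counts): a classical brute-force search of a 2^n keyspace expects about 2^(n-1) guesses, while Grover's algorithm needs roughly (pi/4)*sqrt(2^n) oracle queries, i.e. about 2^(n/2).

```python
import math

def classical_guesses(n_bits):
    # Expected number of classical brute-force guesses over a 2^n keyspace.
    return 2 ** (n_bits - 1)

def grover_queries(n_bits):
    # Grover's algorithm needs roughly (pi/4) * sqrt(2^n) oracle queries.
    return (math.pi / 4) * math.sqrt(2 ** n_bits)

# A 128-bit key falls to ~2^64 quantum work; a 256-bit key still costs ~2^128.
for n in (128, 256):
    print(f"{n}-bit key: classical ~2^{math.log2(classical_guesses(n)):.0f}, "
          f"Grover ~2^{math.log2(grover_queries(n)):.1f}")
```

So doubling the symmetric key length (e.g. AES-128 to AES-256) restores the original security margin against this generic attack.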

QUANTUM COMPUTING (PROBABLY) DOES NOT:

break all asymmetric cryptography. There is no known proof of this, but it's
widely believed that asymmetric cryptosystems based on different hardness
assumptions are not vulnerable to (non-generic) attacks by quantum computers.

~~~
VLM
"With truly massive keys"

My favorite quote from the paper on that topic is

"Something expensive might nevertheless be useful if it is the least expensive
way to achieve the user’s desired security goal"

Post-quantum crypto isn't the end of crypto; it's the end of the meme that
nation-state-proof cryptography is "too cheap to meter" (a nuclear-power
phrase from the old days).

The pre-quantum landscape is cheap unbreakable algos implemented in easily
breakable ways as a part of easily breakable systems, so there's quite a bit
of criminal activity going on out there. The post-quantum landscape is very
expensive unbreakable algos implemented by professionals because the cost
(computational and financial) is large. The involvement of professionals
indicates that criminal activity based on breaking crypto implementations is
likely to drop.

It's likely that Hollywood-style multiple levels of crypto will soon exist.
The days of being able to keep the NSA out of your wifi access point will be
over unless you're willing to spend ridiculous amounts of money. However, for
a very modest expense it'll still be possible to keep your neighbor off your
wifi. Wide open to nation states, sure, but my wifi-leeching neighbor is a
more realistic adversary than the entire NSA.

~~~
openasocket
I disagree with the premise. There's no reason to assume that we can't
implement a quantum-resistant asymmetric encryption system that is no more
computationally expensive than what we already have.

Hell, there's already work out there
([http://eprint.iacr.org/2014/599.pdf](http://eprint.iacr.org/2014/599.pdf))
for a quantum-resistant key exchange algorithm that's fast enough to have only
21% less throughput than traditional elliptic-curve Diffie-Hellman.

~~~
VLM
That's a good paper. I thought about it a while and would retract my remark
about it being expensive and replace it with: expensive or risky. There's also
the problem of "what is the definition of post-quantum", where a lot of people
call everything after Shor in 1995 or so "PQ"; maybe I'd retract that part of
my comment and call it something like "way, way far in the future, which by
whatever definition of post-quantum is also post-quantum; let's call it 3000 AD".

The trick in the big-key RSA paper is knowing how Shor works: the average
big-O is low for a randomly selected large number, but the maximal number of
ops that could be required is extremely large, and such a number is "easily"
constructible. So merely intentionally and methodically (and expensively)
manufacture a "poison pill" composite that Shor can't feasibly factor, because
that number was constructed specifically to make the Shor algo fail, and your
RSA system will be good.
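For what it's worth, the gap the pqRSA paper itself leans on is the one the top comment describes: legitimate multi-prime RSA operations cost roughly n times log factors in the key size n, while Shor needs on the order of n^2 qubit operations, so at terabyte-scale keys the attacker's cost dwarfs the user's. A rough sketch (the exponents below are illustrative approximations, not the paper's exact analysis):

```python
import math

def legit_cost(n_bits):
    # Rough cost of legitimate multi-prime RSA operations: ~ n * (log n)^2.
    # Illustrative exponent, not the paper's exact figure.
    return n_bits * math.log2(n_bits) ** 2

def shor_cost(n_bits):
    # Rough cost of Shor's algorithm on an n-bit modulus: ~ n^2 * log n
    # qubit operations (again illustrative).
    return n_bits ** 2 * math.log2(n_bits)

n = 8 * 2 ** 40  # a 1-terabyte modulus, the scale pqRSA contemplates
ratio = shor_cost(n) / legit_cost(n)
print(f"Shor/legitimate cost ratio at a 1 TB key: ~2^{math.log2(ratio):.0f}")
```

The point is only that the ratio grows with n; the paper's contribution is arguing the constants still leave RSA usable at that scale.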

The algo in your linked paper is very convenient. However, it's newer and not
as seasoned, and the wolves are snapping at its heels: there was that paper
around Thanksgiving last year (which did get retracted) claiming to partially
quantum-break LWE.

[https://arxiv.org/abs/1611.06999](https://arxiv.org/abs/1611.06999)

I'd like to see a proof that there exists no possible theoretical quantum
speedup against LWE. Yes, I know there isn't one currently described or
implemented; I mean I can't find a proof that there cannot exist an attack
against LWE. Or a proof that a non-quantum attack against LWE cannot exist,
because in the end it would imply 1=2 or something invalid, so by
contradiction it's unbreakable.

In the year 3000 I think the odds are good that algos will still have a large
ratio between the average ops when run against a random number and the maximal
ops when run against a constructable poison pill designed to require the
maximum number of ops. Yet in the year 3000 I would not have faith that any
one algo that hadn't been broken by mid-2017 will never be broken. It seems
better to bet on math as a field continuing to have algos with a wide spread
than to bet on any one algo.

In that way I think it likely we'll get stuck with something like big key RSA
over a long enough time period.

While at the same time I can agree with you: in the post-quantum world
starting with Shor in '94 and LWE in 2005, there was a post-quantum era from
2005 to at least mid-2017 where LWE was a fast, unbroken post-quantum system.
Will that end date extend to 3000 AD?

Right now I'm thinking about whether the Shor trick could be applied to a
theoretically quantum-broken RLWE-KEX. Random discrete Gaussian sampling is
random... Is it possible to calculate poison-pill samples that would mess up a
theoretically broken RLWE-KEX (depending a lot on the characteristics of an
imaginary break)? I found slides by Peikert online about how those samples are
selected, which I haven't made sense of yet.

~~~
tptacek
LWE isn't the only problem that's believed to be quantum-hard that can be used
for a reasonably efficient DH primitive. Curve isogenies also appear to work,
and they can take advantage of a tremendous amount of optimization work we've
already done for curve math. There are also other CVP/SVP lattice problems.
And this is sticking to problems that give us the same protocol-level
interface; there are also quantum-hard public key algorithms that don't do
that.

The more important point here, though, is that there's nothing particularly
"expensive" about this stuff. It's all slower than RSA and curves, but (in
particular) curves have gotten much faster over the last 10 years, and
obviously computers are getting faster and silicon offloads less expensive, so
the performance setback here isn't that big.

I'm also not sure why you have so much confidence in the conventional strength
of RSA.

~~~
tptacek
... not, by the way, that I'm willing to stipulate that LWE is all that risky;
the underlying problems in LWE are also very well studied, preceding concerns
about QC by a long time.

------
komali2
> this paper introduces a new quantum factorization algorithm, GEECM,

Am I blind or did they not include the actual algorithm in the paper? I'm not
very knowledgeable in this sort of stuff, so to be fair I was skimming for
just a simple write-out of the algorithm; I may have just missed it.

~~~
cwmma
page 7 has it

------
StavrosK
> As part of the attack analysis, this paper introduces a new quantum
> factorization algorithm that is often much faster than Shor's algorithm

Isn't that also huge news, depending on the definition of "often"?

~~~
mr_overalls
IANAC (I am not a cryptographer), but this would seem to be big news, given
the hardware to run it.

I wonder if a quantum computing revolution would cause a resurgence of
research into symmetric encryption algorithms?

Because doesn't Shor's algorithm basically destroy asymmetric encryption
(i.e., integer-factorization schemes like RSA are solvable in polynomial
time), while symmetric encryption is only weakened? Namely, I think I remember
reading that a quantum computer can search through a space of size 2^n in time
2^(n/2) for AES & friends.

~~~
xoa
> _Namely, I think I remember reading that a quantum computer can search
> through a space of size 2^n in time 2^(n/2) for AES & friends._

I believe you're thinking of Grover's algorithm, which as you remember offers
the ability to search a function's domain for a value in O(sqrt(N)). This is
trivially countered just by doubling the key length: a 128-bit key probably
couldn't be considered secure (2^64 evaluations on average), but a 256-bit key
would still average 2^128, and a 512-bit key 2^256. I vaguely recall there was
some speculation that applied de Broglie–Bohm theory, if it were accurate,
could allow a mild improvement, but even then it only changed the square root
to a cube root, which again could be countered with slightly larger keys (in
practice 512-bit would still be plenty even then).
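The key-length arithmetic above tabulates neatly (a sketch of the standard rule of thumb; the cube-root column covers the speculative improvement just mentioned):

```python
def effective_bits(key_bits, root):
    # Effective security in bits when the best attack runs in 2^(key_bits/root):
    # root=1 is classical brute force, root=2 is Grover's square-root speedup,
    # root=3 the speculative cube-root improvement.
    return key_bits // root

for k in (128, 256, 512):
    print(f"{k}-bit key: classical {effective_bits(k, 1)}, "
          f"Grover {effective_bits(k, 2)}, cube-root {effective_bits(k, 3)}")
```

Even under the cube-root speculation a 512-bit key keeps roughly 170 bits of effective security, which is still plenty.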

At least for now there don't appear to be any major algorithmic implications
for symmetric encryption from a theoretical future scalable general-purpose
quantum computer. Granted, there may be some practical historical ones, and in
a few cases they could be significant. If agencies have archived data of
interest encrypted with 128-bit keys in the past, they may gain the ability to
access it within the next decade or two. A lot of it might be well past its
proverbial sell-by date, but not all, if those using it were counting on it
lasting much longer than they'd be alive.

~~~
swordswinger12
There is a lower bound by Bennett et al. ("Strengths and Weaknesses of Quantum
Computing", SIAM J. Computing 1997) showing that Grover's algorithm is optimal
for the generic search problem, so further lowering the asymptotic runtime is
probably impossible.

EDIT: Also note that quantum cryptanalytic attacks may be able to break
_specific_ symmetric ciphers by exploiting some structure specific to the
cipher. But you're correct that QCs don't generically break symmetric
cryptography.

