

A subfield-logarithm attack against ideal lattices - pedro84
http://blog.cr.yp.to/20140213-ideal.html

======
pbsd
This sort of thing is a recurring theme in cryptography:

1\. Someone proposes a scheme based on Hard Problem X. X looks strong, but the
resulting scheme either is too slow or has gigantic keys.

2\. Someone else comes along and proposes a related scheme based on Hard
Problem Y which, having more structure, allows for either smaller keys or
faster computation.

3\. Later it turns out this extra structure also helps the attacker.

An example of this phenomenon is the McEliece code-based cryptosystem. Many
variants based on alternative codes, attempting to reduce the public key size,
have been proposed over the years, and very few have survived. Another example
is elliptic curves: early on, speed was an issue for their practicality, so
many weak curves that tried to speed things up were also proposed (one
particular example was Koblitz's supersingular curves, which rendered point
doubling a linear operation).

Ideal lattices have exacerbated this phenomenon because of their applications.
Lattices are a key tool in fully homomorphic encryption and friends
(multilinear maps, now also obfuscation), and in the frenzy to make these
applications practical, ideal lattices (as opposed to unstructured ones) seem
to be the fastest shortcut to better speed and size. It remains to be seen
whether they'll survive.

~~~
yoha
Yes, that's exactly how cryptography research works: first, find and publish a
plausibly secure and practical scheme, then have other researchers try to
break it. If it does not break, it is considered secure.

~~~
andrewcooke
that's not really what he's saying. he's saying that adding structure for
speed also tends to add structure for breaking.

practical, modern crypto (in my limited understanding) seems to be
increasingly about choosing the right "place" to do your maths - somewhere
that's got just enough structure to do the encryption, without enough for the
_known_ attacks to be implemented.

but that has two issues. one, people push the boundary for speed. two, because
you're close to the edge you are arguably more vulnerable to new (incremental)
advances.

it makes you wonder whether all this new-fangled crypto based on clever maths
is such a good idea. maybe the old school block ciphers that simply throw
enough non-commuting, non-linear operations at the problem to make it "too
hard" are smarter than they look. but of course, they tend to be symmetric
only (and i am not sure the distinction above is real; you can try to
formalise why old block cipher systems are hard, and if you do so you come up
with things like aes, and then again you open the door to worrying about
algebraic attacks...)

~~~
yoha
> that's not really what he's saying. he's saying that adding structure for
> speed also tends to add structure for breaking.

I may not have been clear in my reasoning. Modern crypto research does rely on
stress-testing newly crafted cryptosystems against peer researchers, who will
try to find a flaw in the system, think of a way to apply an existing attack,
or craft a new, adapted one. That is to say, published hash function or block
cipher schemes are not necessarily secure; the peer review that happens before
publication only (partially) ensures that the article is well written and
checks for blatant scientific errors. The consequence is that every single
change to a cryptosystem (using structures optimized for speed, memory, key
sizes, …, changing the key-generation algorithm, using parallelism, or even
adding a feature that uses private data such as the secret key) may result in
a partial or total compromise of its security.

So, to be clear: yes, using time-optimized structures for crypto may lead to
new attacks, but this is just a particular case, and it does not in any way
mean that faster necessarily means weaker, just that crypto is hard, since
anything can go wrong (consider timing attacks, for instance).
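As a concrete illustration of that last remark (not from the thread, just a
minimal sketch of the classic pitfall): an early-exit byte comparison leaks,
through its running time, how many leading bytes of a guess are correct, even
when the surrounding math is sound. Python's `hmac.compare_digest` is the
standard constant-time alternative.

```python
import hmac

SECRET = b"s3cr3t-key"  # hypothetical secret, for illustration only

def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns as soon as a byte differs, so the running time grows
    # with the length of the matching prefix: a timing side channel.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where
    # the first mismatch occurs, hiding the matching-prefix length.
    return hmac.compare_digest(a, b)

# Both agree on the answer; only their timing behaviour differs.
print(naive_equal(b"s3cr3t-key", SECRET))          # True
print(constant_time_equal(b"guess!-key", SECRET))  # False
```

An attacker who can measure the naive version precisely enough can recover the
secret one byte at a time, which is why the functional equivalence of the two
functions is beside the point.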

> it makes you wonder whether all this new-fangled crypto based on clever
> maths is such a good idea. maybe the old school block ciphers that simply
> throw enough non-commuting, non-linear operations at the problem to make it
> "too hard" are smarter than they look. but of course, they tend to be
> symmetric only (and i am not sure the distinction above is real; you can try
> to formalise why old block cipher systems are hard, and if you do so you
> come up with things like aes, and then again you open the door to worrying
> about algebraic attacks...)

This is a usual rant against “post-quantum” crypto. Old schemes were not all
bad, but studying their security was actually impractical. Consider DES, which
basically just mangles bits with other bits depending on previous bits. There
is virtually no connection between studying the difficulty of cracking DES and
the rest of research: you have to look at individual bit propagation and other
dirty stuff (and of course check for obvious attacks using linear
cryptanalysis). The point is that, this way, you only get limited public
research on the matter. Maybe a few hundred people will seriously try to find
a way to break DES, and they won't dedicate their careers to it. This means
that the probability of an effective attack being discovered privately (say,
by the NSA) is not so low.

On the other hand, when you know that most attacks against your cryptosystem
would imply solving an NP-problem, you can rely on a ton of research material
(in mathematics and computer science). The point is that solving _any_
NP-problem would be a revolution in many fields of mathematics (especially
CS-related ones). In other words, you have thousands upon thousands of
researchers devoting a lot of their time to trying to break your cipher. And
the probability that a lone organization finds a solution first is very low.

Of course, as noted by the original author, you could still have other
attacks, that would not compromise the NP-hardness:

> The same theoreticians also say that lattice-based cryptography has "strong
> provable security guarantees". If this is taken literally then it is false
> advertising. The correct advertising is that a broad class of attacks
> against various lattice-based cryptosystems can be converted, with limited
> loss of speed and success probability, into attacks against certain standard
> lattice problems.

This is exactly what happens when adding "structure" to your keys. Relying on
an NP-problem basically means that what your adversary sees is
indistinguishable from random (when solving an NP-problem, your solution must
work for _all_ instances). So the "mangling" from old crypto is brought back
here. However, this way is better, as the part that has to be studied is
smaller and more flexible: we are now considering controlled data, and we can
make sure the mangling yields a good distribution.

The attack presented here basically does that: it found a flaw in the way one
lattice-based crypto scheme relied on an NP-problem.

Note: by "solving an NP-problem", I mean finding a polynomial-time solution to
a problem known to be NP-hard.

tl;dr: "this new-fangled crypto" is not about using "clever maths" but about
having a larger pool of knowledge to rely on.

------
yoha
Bernstein's papers are very interesting and contain strong material. He draws
on an excellent theoretical background but keeps practical considerations in
sight. The cryptographic constructions he has published are very efficient
(e.g. RFSB [1] is way faster than all other code-based hash functions). I
should also add that this blog entry is a good illustration of how clear his
explanations can be.

[1] see [rfsb] in
[http://cr.yp.to/codes.html#rfsb](http://cr.yp.to/codes.html#rfsb)

------
diziet
For more background on ideal-lattice-based encryption see
[https://www.cs.cmu.edu/~odonnell/hits09/gentry-homomorphic-encryption.pdf](https://www.cs.cmu.edu/~odonnell/hits09/gentry-homomorphic-encryption.pdf)
(pdf).

~~~
darkmighty
Just a note: this paper includes a small introduction to ideal-lattice-based
encryption, but its topic (homomorphic encryption) is not necessarily tied to
lattice-based crypto; the author has since provided different and more
efficient constructions.

