

Math Advances Suggest RSA Encryption Could Fall Within 5 Years - codesuela
https://www.technologyreview.com/news/517781/math-advances-raise-the-prospect-of-an-internet-security-crisis/

======
Mithrandir
I think this comment from djao has some additional insights:

"It would have been nice if TR had asked an academic researcher for comments.
After all, if discrete logarithms are ever broken, it'll likely be by academic
researchers.

Diffie-Hellman is based on discrete logarithms, but RSA is not. RSA is based
on integer factorization, not discrete logarithms. Integer factorization and
discrete logarithms share some common techniques, but neither is known to be
based on the other, and opinion is split as to whether they are truly
equivalent. I can't think of a single respectable academic researcher who
thinks that the latest results threaten RSA.

The new discrete logarithm attacks require the existence of a nontrivial
intermediate subfield, and work best in small characteristic. They do not
apply to the most common instances of Diffie-Hellman in deployment (no
subfields, large characteristic), and we currently have no realistic
approaches that could make them apply. A similar story actually played out
with ECC some 10-15 years ago, when Weil descent attacks were new. Those
attacks on ECC also require an intermediate subfield, and 10 years of follow-
on research has been unable to sidestep this requirement. To date, there is no
indication that Weil descent can be used on the most common instances of ECC
being deployed today, and nobody is going around saying ECC is at risk from
new algorithms. (Quantum computers, yes, but not new algorithms.)

It is possible that the new discrete logarithm attacks will extend to cases
with large characteristic and no intermediate subfields, but I personally
think that it is extremely unlikely. I would not put the chances at "small but
definite." It would be a major surprise if this happened. Even if it did, RSA
may still be safe. I can't speak for others, but my sense of the crypto
community is that most experts agree with my assessment."

~~~
ColinWright

      > Diffie-Hellman is based on discrete logarithms,
      > but RSA is not.
    

Well, sort of, but not entirely. Yes, the obvious way to break RSA is to
factor the modulus, and then compute the private key from the public key.

But that's not the only thing.

You encrypt by raising the message M to the power of the public key _e_ modulo
the public modulus _n._

    
    
        E = M^e (mod n)
    

That means that

    
    
        log_e(E) = M (mod n)  [ This is wrong - see below ]
    

If you can compute logarithms base _e_ modulo _n_ then given the public
information, you can recover M.

So unrestricted discrete logs let you break RSA.

 _Added in edit:_

Sorry, I misspoke as I was (and still am) in a hurry. As benmmurphy
correctly points out, this is wrong, but not in an unfixable way. In short:

 _(all done mod n)_

    
    
        E = M^e (mod n)
        log(E) = e.log(M)
        log(E)/e = log(M)
        exp(log(E)/e) = exp(log(M)) = M
    

Again, unrestricted discrete log lets you break RSA.

PS: There's a non-zero chance I screwed up again - feel free to say so!
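To make the corrected derivation concrete, here's a toy sketch in Python (numbers far too small to mean anything; a brute-force search stands in for the hypothetical discrete-log oracle, and M is taken from the subgroup generated by g, since (Z/n)* isn't cyclic for an RSA modulus):

```python
# Toy RSA parameters: n = 11 * 13, public exponent e = 7.
n, e = 143, 7
g = 2             # base for the logs; the order of 2 mod 143 is 60
order = 60
M = pow(g, 5, n)  # a message in the subgroup generated by g (M = 32)
E = pow(M, e, n)  # RSA "encryption"

# Brute-force discrete log, standing in for the hypothetical oracle.
def dlog(x):
    return next(k for k in range(order) if pow(g, k, n) == x)

# log(E) = e*log(M) mod order, so log(M) = log(E) * e^-1 mod order.
inv_e = pow(e, -1, order)  # requires gcd(e, order) = 1; Python 3.8+
recovered = pow(g, dlog(E) * inv_e % order, n)
assert recovered == M      # M recovered without ever factoring n
```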

~~~
benmmurphy
this doesn't look like log to me. i think the function you want is e'th root.

~~~
barrkel
n'th root of x = exp(log(x) / n)

------
ceautery
Isn't there an article like this every year, always long on FUD and short on
math?

~~~
IvyMike
C'mon man, you're better than that. I'm sure you know how to use google to
find the black hat presentation:

[https://www.isecpartners.com/media/105564/ritter_samuel_stam...](https://www.isecpartners.com/media/105564/ritter_samuel_stamos_bh_2013_cryptopocalypse.pdf)

Which you then know how to use to find the papers by Joux.

[http://eprint.iacr.org/2013/095.pdf](http://eprint.iacr.org/2013/095.pdf)
[http://eprint.iacr.org/2013/400.pdf](http://eprint.iacr.org/2013/400.pdf)

A lot of smart and proven people put together this information and if they're
worried, I'm worried.

~~~
diminoten
That's an argument from authority, though. There's no direct evidence to
suggest that such a thing will happen, just that it hasn't been shown to be
impossible.

~~~
IvyMike
It's an article about a tangible step forward in solving DL problems.

Will RSA be broken because of this? Of course nobody knows.

But are the recent breakthroughs interesting "hacker news"? Absolutely. Would
you have really preferred not to hear about this?

Edit: As per tptacek's comment, changed "breaking RSA" to "solving DL
problems"

~~~
tptacek
I wouldn't go so far as to say a tangible step has been made in literally
breaking RSA.

------
beagle3
Anyone know if DLP for a group generated by a primitive polynomial over GF(2)
has had any advances in the last few years?

I haven't seen it referenced anywhere in the last 10 years, but it's a version
of DH that is much easier to implement in hardware than over integers, and the
DLP was (last I checked) believed to be at least as hard, and probably harder,
than the equivalent integer problem.
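For anyone who hasn't seen it, here's a rough sketch of Diffie-Hellman over GF(2^8) in Python (a toy field size picked purely for illustration; 0x11D is a standard primitive polynomial with x as a generator, and the multiply is nothing but shifts and XORs, which is the hardware appeal):

```python
POLY = 0x11D  # x^8 + x^4 + x^3 + x^2 + 1, primitive over GF(2)

def gf_mul(a, b):
    """Multiply in GF(2^8): shifts and XORs only -- cheap in hardware."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:  # reduce modulo the field polynomial
            a ^= POLY
        b >>= 1
    return r

def gf_pow(g, k):
    """Square-and-multiply exponentiation in the field."""
    r = 1
    while k:
        if k & 1:
            r = gf_mul(r, g)
        g = gf_mul(g, g)
        k >>= 1
    return r

g = 2          # x generates the full 255-element multiplicative group
a, b = 77, 201 # toy private exponents
A, B = gf_pow(g, a), gf_pow(g, b)
assert gf_pow(B, a) == gf_pow(A, b)  # both sides get the same secret
```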

~~~
pbsd
Yes, that's the kind of logarithm where the breakthroughs have happened, and
the source of all this fuss. Not all logarithms over these fields are affected
equally, though.

Do you know of any actual thing using Diffie-Hellman (or anything else) over
such fields?

------
madaxe
I'd say "switch to ECC", but the fact that the NSA are strong proponents of it
rather makes one wonder why.

~~~
betterunix
The NSA gives pretty good reasons for using ECC, and they are not alone in
supporting it. The cryptography research community is also very supportive of
ECC.

If that does not convince you or if you would prefer systems with security
reductions to worst-case NP-hard problems, you should look here:

[https://en.wikipedia.org/wiki/Learning_with_errors](https://en.wikipedia.org/wiki/Learning_with_errors)

There is a lot of theoretical excitement about LWE right now, not only for
public key encryption and signing systems but also for exotic things like
fully homomorphic encryption and attribute based encryption. Unfortunately,
there are costs that hamper practical deployment. The keys will be
larger. There are even more parameters to set, and bad choices can be fatal to
security. The security of LWE-based systems is not as well understood as that of ECC
(making parameter choices even more difficult). Widely used standards like TLS
and PGP do not have support for lattice / hidden codes systems. High-
performance implementations are still under development.
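As a rough illustration of what an LWE-based scheme looks like, here's a Regev-style bit encryption sketch in Python (toy parameters, nowhere near secure; the names and parameter choices are mine, picked so the noise bound makes decryption deterministic):

```python
import random
random.seed(1)

n, q, m = 10, 97, 20  # toy parameters -- far too small for security
s = [random.randrange(q) for _ in range(n)]  # secret vector

def sample():
    """One LWE sample: (a, <a, s> + e mod q) with small noise e."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])
    return a, (sum(x * y for x, y in zip(a, s)) + e) % q

pk = [sample() for _ in range(m)]  # "public key": a batch of samples

def enc(bit):
    # Sum a random half of the samples; hide the bit in the high part.
    subset = random.sample(pk, m // 2)
    c1 = [sum(col) % q for col in zip(*(a for a, _ in subset))]
    c2 = (sum(b for _, b in subset) + bit * (q // 2)) % q
    return c1, c2

def dec(c1, c2):
    v = (c2 - sum(x * y for x, y in zip(c1, s))) % q
    return 0 if min(v, q - v) < q // 4 else 1  # noise <= m//2 < q//4

for b in (0, 1, 0, 1):
    assert dec(*enc(b)) == b
```

The accumulated noise is at most m//2 = 10, comfortably below q//4 = 24, so the rounding in `dec` always lands on the right side; real parameter selection is exactly the hard part the parent comment is pointing at.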

~~~
madaxe
Aye, aware of LWE, and it's interesting stuff, and understand the theory as to
why ECC is secure, and why DH key exchanges are increasingly not so.

I'm just inherently suspicious of anything the NSA are in favour of, as their
focus seems to be on _breaking_ crypto, rather than recommending strong crypto
- to recommend a key exchange mechanism that they can't snoop on seems to be a
counterintuitive step.

~~~
betterunix
Actually, the NSA both breaks crypto and recommends crypto systems for
government use. Much of their ability to recommend cryptosystems comes from
their expertise in attacking cryptosystems.

This played out in a very interesting way with DES. Most of the theories about
an NSA conspiracy to weaken DES have been falsified. The changes to the s-box
structure were later discovered to strengthen the cipher against a certain
class of attacks. The small key size was later discovered to be right around
the actual security level the cipher provides (larger key sizes would not have
improved security by much).

In the case of public key crypto, one of the most important things we need is
to know what parameter sizes to use. Cryptanalysis is critical to making such
estimates, and once again the NSA's expertise comes in handy here. To put it
another way, if you knew nothing about factoring integers, 1024 bit RSA keys
would appear to be overkill -- the only reason key sizes have become so large
is because of GNFS and similar developments.

In general you should avoid assuming that large, sprawling agencies like the
NSA have a single goal. Yes the NSA conducts signals intelligence and would
prefer that those signals not be encrypted. On the other hand the NSA also
wants to ensure that foreign governments cannot spy on American government
communications. With the vast reliance on contractors to develop software for
sensitive systems there is a need for the NSA to make good recommendations to
the public (even at the risk of improving our opponents' security); it is a
classic NSA dilemma.

~~~
gizmo686
I also believe that the NSA intended to strengthen DES. However, it was later
discovered that the changes they made to the s-boxes also made it weak against
a different form of cryptanalysis.

~~~
tptacek
The NSA's changes to the DES s-boxes made them stronger against differential
cryptanalysis, an observation that betrayed the NSA's prior knowledge of
differential cryptanalysis long before it reached the academic literature.

It's been awhile since I've studied DES, but I'm not aware of any sense in
which the s-box changes weakened it.

~~~
gizmo686
Linear cryptanalysis. It's been a while since I've studied it as well, but
Applied Cryptography cites a successful attack that took only 50 days with 12
workstations (the book was published in 1996).

~~~
tptacek
DES resists linear cryptanalysis --- the best linear attack on DES is 2^43 and
requires 2^43 known plaintexts. Linear cryptanalysis of FEAL-8 takes just 4000
plaintexts.

According to Don Coppersmith, NSA picked DES's s-boxes by randomly generating
them and choosing the ones that best resisted differential cryptanalysis ---
again, this is something NSA did _fifteen years before differential
cryptanalysis was discovered by the public_. They modified DES to resist an
attack _only they knew about_.

------
kenster07
This isn't very different from saying P = NP will be proven within 5 years.
Sounds like linkbait.

~~~
tptacek
This is very different from saying P=NP. It would instead be to say that the
integer factorization or discrete log problems in the specific parameters used
by cryptosystems simply aren't as hard as we once thought they were.

RSA-128 is trivially breakable. Being able to break RSA-128 isn't a
demonstration that P=NP, either.
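For scale, "breaking" a toy RSA key by factoring takes a few lines of Python (trial division only works because the modulus is tiny; the point is just that factoring n hands you the private exponent directly):

```python
# Toy RSA keypair with the classic textbook numbers.
p, q, e = 61, 53, 17
n = p * q            # 3233 -- a ~12-bit "key"
M = 65
E = pow(M, e, n)     # ciphertext

# Attacker: factor n by trial division (feasible only at toy sizes).
f = next(k for k in range(2, n) if n % k == 0)
phi = (f - 1) * (n // f - 1)
d = pow(e, -1, phi)  # private exponent recovered; Python 3.8+
assert pow(E, d, n) == M
```

Real key sizes only push the factoring cost up; they don't change the shape of the attack.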

~~~
YZF
For RSA to be completely broken the attacker must be able to retrieve the
private key for any practical key length.

Some of what people are writing seems to imply more than attacks against
specific parameters but rather some theoretical breakthrough that would render
the scheme completely unusable. (e.g. factoring in polynomial time to key
length)

People have been moving to longer keys over time and thus simply being able to
attack a given key length isn't the same as saying the entire scheme is
broken. Today a 768 bit key is very difficult to attack. Is RSA with 64k-bit
keys likely to be broken within 5 years?

I guess in one sense our ability to work with much larger RSA keys has
progressed faster than the theoretical advances in factoring. Until the time
someone has some practical and provably strong cryptosystem we're kind of
always at risk. ECC could also be broken so why should I feel more comfortable
using ECC over RSA with 64k bit keys?

~~~
tptacek
No competent engineer in the world would give a green light to 768 bit RSA
keys.

This article was based on a presentation I was marginally involved with, and
I'm telling you that the assertions the talk was making were very much key
length dependent. An event that forces a migration from 1024 bit RSA keys
would be painful and dramatic.

~~~
YZF
It also bugged me that different classes of attacks were grouped together to
build a "case" against RSA.

Looking at the slides again:

"There is a small but real chance that both RSA and non-ECC DH will soon
become unusable"

"key sizes may have to [go] up to 16kbit" - "wildly impractical" (WHY???)

L(1/4) isn't quite enough to make RSA unusable with large enough keys. I ran
the numbers for 64k but you don't need to go as large.
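For what it's worth, the back-of-envelope looks like this in Python (the constant c for a hypothetical general-purpose L(1/4) algorithm is unknown, so reusing 1.923, the GNFS constant, is purely an assumption):

```python
import math

def log2_cost(bits, a, c):
    """log2 of L_n[a, c] = exp(c * (ln n)^a * (ln ln n)^(1-a)) for an n-bit modulus."""
    ln_n = bits * math.log(2)
    return c * ln_n**a * math.log(ln_n)**(1 - a) / math.log(2)

C = 1.923  # GNFS constant; reusing it for a = 1/4 is an assumption
for bits in (1024, 2048, 16384, 65536):
    print(bits, round(log2_cost(bits, 1/3, C)), round(log2_cost(bits, 1/4, C)))
```

Even under a hypothetical L(1/4) algorithm the cost climbs back up with key size, which is why large enough keys stay out of reach; the dispute is only over whether 16-kbit keys are tolerable in practice.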

It's not clear that the sort of linear progression presented is real. Some
math problems don't see any progress for a long time and some see big progress
rapidly. Seems to me like it's trying to find a pattern where there isn't
necessarily one.

To conclude. RSA may fall tomorrow. ECC may fall tomorrow. AES may fall
tomorrow. I agree with the principle that we need some agility built into
cryptosystems (though if one falls we're kind of screwed). Maybe we need to
_combine_ cryptosystems such that one breaking won't take us down. Added
complexity and changes create attack opportunities (at least current
implementations are battle hardened).

~~~
tptacek
Are you asking why a 16384 bit RSA key is considered wildly impractical? I
assume you understand that using RSA-16384 is silly given the alternatives.

