
NSA infiltrated RSA security more deeply than thought: Study - Libertatea
http://www.reuters.com/article/2014/03/31/uk-usa-security-nsa-rsa-idUKBREA2U0U620140331
======
tptacek
The extended_random_value extension is by itself innocuous; in fact, in its
historical context (2008), it's just on the edge of plausibly beneficial.

All extended_random_value does is add, to the random_bytes value in the TLS
handshake, another variable-length blob of random data. The ostensible idea
behind the extension is that there are protocols that want randomness
proportional to key sizes, and the 28 random bytes already in the handshake
are insufficient.
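The 32-byte value being extended here can be sketched in a few lines (a hand-rolled illustration of the RFC 5246 layout, not a real TLS implementation):

```python
import os
import struct
import time

# The standard TLS "Random" value: a 4-byte gmt_unix_time followed by
# 28 randomly generated bytes. This is all the public randomness a
# normal handshake exposes per side.
def make_tls_random() -> bytes:
    gmt_unix_time = int(time.time()) & 0xFFFFFFFF
    return struct.pack(">I", gmt_unix_time) + os.urandom(28)

print(len(make_tls_random()))  # 32
```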

That is _all_ the extension does. It doesn't influence your random number
generator, it doesn't change the crypto algorithms in TLS.

Now, extended_random_value is _not_ innocuous in 2014. There's a very good
chance that the PKRNGs (Dual_EC and the like) are intended as cryptographic
backdoors, and modern sensibilities about crypto protocols dictate that both
sides of a protocol should be careful about "showing all their cards" (as it
were) when it comes to random number generators. It's now thought to be risky
to disclose too much state.

But that was definitely not the attitude in place in the early 2000s --- more
randomness was better.

Finally, it's worth knowing that nobody uses extended_random_value. Even fewer
people were exposed to it than were exposed to Dual_EC (which very few apps
were --- so few that, from what I can tell, nobody has ever named such an app
on HN [companies using BSafe do not equate to apps using Dual_EC, much less
exploitably]). Even the Internet Draft seems quizzical about why the extension
would exist, and it more or less says "you'd use this if your TLS needed to be
compatible with the DoD".

PS: It's been a few months since I looked, but interesting to note: the
mailing list discussions about extended_random_value were also skeptical about
the intention behind the extension. Rescorla more or less said (IIRC) "the USG
asked me to specify this so they could use TLS". Nobody on the TLS WG ever
told an implementor that they should adopt this.†

PPS: Whatever 'pbsd says that conflicts with this comment, believe the 'pbsd
comment, not mine. :)

† _I'm a little wrong here; the mailing list discussion I'm thinking of is
about OpaquePRF. But OpaquePRF is basically the same concept, and by the same
authors; extended_random_value is a refinement. Neither was standardized._

~~~
acqq
The draft (the last version I've found is March 2009):

[http://tools.ietf.org/html/draft-rescorla-tls-extended-random-02](http://tools.ietf.org/html/draft-rescorla-tls-extended-random-02)

(The first seems to be "Opaque PRF Inputs for TLS" from December 2006:
[http://tools.ietf.org/html/draft-rescorla-tls-opaque-prf-input-00](http://tools.ietf.org/html/draft-rescorla-tls-opaque-prf-input-00)
so the process lasted more than two years.)

The relevant parts from the last:

    
    
       TLS [I-D.ietf-tls-rfc4346-bis] and DTLS [RFC4347] use a 32-byte
       "Random" value consisting of a 32-bit time value time and 28 randomly
       generated bytes:
    
             struct {
                uint32 gmt_unix_time;
                opaque random_bytes[28];
             } Random;
    
       The United States Department of Defense has requested a TLS mode
       which allows the use of longer public randomness values for use with
       high security level cipher suites like those specified in Suite B
       [I-D.rescorla-tls-suiteb].
    
       This document defines a new TLS extension called "extended_random".
    
       The "extended_random" extension carried in a new TLS extension called
       "ExtendedRandom".
    
            struct {
                opaque extended_random_value<0..2^16-1>;
            } ExtendedRandom;
    

Note how they used notation that casually reads as small, as in "there's a 2
and a 16 and a 1," when they actually mean up to 65535 bytes (I first thought
the length was fixed, but it looks like it's subject to negotiation).

Roughly 2300 times more than the default 28 bytes.
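For the record, the arithmetic (using the draft's stated upper bound, not an observed deployment size):

```python
# Sizes discussed above: the draft's <0..2^16-1> length bound versus the
# 28 random bytes in the standard handshake Random value.
default_random_bytes = 28
max_extended_bytes = 2**16 - 1

print(max_extended_bytes)                          # 65535
print(max_extended_bytes // default_random_bytes)  # 2340, "roughly 2300 times"
```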

TPtacek says: "It's now thought to be risky to disclose too much state. But
that was definitely not the attitude in place in the early 2000s --- more
randomness was better."

I can't imagine that there was ever an attitude that it's not risky "to
disclose too much state."

~~~
tptacek
I'm not sure what you're trying to say here that my comment doesn't already
say.

It might be helpful to note that there are crypto people --- some of the
authors on the recent Dual_EC exploit paper included --- that believe TLS
already discloses too much random state, even without the TLS extension.

I think it's uncontroversial in 2014 to say that _any_ more state disclosure
--- 2 bytes, 200 bytes, 16k bytes --- is probably not a good thing.

It might be a little controversial to say this next thing, but I'll stand by
it for the sake of an interesting argument: in 2005, when this discussion
started, it was _not_ known to be a bad thing to increase the amount of random
state in the handshake.

~~~
acqq
As Thomas is not sure what I'm "trying to say here that" his "comment doesn't
already say," he actually wrote: "All extended_random_value does is add, to
the random_bytes value in the TLS handshake, another variable-length blob of
random data."

It sounds _smallish_, "just another random-data blob." It's actually _huge_,
"up to an additional 65535 bytes from the generator, added to the default 28,"
before the encryption even starts.

Thomas also wrote "The ostensible idea behind the extension is that there are
protocols that want randomness proportional to key sizes."

The biggest ECC key is 571 bits, which fits in just 72 bytes, and this thing
provides up to 65535 bytes. Again, a completely different impression. I quoted
the draft to show the actual number. The numbers matter here. Had I not looked
at the draft directly, I'd have gotten a completely different impression from
reading only Thomas' comment.

~~~
tptacek
Incidentally, here are the original requirements that these drafts were
intended to satisfy:

    
    
       In a number of United States Government applications, 
       it is desirable to have some material with the following 
       properties:
    
       1.  It is contributed both by client and server.
       2.  It is arbitrary-length.
       3.  It is mixed into the eventual keying material.
       4.  It is structured and decodable by the receiving party.
    

_(Amusingly, this comment got downvoted.)_

~~~
acqq
That is what they appeared to satisfy then (2006-2009). We know now that, if
implemented, it would allow them almost real-time or even real-time decryption
of the communication of whoever uses it, as we discussed here:

[https://news.ycombinator.com/item?id=7503287](https://news.ycombinator.com/item?id=7503287)

We also know extended_random_value is _actually implemented in RSA's BSAFE_
library, as the paper
[http://dualec.org/DualECTLS.pdf](http://dualec.org/DualECTLS.pdf) shows.

And we know
[http://security.stackexchange.com/questions/43164/which-products-are-affected-by-nsas-ability-to-crack-pseudo-random-number-gener](http://security.stackexchange.com/questions/43164/which-products-are-affected-by-nsas-ability-to-crack-pseudo-random-number-gener)

"the RSA BSAFE library uses Dual_EC_DRBG (...) by default (...)

I can easily find (hint: use your favourite search engine to search for the
terms "This product includes" "RSA BSAFE") implementations, oddly skewed
towards imaging and gaming devices: surprisingly many printer/copier/fax
devices use BSAFE, though for unknown purposes.

Including Ricoh, Minolta, Océ/Canon, Brother, Fuji/Xerox, Epson ...

Your Playstation (PDF), PSP, or your Nintendo DS wifi (PDF)

Software from Adobe, Hitachi, Oracle and HP Some Nokia phones(PDF)"

~~~
tptacek
Which of those products do you know to actually be using Dual_EC at all?

~~~
acqq
At least, every product which uses defaults of the library? You know, "leave
the hard choices to the experts who know more."

~~~
tptacek
Can we name _one_ of them that uses Dual_EC, and has been shown to do so? I
know a little about BSafe, and my impression is that most programs that use it
don't use it to get TLS, which their OS already provides. Feel free to correct
me if you've got more BSafe experience than I do.

This is obviously a rehash of an argument that recurs on HN a lot, but just to
be fair to what I'm trying to say here: that argument has (as far as I can
tell) never been resolved.

~~~
acqq
So who, apart from spies from other countries and TPtacek, needs a more
precise list of the possibly vulnerable products than the one already here? I
don't think many people see the benefit in making the list more precise. The
companies mentioned have every incentive to remain silent.

~~~
tptacek
All things being equal I'd rather know more than less.

Remember, I'm not asking how many vendors keep the BSAFE default. I'm asking
how many use BSAFE in a way where the RNG matters.

~~~
acqq
Isn't your company's business to discover security flaws? I believe the fact
that you don't know the details means that there isn't much interest, at least
in your part of the world, in funding such research. There are no incentives
for academia to do it either. All that doesn't mean the list doesn't exist,
only that there is no economic reason to publish it.

~~~
tptacek
No. My company provides an engineering service for vendors to find security
flaws in specific pieces of software before (and sometimes after) that
software ships. You should think of us as part of the development process.

If you gave us a piece of software that used BSAFE, you could reasonably
expect detailed mitigation advice on not having a PKRNG exposed in your
system.

No process in the market compensates us for acquiring a Ricoh printer and
digging into its firmware to find how and why it's using BSAFE. It's not an
especially interesting research project either.

I'm not surprised we don't know which products actually use Dual_EC, and I
don't find the fact that we don't particularly telling. I'm saying something
different: that I have worked with software that used BSAFE, and it used BSAFE
for idiosyncratic reasons that didn't much implicate Dual_EC. I would not, for
instance, be at all surprised to find an embedded device that used BSAFE for
some goofy thing, but used OpenSSL as its TLS library.

~~~
rdl
The only "process in the market" for this kind of work is individuals or
companies breaking random things in public to build their reputation as
auditors (e.g. homokov), or selling vulns. But as you point out it's only
really awesome when you follow the Cryptography Research model: find a new
class of attack, develop the countermeasure, package (potentially patent) the
countermeasure, then make scary demos of the attack to get responsible parties
to buy your countermeasure.

In this case the mitigation is so simple ("don't use PKRNG; avoid RSA, Inc.
going forward.") that it's not monetizable, so it's really only for reputation
points.

------
ig1
A far bigger issue is "The NSA played a significant role in the origins of
Extended Random. The authors of the 2008 paper on the protocol were Margaret
Salter, technical director of the NSA's defensive Information Assurance
Directorate, and an outside expert named Eric Rescorla."

Given Rescorla's involvement with the TLS standard and Mozilla, it's
concerning that he declined to comment on this.

~~~
doe88
I'm maybe too naive/optimistic, but I think the most likely explanation is
that he was manipulated into co-authoring a document which by itself is
innocuous but, once coupled with a rigged PRNG, might have disastrous effects.

~~~
ig1
Sure, he might be innocent, but he really needs to put out a statement
clarifying what his involvement was and what he knew.

Mozilla should probably hire an independent auditor to review all of his code
changes. It might be unnecessary but this seems to be a scenario where it's
better to play it safe (plus always good to audit security related code).

~~~
Consultant32452
>Sure, he might be innocent, but he really needs to put out a statement
clarifying what his involvement was and what he knew.

Why does he need to do that? What benefit will it be to him? If he said he was
unaware of any underhandedness would you believe him? Do you think that if he
was aware that he wouldn't have signed NDAs about these things?

------
jusben1369
I remember when this first broke getting heavily downvoted for suggesting that
RSA wasn't paid $10 million to create a backdoor for their customers, as
everyone was implying at the time. Rather, they (naively) were paid to jointly
develop a technology that would be a marketing dream: "Hey, this included help
from the brightest minds at the NSA, so why wouldn't you use it for your own
organization?!"

So I found this quote to be very interesting: "We could have been more
sceptical of NSA's intentions," RSA Chief Technologist Sam Curry told Reuters.
"We trusted them because they are charged with security for the U.S.
government and U.S. critical infrastructure."

~~~
hga
"We trusted them because they are charged with security for the U.S.
government and U.S. critical infrastructure."

Which is true, and as examples we have SELinux and, apparently, the
strengthening of the S-boxes of DES against differential cryptanalysis, which
had not yet been independently discovered by public researchers
([https://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27s_involvement_in_the_design](https://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27s_involvement_in_the_design)).

However, that was then; now, we legitimately trust the NSA a lot less, but I'm
not sure we should blame RSA too much for not noticing the transition.

I'd also note that I arrived at MIT a couple of years after RSA was first
published, and I heard various rumors that there were rather intense _sub
rosa_ interactions between the team and the NSA et al. in that period. With a
good outcome, but threats were made, etc.

~~~
akira2501
I've brought up the S-Box strengthening several times before, but it's
important to note that the NSA pushed IBM to reduce the key length in DES. IBM
wanted 64 bits, NSA wanted 48, and we all got 56 as a compromise; not entirely
egalitarian work.

~~~
marshray
What I've been told is that 56 bits is pretty close to its actual effective
security in the presence of differential cryptanalysis. So the key length was
'right sized' and it ended up having truth-in-labeling after all.

------
dan_bk
_" We could have been more sceptical of NSA's intentions," RSA Chief
Technologist Sam Curry told Reuters. "We trusted them because they are charged
with security for the U.S. government and U.S. critical infrastructure."_

PS: "We also would like to express that we believe the entire World to be a
bunch of idiots."

------
pbsd
The actual analysis is available here:
[http://dualec.org/](http://dualec.org/)

I don't see much that is cryptographically new here; the authors stretched
what is a simple attack into 17 pages, and noticed that some non-standard TLS
extensions that increase the amount of random data on the wire make the attack
easier.

~~~
mcphilip
One of the authors of that dualec paper had this to say [1]:

>Our results <dualec.org> say nothing about whether NIST, the NSA, or anyone
else inserted a backdoor into dual ec. Instead, they say that if you know the
relationship between P and Q, then some TLS implementations can be broken at a
given cost. The precise results are more nuanced than that. You can read the
website for a summary or the paper for the details.

[1] [http://www.ietf.org/mail-archive/web/tls/current/msg11730.html](http://www.ietf.org/mail-archive/web/tls/current/msg11730.html)

------
forgotAgain
The NSA is now actually making US infrastructure less secure. Consider that
many companies will forgo association with the most advanced security agency
in the country. The country has placed its best security resources in one pot
and then spoiled it.

They're a resource that could have continued to provide valuable assistance.
Instead, due solely to their organizational egoism, the NSA will be shunned
and their advice dismissed as suspect.

~~~
nullc
Well, it's a little more complicated than that: other than the uniformity
problems, Dual_EC_DRBG is a secure PRNG … against anyone who doesn't know the
matching discrete log for the parameters.

If you don't consider a cryptographically locked, single-party backdoor to be
"less secure," then I don't think we can point to where they've done something
to make things less secure.
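That "single party" property can be illustrated with a toy analog of the Dual_EC structure. This is emphatically not the real algorithm (scalar multiplication on the curve is replaced by multiplication mod a prime, and every constant below is made up), but it shows why knowing the secret relationship P = d*Q turns a single output into a full state recovery:

```python
# Toy analog of the Dual_EC_DRBG backdoor structure. Only the *shape*
# of the attack is faithful; the arithmetic is a stand-in.
p = 2**31 - 1     # stand-in for the curve's field
Q = 48271         # public "point"
d = 1234567       # secret scalar known only to whoever chose the constants
P = (d * Q) % p   # published alongside Q; d itself is never revealed

def step(state):
    """One generator step: emit an output, advance the internal state."""
    output = (state * Q) % p   # analog of x(s*Q)
    state = (state * P) % p    # analog of x(s*P)
    return output, state

s = 987654321      # honest user's secret seed
out1, s = step(s)
out2, s = step(s)

# An attacker who knows d recovers the next state from one output alone:
#   next_state = s*P = s*(d*Q) = d*(s*Q) = d*out1  (mod p)
recovered = (d * out1) % p
predicted_out2 = (recovered * Q) % p
print(predicted_out2 == out2)  # True: one output predicts all future ones
```

Anyone without d faces a discrete-log problem to exploit the same structure, which is the sense in which the backdoor stays in one party's control.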

------
acd
Most likely also Vasco security tokens, since the NSA owned DigiNotar, which
Vasco owns.

CA is a flawed security model with governments wanting to read all your data.
It's far too easy to forge a CA, either by running one covertly or by hacking
into one.

One subset of users will get man in the middle certificates and interception.

The other set will get normal certificates.

How many really check the certificate fingerprints and who issued the
certificate?

~~~
marshray
I thought the evidence was pretty clear that Diginotar was pwned by the
Iranians.

Those certs did end up on the .ir national ISP's firewall, doing
man-in-the-middle attacks on Gmail users within Iran.

------
einhverfr
It seems to me we are seeing a variant of the same problem that Ken Thompson
articulated in his paper "Reflections on Trusting Trust."[1]

While not a compiler intercept of the sort Thompson is talking about, this is
effectively a library intercept of critical crypto functions, and it poses the
same basic problem. I don't think the industry will ever be the same. I think
more and more folks are going to demand source-level access to everything
crypto-related, and cryptanalysts are going to put a lot more effort into
scrutinizing these things.

The NSA failed to heed the basic warning of the Greek drama: ὕβρις-ἄτη-Νέμεσις

Or in more familiar terms, "Pride goeth before a fall."

[1] [http://cm.bell-labs.com/who/ken/trust.html](http://cm.bell-labs.com/who/ken/trust.html)

~~~
acqq
I just had too look up the Greek words:

hubris [ὕβρις] ‘outrage’; the opposite of dikē [δίκη]

dikē [δίκη], plural dikai [δίκαι] ‘judgment (short-range); justice (long-
range)’; dikaios [δίκαιος] ‘just’

atē [ἄτη], plural atai [ἆται] ‘aberration, derangement, veering off-course;
disaster; punishment for disaster’

nemesis [νέμεσις] indicates the process whereby everyone gets what he or she
deserves

~~~
einhverfr
In other words, the basic formula of Greek tragedy is the hubris or outrageous
pride overwhelming judgement, leading to deserved disaster. There's an
additional subtext btw of exceeding prudent limits in "hubris."[1]

The thing here is that the NSA has acted out of hubris and exceeded prudent
limits, and the result will be predictable. The deserved disaster will come
from far more routine use of encryption, and this will deny legitimate law
enforcement investigations many critical tools. While this is a cost that to
some extent will be borne by everyone, it will particularly hit the NSA and
law enforcement really hard.

[1] See "From Religion to Philosophy" by F. M. Cornford, 1914. Some parts are
dated, but the general discussion of partition and nomos/nemesis is very well
done.

~~~
acqq
Thanks for the book recommendation!

> The deserved disaster will come

It's still hard for me to see that. I don't know anyone who has started to
encrypt more. And the companies will always have to fulfill the requests of
the state:

[http://www.theatlantic.com/technology/archive/2014/03/don-t-listen-to-google-and-facebook-the-public-private-surveillance-partnership-is-still-going-strong/284612/](http://www.theatlantic.com/technology/archive/2014/03/don-t-listen-to-google-and-facebook-the-public-private-surveillance-partnership-is-still-going-strong/284612/)

There's still a lot to be done to induce significant change.

~~~
einhverfr
> It's still hard for me to see that.

If nothing else, criminals will take encryption far more seriously, which is
where the real cost comes in from a LEO perspective.

> I don't know anyone who has started to encrypt more.

I have started to encrypt more, and new businesses are springing up with more
encryption built-in. It's also something which has impacted Efficito's
approach to security to some extent.

> And the companies will always have to fulfill the requests of the state

Sure. However, as of now, there is no requirement that they be able to fulfill
the requests, so the current struggle is really with coming up with systems
where the administrator/cloud provider has no access. This will lead to
predictable results, and the next battle will be over whether to require such
providers to retain access. To my mind, that's the legal battle we must
prepare for.

------
Zigurd
> _" director of the NSA's defensive Information Assurance Directorate"_

Now how many of you think dividing the NSA into defensive and offensive parts
will do anything to restore trust?

~~~
schoen
One main argument for doing so has been that right now IAD has a conflict of
interest.

~~~
mpyne
Except that it doesn't; NSA has a conflict of interest, but they're already
split internally.

Even if you split them into separate agencies they're still going to _both_ be
part of the USG so while that might reduce the risk of a conflict of interest
somewhat, it wouldn't eliminate it.

After all, think of the existing reaction we get when _actually independent_
agencies of the USG throw out comments in support of NSA's signals
intelligence. If the NSC, Congress's intelligence subcommittees, Executive
Office of the President, Department of Justice, various Inspectors-General,
Dept. of Defense, FBI, and, oh yeah, _the courts_ can all be looped into what
hacktivists claim is a vast conspiracy to violate the Constitution, then what
defense would splitting IAD out really be?

~~~
Zigurd
So then, we should just accept what NSA is doing?

~~~
einhverfr
I think that accepting what they are doing is the first step to fighting it.

------
linuxhansl
> The system, called Dual Elliptic Curve, was a random number generator, but
> it had a deliberate flaw - or "back door" \- that allowed the NSA to crack
> the encryption

I would very clearly draw the line at the point where the NSA is deliberately
weakening national security by undermining widely used cryptography. Companies
or even other government agencies are now easier targets because of the doings
of the NSA.

~~~
mpyne
> I would very clearly draw the line at the point where the NSA is
> deliberately weakening national security by undermining widely used
> cryptography

There was no "threat to national security" here. Unlike what we think of as a
normal backdoor, this one is and remains completely in NSA's control, as long
as they're the only ones with the private key underlying the Dual EC
constants. So even knowing there's a backdoor in there somewhere, there's no
way for other security agencies or hackers to break Dual EC (assuming, of
course, that it's properly implemented, which is _not_ a given, and NSA's
changes may have made it harder to implement).

------
jgalt212
What's the best viable alternative to RSA SecurID?

~~~
cdjk
Probably something like OATH (as used by Google Authenticator and similar
software) or a YubiKey. With the YubiKey you can even get your own HSM for the
authentication server.
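For reference, the OATH algorithm behind Google Authenticator is TOTP (RFC 6238), which is small enough to sketch with the standard library alone. A real deployment would add base32 secret handling, a clock-skew window, and rate limiting:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6) -> str:
    """Compute an RFC 6238 TOTP code (HMAC-SHA1 variant)."""
    t = time.time() if for_time is None else for_time
    counter = int(t // step)               # 30-second time window
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret, time = 59s -> "94287082".
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

The whole secret lives on both the token and the server, which is exactly why the HSM option for the authentication server matters.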

~~~
jgalt212
Thanks. I had heard about the YubiKey; will definitely check it out.

~~~
oggy
Might be worth noting that the YubiKey authors did mess up quite badly at one
point in time:

[http://www.lsv.ens-cachan.fr/Publis/PAPERS/PDF/KS-stm12.pdf](http://www.lsv.ens-cachan.fr/Publis/PAPERS/PDF/KS-stm12.pdf)

------
cordite
Is there an elliptic curve that is trustworthy?

~~~
zorlem
To answer that you will have to define "trustworthy". I'm personally leaning
toward trusting the curves proposed by Dan Bernstein [1], since he clearly
explains the reasons for choosing the specific parameters and they're
demonstrably valid.

[1]: [http://cr.yp.to/ecdh.html](http://cr.yp.to/ecdh.html)

------
higherpurpose
Does the word "NSA" still carry a 60 percent upvote penalty here? The story
seems to be moving up pretty slowly. I understand using that sort of penalty
if the site gets flooded with those stories during a period of time, but I
don't think keeping the penalty forever is such a good idea, or at least not
such a big one. Maybe make it "decay" over a period of time.

~~~
tokenadult
What is your documentation for saying that a particular word in a headline has
any penalty at all?

AFTER EDIT: I guess I'm glad I asked, downvotes and all, because I see the
link is to a source off HN, not to a statement from any of the HN moderators
(which is what I was checking).

~~~
Crito
Scratch HN moderators, here it is straight from the horse's mouth:

> "So already for the past week TSA stories have had an automatic penalty
> applied. Or more precisly, they've been _autotagged as being political,
> which entails a penalty_."

> "When anything gets over a certain number of flags, it shows up on a list
> that _admins_ see. They decide either to kill it, _mark it as political_ or
> whatever, or do nothing."

[https://news.ycombinator.com/item?id=1934950](https://news.ycombinator.com/item?id=1934950)

So, at least as of 3 years ago, HN both "autotags" and has admin-driven
tagging. Penalties are given to posts that have received particular tags.

Strictly speaking, "words in titles" may or may not be involved in that
"autotagging", but given PG's history with content classification... I suspect
it is.

