
NSA Director Says Agency Shares Vast Majority of Bugs It Finds - randomname2
http://threatpost.com/nsa-director-says-agency-shares-vast-majority-of-bugs-it-finds/109170
======
jgrahamc
In the video of that discussion[1] he also says that the NSA developed the
Heartbleed patch after hearing about the vulnerability on April 7 and shared
it with the private sector on April 8.

Interesting to compare with timeline: [http://www.smh.com.au/it-pro/security-
it/heartbleed-disclosu...](http://www.smh.com.au/it-pro/security-
it/heartbleed-disclosure-timeline-who-knew-what-and-when-20140415-zqurk.html)

[1]
[https://www.youtube.com/watch?v=yhwy2ZWi_y8](https://www.youtube.com/watch?v=yhwy2ZWi_y8)

~~~
scintill76
So, this is going to sound like I'm determined to find a reason to hate the
NSA, but... this doesn't make them look good either. It's the most accessible
and widely-deployed memory disclosure bug of recent years, if not ever. Surely
there were at least 1,000 then-vulnerable servers they'd specifically
love to have this window into, for intelligence on the "bad guys." Surely they
know the "bad guys" would love to look into and attack American servers the
same way. Exploiting and defending this kind of thing is exactly what the NSA
is supposedly for, but they're telling us they didn't know about it.

I guess all their good talent and money was tied up in domestic call metadata
social graph analysis. But at least they had someone smart enough to do the
trivial patch once the vulnerability was known![0]
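
For what it's worth, the Heartbleed fix really was trivial: the bug is an attacker-supplied length field trusted when echoing a heartbeat, and the patch is a one-line bounds check. A toy Python sketch of the bug class (illustrative only; the names and memory layout are made up, not OpenSSL's actual code):

```python
# Toy model of the Heartbleed bug class (CVE-2014-0160): trusting an
# attacker-supplied length field when echoing a heartbeat payload.
# Illustrative only; names and layout are made up, not OpenSSL's code.

MEMORY = b"ping" + b"SECRET_PRIVATE_KEY_MATERIAL"  # payload + adjacent "heap"

def heartbeat_vulnerable(payload, claimed_len):
    # Bug: echoes claimed_len bytes of memory, never checking len(payload).
    return MEMORY[:claimed_len]

def heartbeat_patched(payload, claimed_len):
    # The fix: drop the request when the claimed length exceeds the real one.
    if claimed_len > len(payload):
        return None
    return payload[:claimed_len]

leak = heartbeat_vulnerable(b"ping", 31)
assert b"SECRET" in leak                        # over-read leaks adjacent data
assert heartbeat_patched(b"ping", 31) is None   # patched: request discarded
assert heartbeat_patched(b"ping", 4) == b"ping" # honest requests still work
```

In the real protocol the length field is 16 bits, so each malicious request could leak up to 64KB of process memory.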

P.S. I also notice they didn't mention "Shellshock". Why not?

[0]
[https://twitter.com/agl__/status/530004568784916480](https://twitter.com/agl__/status/530004568784916480)

~~~
sliverstorm
So your argument is "If the NSA is any good, they _must_ know about every
vulnerability in the world before the private sector- otherwise they are
incompetent"?

~~~
venomsnake
With the amount of money they spend, I would suggest that the US taxpayer
should get some value out of it.

So while not every vulnerability - at least a lot of them.

~~~
scintill76
Right. I agree it's not reasonable to expect any one entity to know every
possible vulnerability in the world. But with all the money and brains they
have, how huge this particular target was (for both defensive and offensive
purposes), and how high the stakes are (as they remind us whenever we
challenge them), you'd think they'd have static code analysis flagging these
kinds of things the moment they're checked into a public repository of any
notoriety, fuzzers running 24/7, "shadow" code reviews of things like OpenSSL
by their best hackers, etc. At that point it almost does become incompetence
that they either don't have that or that it failed. At least, I'm left
wondering what all that money is for.
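
Fuzzing at that scale isn't exotic, either; a bare-bones mutation fuzzer is a handful of lines. The sketch below is hypothetical: `parse_record` is a made-up parser with a planted missing length check (the Heartbleed pattern), not real NSA or OpenSSL tooling.

```python
import random

def parse_record(data):
    claimed = data[0]
    body = data[1:]
    # Planted bug: no check that claimed <= len(body) before indexing,
    # so a record claiming more bytes than it carries crashes the parser.
    return body[claimed - 1]

def mutate(seed, rng):
    # Flip 1-3 random bytes of a known-good input.
    data = bytearray(seed)
    for _ in range(rng.randint(1, 3)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

rng = random.Random(0)               # fixed seed: reproducible run
seed = bytes([4]) + b"ping"          # well-formed record: length byte, 4-byte body
crashes = []
for _ in range(2000):
    case = mutate(seed, rng)
    try:
        parse_record(case)
    except IndexError:
        crashes.append(case)

assert crashes                         # the missing length check gets found
assert all(c[0] > 4 for c in crashes)  # every crasher over-claims its length
```

Real fuzzers (AFL, libFuzzer) add coverage feedback and smarter mutation, but the core loop is exactly this.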

No, it doesn't really prove incompetence, but on the other hand "we released a
trivial patch for an issue the private sector already found and fixed"*
doesn't prove competence either, or inspire any sympathy from me. These are
the types of people who trot out imagery like "cyber Pearl Harbor" when
they're telling us why they need to be granted more power to protect us, yet
their go-to public example of fixing a critical vulnerability in the national
digital infrastructure is laughably irrelevant.

* Depending on what time of day it was, the official fix may or may not have been publicly released when NSA did whatever they're claiming they did.

------
c0achmcguirk
I've talked to some employees of TrueSec (the firm that just discovered the
Yosemite "rootpipe" exploit), and they told me that many security research
firms are "on retainer" with intelligence agencies. They didn't share
specifics, but they said they have friends who are paid to share exploits
with, say, British intelligence and _not_ to report them to the vendor whose
product contains the flaw.

I assume this is why Stuxnet had so many zero-day exploits in it. The agencies
behind it had security firms feeding them.

[http://www.wired.com/2014/11/countdown-to-zero-day-
stuxnet/](http://www.wired.com/2014/11/countdown-to-zero-day-stuxnet/)

~~~
xnull2guest
Yes.

The market for 0days has been cooling off in recent years, but for a good
decade there you could sell 0days, even mediocre ones, for six digits. Nowadays
you'll need a pretty good vuln for six digits, and something pretty stellar
for seven (which isn't unheard of). ZDI, FrSIRT and others got into the game as
middlemen. They allow(ed) you to not know who the final purchaser was, and would
let you sell 0days that may or may not be interesting to a government
entity - in that case ZDI, etc. would swallow the cost.

Edit: [http://www.vupen.com/english/services/lea-
index.php](http://www.vupen.com/english/services/lea-index.php)

~~~
tptacek
What is the least interesting vulnerability whose sale you have firsthand
knowledge of that fetched more than $20,000?

Have you personally ever sold a vulnerability?

~~~
grugq
FWIW I don't agree with the assessment of the OP. While there are some
security firms that do have contracts, the vast majority of NSA capability is
internally developed (or developed under contract by defence contractors).

As for the "market assessment" I find it implausible. It seems to be based on
the assumption that the demand for capabilities has decreased over time while
the availability of good bugs has increased. This is at odds with reality.

~~~
tptacek
I believe you on this a lot more than I believe anonymous employees of a firm
known principally for giving a brand name to an Admin->Root privilege
escalation bug.

------
nullc
Then where are the fixes? They spend untold billions per year... their visible
bugfix output is very low, including in low-level cryptographic domains in
which you might expect them to be powerhouses.

The claim makes me think either that they're lying about sharing what they
find (either intentionally or via institutional stupidity); or they're really
inept and not finding much at all compared to much less well funded OSS
developers and participants in industry.

It would be interesting for someone to set up a scorecard site documenting the
NSA's infosec contributions.

~~~
tptacek
Bug volume in crypto is also very low, and the "fixes" to major crypto bugs
tend to take the form of entirely new constructions... which users are not
happy to get from NSA (this was a problem even in the 1970s!)

So I'm not sure this is a valid critique.

~~~
xnull
Bug volume in crypto is extremely high. How many developers reuse IVs in
stream ciphers? How many blindly use AES or some such symmetric primitive
and then build in no authentication whatsoever? How many antiquated
implementations of RSA are used in practice today (see the recent
Bleichenbacher flaw in NSS)? How many times are poor chaining modes for block
ciphers chosen? How many implementations of [anything] fail on edge cases
(elliptic curves) or massively leak through side channels? How many DH-family
protocols miss checks for identity inputs?

The answer is a lot.
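
The first item on that list is easy to demonstrate concretely: encrypt two messages under the same key and nonce with any stream cipher, and XORing the ciphertexts cancels the keystream, leaking the XOR of the plaintexts. A toy sketch, using a hash-based keystream as a stand-in for a real cipher (the flaw is identical for RC4, AES-CTR, ChaCha20, etc.):

```python
import hashlib

def keystream(key, nonce, n):
    # Toy keystream: SHA-256 in counter mode, standing in for a real
    # stream cipher. The nonce-reuse flaw shown below is cipher-agnostic.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(key, nonce, plaintext):
    return xor(plaintext, keystream(key, nonce, len(plaintext)))

key = b"k" * 16
nonce = b"fixed-nonce"                 # the bug: same nonce for both messages
p1, p2 = b"attack at dawn!", b"defend at dusk!"
c1, c2 = encrypt(key, nonce, p1), encrypt(key, nonce, p2)

# Keystreams cancel: ciphertext XOR leaks plaintext XOR...
assert xor(c1, c2) == xor(p1, p2)
# ...so knowing (or guessing) one plaintext recovers the other outright.
assert xor(xor(c1, c2), p1) == p2
```

With natural-language plaintexts, even without a known plaintext, crib-dragging across the XOR recovers both messages.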

~~~
tptacek
You and I mean different things by "crypto vulnerabilities". I took the parent
comment to mean things like the RC4 biases; like I said, things for which the
"fix" would involve entirely new algorithms or constructions. An example of
this kind of NSA disclosure would be the DES s-boxes.

Crypto software implementation vulnerabilities are very common, but the kinds
of things you're talking about are most often found in obscure and/or
server-side software. Look at the tempo at which bugs like the NSS e=3 bug are
released; it's like once or twice a _year_.

~~~
xnull
I think implementation bugs are within the spirit of OP, especially given that
the NSA claims to have provided an implementation fix for Heartbleed.

The sorts of bugs I'm talking about exist in client and popular software. As
far as tempo is concerned this year alone has given us BERserk, gotofail,
Android Master Key, OpenSSL fork(), Bitcoin's use of P256, GNUTLS X.509
parsing bug, the OpenSSL compiler optimization+processor family randomness
bug, and others.

If we were to entertain OP's point maybe there would be a faster tempo if the
NSA were helping out. :)

~~~
tptacek
Sure, if this is what we mean by the kinds of cryptography bugs NSA is a
powerhouse at, I'm sure they could be leaking more of them to industry.

------
JBNixx
I assume they do share the majority of vulnerabilities they find, but keep the
top 2% for their own use. By top 2% I mean the vulnerabilities that are highly
unlikely to be discovered by other nations.

The other 98% of zero days that are easy enough to stumble upon by foreign
cyber units might be better to disclose and get fixed.

If the NSA can find a bug relatively easily, then we can assume China (for
example) might be able to as well. Getting those bugs fixed is a big gain for
national security, although it will boost the security of all nations.

------
higherpurpose
I don't buy it. I think he's just trying to make the NSA look "useful" to
private companies, so they'll support laws like CISA, under which sharing
between the NSA and tech companies would be forced.

------
justanother
So they share a "vast majority," but still keep many to themselves because
they don't think anyone else is persistent enough to find them ("How likely
are others to find it?"), in addition to telling a lie about developing the
fix for Heartbleed. Nothing new here.

~~~
sarciszewski
Wasn't that developed by Adam Langley? (Google employee.)

------
doe88
It may or may not be true; having repeatedly lied in the recent past, they
have lost all credibility.

------
tlb
That must be an interesting decision to make. Presumably, they'd only keep
quiet about bugs they were fairly confident that unfriendly governments
weren't also likely to find and exploit. I assume they'd also launch a big
honeypot, so that if someone else started to exploit it they could change
strategies.

It's the New New Great Game.

------
wavefunction
They've been lying so much at this point that I don't know what it matters what
they say. They were so caught up in seizing power with no regard for the
consequences that they forgot that trust is precious and fleeting, and worth
far more than whatever lying about your law-breaking buys you.

------
x0x0
So, the NSA are, to a person, lying sacks of shit. After their director's
performance in front of Congress -- the "least untruthful answer possible" --
don't trust a damn word.

So when they say they share, with whom? And what priority? Ooh, you found a
documentation bug; is that the one you chose to share? The more severe a bug
is, the more useful to them.

There are a million ways to parse this bullshit, and they all come down to the
same thing: they're doing what they've done all along, just lying about it
better in public.

~~~
dalke
I want to be more specific about the concept of sharing.

There are two uses of "sharing" in the response. One was sharing of
vulnerabilities. It was never clear who it was shared with. It could be with
the DoD, with GCHQ, with private contractors under contract with the NSA, etc.
and still be counted as "sharing."

The other is an example of sharing one patch, for the Bash vulnerability, with
the private sector.

I think we are supposed to connect those two pieces of data, but there's no
reason to do so.

------
wsloth514
The problem is... I DO NOT TRUST YOU.

------
crumpled
lol, "vast majority". That's like telling your spouse, "Most of the time, I
don't cheat on you." Full disclosure: I didn't read the article.

------
biesnecker
Do they disclose them before or after they exploit them?

------
eyeareque
I'm sure they share them as soon as they discover someone else is aware of the
same exploits.

------
paulannesley
Director of a lying, overreaching, unconstitutional spy agency makes an
unprovable claim that it's not stockpiling digital weapons. I think I'll skip
reading this one.

------
djcapelis
So how often do they share bugs? Has anyone seen it happen with any
regularity?

