

The Mercenaries: Ex-NSA hackers are shaping the future of cyberwar - dthal
http://www.slate.com/articles/technology/future_tense/2014/11/how_corporations_are_adopting_cyber_defense_and_around_legal_barriers_the.single.html

======
tptacek
Also: the closing grafs in the article, about Cisco's acquisition of
Sourcefire, are particularly dumb.

Sourcefire is the commercial backer of Snort, the open source network
intrusion detection system (and also the owners of ClamAV). The author of this
article and his sources express surprise that Cisco would pay big money for an
open-source product that anyone can use.
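
(For context, Snort detects attacks by matching plain-text signature rules against network traffic; a minimal sketch of one, with entirely made-up values for the message, content string, and SID, looks like this:)

```
alert tcp any any -> $HOME_NET 80 (msg:"Example: suspicious User-Agent"; content:"evil-agent"; sid:1000001; rev:1;)
```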

Cisco paid just about 10x trailing revenue for Sourcefire, a public company
that had managed to dominate enterprise network security and which competed
directly with products that had been cash cows for Cisco for over a decade.
Cisco has for as long as I've been in the industry --- in fact, for as long as
there's been that industry --- been the single most important acquirer of
network security companies. They acquired security companies with the same
fervor in 1998 as they do today.

Cisco's acquisition of Sourcefire might qualify as the single least
interesting story in information security in the last 5 years.

Want to make a couple hundred million dollars? You too can do what Sourcefire
did: start an open source project that appeals to enterprise teams who spend
monopoly money to buy products (that is, start any enterprise-relevant open
source project). Get thousands of people to use it. Then start a company and
hire an inside sales team. Have them call company after company and ask, "Do
you use our open-source project?" Sell extra stuff to the people who say
"yes".

------
raesene4
FWIW, I think that the idea of software bugs (vulnerabilities) as a product is
a scary concept and a bad precedent for overall security.

Once you have legitimate corporations whose goal is to find software
vulnerabilities, combine them with delivery systems, and sell them to either
specific entities (e.g. the US government) or the highest bidder, I think the
incentives for people involved in software development and testing get odd,
and not in a good way.

For example, will we see these companies hiring ex-developers and testers from
software product companies, since they might have inside knowledge of where
products are weak?

Another example: are there now incentives for people who work in development
or testing, and who perhaps aren't happy in their jobs, to sell knowledge of
bugs or flaws to these companies? Given the prices paid, which can be several
multiples of someone's annual salary, and the anonymity afforded to people who
report the flaws, it could be a low-risk way to make a lot of money.

And then you have open source software, which is heavily used in a lot of
commercial products that might get attacked. Here there's a big incentive not
to report bugs to the project but to sell them to a company that has no
incentive to see them fixed...

~~~
tptacek
I'm uncomfortable with vulnerability markets for other reasons. But, anyways,
you write:

 _For example, will we see these companies hiring ex-developers and testers
from software product companies, as they might have inside knowledge of where
products are weak._

Two things.

First, you're not clear on why this would be a bad thing. The flaws are there
whether insiders out them or not. The implication in your comment is that we'd
be better off with those flaws kept secret. Obviously, we'd all be happier if
the vendor outed their own flaws, or if a non-"mercenary" researcher outed
them for public consumption. But even private vulnerability sales have the
effect of eventually burning the bug.

Second, it's a little naive to think that most flaws are known only to
insiders. In fact, the advantage insiders have in getting full access to
repositories is probably dwarfed by the advantage attackers have in committing
entire careers to studying exploitable bugs. For most competent researchers,
lack of source code is just a speed bump.

~~~
raesene4
Realistically, in my experience every large company has "skeletons", which are
primarily known to insiders and of which outsiders have less knowledge.

Be that a product which is known not to have as rigorous a security regime as
others, or perhaps a service which is considered "legacy" and not developed
actively.

When offensive companies start hiring people to get access to that information
to use against their prior employers, I think that's not great for overall
security.

I didn't think I was implying that keeping flaws secret in the long term is
desirable; I don't think it is.

Also, whilst I agree it's naive to suggest that only insiders know most flaws,
I feel it's reasonable to suggest that insiders have information which would
be useful to attackers, and that it could be tapped by hiring them.

As I said originally that was just one example of where I think potential
problems could arise from "vulnerabilities as a product", but I'd be
interested to hear what you think are the downsides to vulnerability markets.

~~~
tptacek
You seem to be describing companies that are institutionally concealing
serious product flaws from their customers, and suggesting that overall
security as a public policy goal is improved by a strategy of just hoping that
reverse engineers won't notice.

~~~
raesene4
That's not what I was intending to describe. I was suggesting that insiders
have inside information that is sometimes relevant to attacking companies,
such that hiring those insiders could be useful to attackers.

And I'm definitely not hoping that reverse engineers won't notice; I've been
in security long enough to see all my pronouncements of "you know, someone
could do x" and more come true...

~~~
tptacek
I'm not sure that you've clarified anything with this comment. "Insider
knowledge of information relevant to attacking software" is "insider knowledge
of product flaws". Flaws need to be fixed, not concealed.

~~~
Spooky23
Institutional knowledge is about process as well as the software. Knowing the
magic words and people can make social engineering or avoiding countermeasures
much easier, even in the absence of an explicit software flaw.

~~~
tptacek
Now we're playing Six Degrees of Kevin Bacon. We start out with moles
inserting vulnerabilities. Then it's insiders who know about flaws. Then
insiders who know about weak spots to look for flaws in. Now it's magic words
to help with social engineering. At some point, these stop being important
considerations for public policy.

------
tptacek
"A survey of 181 attendees at the 2012 Black Hat USA conference in Las Vegas
found that 36 percent of “information security professionals” said they’d
engaged in retaliatory hack-backs."

What? Black Hat attendance is in the high thousands. A plurality of those
attending are IT professionals --- people who wouldn't have the technical
capability to take over a botnet even if they wanted to. That holds even if
you broaden the definition of "hacking back", as some people do, to recon
activities like port scans. No part of this anecdote makes sense.
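
(For a sense of scale: the mildest form of "recon" mentioned here, a port scan, is just repeated TCP connection attempts. A minimal sketch in Python, using only the standard library; the helper names are made up for illustration:)

```python
import socket

def tcp_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan(host: str, ports) -> list:
    """Return the subset of `ports` that accepted a TCP connection."""
    return [p for p in ports if tcp_port_open(host, p)]
```

(Which is exactly why treating a port scan as "hacking back" stretches the term past usefulness.)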

For my part (I'm a security researcher by background, though that's not what
I'm doing now, and I've presented at Black Hat numerous times): not only have
I never met a professional who claimed to have "hacked back" anything, but
I've never even met one who didn't think that was a crazy idea.

There is a difference between major organized efforts to bring down botnets
and "hackback" the way the term gets associated with Endgame.

~~~
sarciszewski
> I never met a professional who claimed to have "hacked back" anything, but
> I've never even met one who didn't think that was a crazy idea.

Coming from the amateur side of things, my observations mirror yours. The only
people who think it's a good idea, from what I've seen, are script kiddies,
not professionals.

------
Kalium
I interviewed with Endgame recently. Their arrogance was striking.

More topically, there's a basic problem in security - vulnerabilities have
value. They have more value to people who want to use them than to people who
want to close them. Unless this shifts, the current situation is only going to
get worse.

Making it illegal isn't going to work. There is already a functional black
market. Removing the white market will just drive more groups to the black
market.

There's no easy answer here. Yesteryear's EFNet junkies have been turned into
today's mercenaries and weapon designers. Cyberspace is valuable, and
controlling it more so. It's a dangerous time to have interesting information.

~~~
tptacek
Worth adding: even basic software security engineering services are, compared
to other services, spectacularly expensive. In ten years of software security
consulting for big companies, I met with very few who didn't get sticker shock
from the cost of even a basic web app assessment.

Supply/demand is a motherfucker. The solution is probably going to have to
focus on the supply side.

~~~
Kalium
A lot of basic stuff can be automated, but that only goes so far. Security
engineering is becoming its own distinct and highly specialized discipline,
and the supply is probably always going to be limited.

I think a better answer is for companies to take security more seriously from
the beginning. This means being willing to invest in developer training and
in-house infosec. The expense of outside expertise should be ample reason to
bring that inside.

------
JoachimSchipper
The company profiled in the article ("ex-NSA") isn't exactly the first player
in its space - e.g. VUPEN is a pretty established company
([http://www.vupen.com/english/services/lea-index.php](http://www.vupen.com/english/services/lea-index.php)),
and there have been earlier articles on this market (for instance
[http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/](http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/)
is pretty readable).

This may be a very good book, worth reading, but it's not really news.

------
tw04
We need to allow corporations to fight back? Why stop at cyberspace? I think
we should let multinational corporations field their own private armies as
well. What could possibly go wrong? It's not like they'd ever abuse that
power!

------
snlacks
I just want to make sure I have this right.

The government hires these guys and then keeps the vulnerabilities in our
software and our businesses' software secret?

They then use this to launch attacks and record our communications and
actions?

I, of course, have an opinion on this; I just want to make sure I've got this
correct.

Edit: punctuation.

~~~
tptacek
That is not a very good summary of Endgame or VUPEN.

~~~
kelvin0
So asking questions is also grounds for being down voted? I am confused ...

~~~
snlacks
That doesn't matter; people will upvote or downvote regardless. It was,
indeed, a question loaded with my preexisting bias, but also a genuine
question.

~~~
arkem
VUPEN and Endgame are companies that employ people to do vulnerability
research and develop exploits.

They sell a subscription service that provides government groups (mostly law
enforcement and intelligence agencies) access to a catalogue of their
exploits. The list of acceptable clients varies by company: some of these
firms sell only to the federal agencies of Five Eyes nations, others sell more
broadly than that, and some may sell only to ${Local SIGINT agency}.

Government groups might do any number of things with these exploits but
typically law enforcement will use them to execute warrants to help in their
surveillance of suspects. Intelligence agencies may use them in the same way
(pursuant to their authorizations). Other customers might somehow try to
defend friendly networks with the information, but this doesn't work.

I'm not sure what in particular tptacek objected to but my guess is
characterizing them as part of the Government. The Government isn't keeping
any secrets here (except for the ones they're presumably contractually
obligated to keep by Endgame / VUPEN / etc) and the vulnerabilities have been
discovered before the Government has contracted with the supplier.

~~~
snlacks
Sounds like part of my summary's issue was grammar/word choice. I definitely
understand the problem now and will be more careful.

New summary: governments (plural) hire these companies (as opposed to "guys")
and may, but don't always, keep these software vulnerabilities secret in order
to collect information on people/targets. This is sometimes done with a
warrant, and at other times without one.

I don't see how the governments and these companies aren't keeping the
vulnerabilities in the software we rely on secret; I need more convincing.

I really appreciate you taking the time to flesh it out with me, even though
it's unlikely we'd end up agreeing (just from hints in the tone we're using),
I'm glad it won't just be over poor writing on my part. Thanks!

~~~
tedunangst
Yes, the vulnerabilities are kept secret. The value of an exploit decreases
significantly after the vulnerability is patched, and they are in the business
of selling high value exploits. If they couldn't sell the exploits, they
wouldn't be finding the vulnerabilities either. Banning exploit sales won't
suddenly result in VUPEN turning into a vuln finding charity.

For whatever it's worth, zero-day exploits are rare in practice. The vast
majority of exploited systems are taken down with public vulns because they
weren't patched in time. Very few organizations are interested in specific
targets; carpet-bombing the internet and searching for unpatched
shellshock/drupal/etc installations will collect enough low-hanging fruit.

~~~
snlacks
If the NSA buys an exploit in Windows, does the NSA's contract preclude VUPEN
from selling that exploit to Microsoft?

~~~
tedunangst
Presumably. If you're paying for an unpatched vuln, you don't want to get a
patched vuln.

------
Zigurd
Allowing this is like allowing the development of engineered bioweapons and an
open market selling them to the highest bidder.

~~~
p0x33
A couple of ways that vulnerability research is nothing like bioweapon
engineering:

1. Vulnerabilities exist in software, whether a researcher uncovers them or
not. America's adversaries will continue to identify and exploit these
vulnerabilities. So legislation that keeps this information out of the hands
of our defense/intelligence community would really only serve to weaken us
relative to our enemies, rather than making us safer.

2. Die Hard movies aside, software vulnerabilities are far less likely to
lead to apocalyptic outcomes than nuclear or biological weapons. Maybe a
better analogy would be the open market manufacturing of surveillance
technologies, like cameras and radios. Where's the outrage about telephoto
lenses that camera companies make that can be used to monitor our enemies or
to take pictures of your daughter in your backyard swimming pool from a half-
mile away?

~~~
Zigurd
> _America's adversaries will continue to identify and exploit these
> vulnerabilities. So legislation that keeps this information out of the hands
> of our defense/intelligence community would really only serve to weaken us
> relative to our enemies, rather than making us safer._

Encouraging freelancers to find vulns and sell them to the highest bidder also
makes us less safe. It also encourages vulns to be created in projects for the
purpose of later selling them.

Secondly, the American point of view on this is colored by the fact that the
US has never been the target of a cyberweapon as powerful as Stuxnet, designed
to cripple a large and critical military or industrial system.

