
Google, the Wassenaar arrangement, and vulnerability research - _jomo
https://googleonlinesecurity.blogspot.com/2015/07/google-wassenaar-arrangement-and.html
======
mmaunder
A couple of items I'd like to add: (I'm Founder/CEO of a busy infosec biz)

While infosec is currently a smaller startup sector than, say, social, casual
gaming, or apps, it's growing furiously, and Wassenaar and individual country
regs will be top-of-mind for much of the YC community in the years to come
because many of you will be in this space.

I'd like to emphasize one of Google's points: "Global companies should be able
to share information globally." With existing laws we are already running into
limits on where we can hire and what our own internal staff can send us. So it
really is critical for any small or large global org that internal comms are
not squashed by this.

BIS's comment period ends today, so if you want to take action, now is the
time. As in, before COB today.

One last thing: The Hacking Team compromise and stolen data (including the
zero-days they were hoarding) couldn't have come at a worse time. It is fuel
for the argument that zero-days, vulnerabilities, vectors, etc. should be
tightly regulated, and it has thrown real weight behind the argument to close
borders (WRT info exchange) rather than open them. So your help is really
needed on this if you think you should be able to have open conversations
about technical infosec issues with your colleagues in other countries.

~mark

~~~
click170
> "Global companies should be able to share information globally."

This is an interesting point, and to me it seems to be pitting global
companies against individual countries. This makes me feel uneasy because I
feel there's a better chance my government is looking out for my best
interests than many global companies are. Am I just being paranoid and naive?

~~~
mmaunder
That's a really big debate: small vs big government, Keynesian vs Hayekian
economics, public vs private sector, and so on. From my side, I'd say both
government and the private sector have a role in protecting you.

Government tends to excel at things like air traffic control and lighthouses
where there isn't much scope for competition and where competition may harm.
Private sector tends to do things more efficiently when competition is
feasible.

So both have a role and the role of private sector is an important one. If we
pass laws that restrict innovation, the risk is that things that would
normally be in private sector will become government roles and won't be done
as efficiently or as effectively as they could be.

I'd say at the rate we're seeing attacks escalate, providing the most
effective protection we can in infospace is very important. I'm completely
ignoring issues like investment opportunities, job creation and so on because
I think the core problem we're all trying to solve is to protect individuals
and businesses. We need to get that right first and move from there.

~~~
LunaSea
I'm sorry but I don't see how a private company "protects" you in any way.

~~~
lmm
It doesn't have to be about protection. Just about serving your interests.
When my country privatized the phone system, the time to get a phone line
installed dropped from 3 months to 5 days (because suddenly they were
motivated to get it installed as soon as possible so they could start making
money).

------
4d004anonymous
I have quite an interest in this area too. Here are a few things that may
help:

1) Commercial Penetration Testing Software is already controlled. Here's a
self-post to reddit on how these controls work and apply today:

[https://www.reddit.com/r/netsec/comments/36obxt/what_i_know_...](https://www.reddit.com/r/netsec/comments/36obxt/what_i_know_about_us_export_controls_and_hacking/)

If you choose to comment, it helps to understand the current law and how it
works; that way it's easier to know what to ask for.

2) If you use software that may be controlled (for example, pen testing
software), this issue affects you. Here are a few suggestions of things you
could put in your comment:

[https://www.reddit.com/r/netsec/comments/3dusae/the_public_c...](https://www.reddit.com/r/netsec/comments/3dusae/the_public_comments_on_the_proposed_wassenaar/ct9rjqu)

3) As of last night, there were 101 public comments posted. Most were far
below the quality you would hope for in a good HN discussion, let alone in a
note to a policy maker requesting a change. If you have an interest in this
area and have something constructive to suggest, the public comment process is
your opportunity to do it.

[http://www.regulations.gov/#!docketBrowser;rpp=25;so=DESC;sb...](http://www.regulations.gov/#!docketBrowser;rpp=25;so=DESC;sb=postedDate;po=0;dct=PS;D=BIS-2015-0011)

------
MichaelGG
> You should never need a license when you report a bug to get it fixed

That hampers people who wish to publicly disclose or sell vulnerability
information. It is massively biased in favour of software companies
(including, to some extent, Google). You should never need a license to
disclose vulnerability information, full stop.

>Global companies should be able to share information globally

Why should this be limited to the employees of a company?

~~~
comex
_Public_ disclosure apparently won't require a license anyway:

> Third, export controls do not apply to any technology or software that is
> "published" or otherwise made publicly available.

[http://bis.doc.gov/index.php/policy-guidance/faqs#subcat200](http://bis.doc.gov/index.php/policy-guidance/faqs#subcat200)

(The FAQ also states that information about vulnerabilities, as opposed to how
to exploit them, would not be controlled, but I believe Google if they say the
legalese is insufficient to establish this.)

I agree that defining boundaries rigidly in terms of companies would be
limiting in today's world and especially in infosec.

In general, though, I personally really despise the practice of selling
vulnerabilities for the purpose of enabling people to attack others with them
- which in practice means selling them to anyone but the vendor, or
intermediary organizations like ZDI. True, there are many ways for this to go
wrong... but I cannot join the infosec people who blast any regulation of the
industry as inherently harmful, an infringement of freedom of speech, useless
against the real bad guys, etc. Yes, I hesitate to even _think_ in terms of
things like 'increased threats' or 'acceptable infringement', or to oppose
'absolutist thinking', considering how harmful such ideology has been in
other, quite different but analogous realms (surveillance, airport security).
And I have little faith in the ability of a government so hyped up about
"cyber" threats to avoid serious blunders. Even so, I simply cannot bring
myself to find acceptable the current almost total lack of regulation in
infosec - which you hint at in saying a license should never be required to
share information.

~~~
AnthonyMouse
There is a very simple reason why "regulation" in this space will never do
anything useful. The cost of entry is very low (an individual can find
vulnerabilities without any specialized infrastructure or organizational
backing), and the value of vulnerabilities is high. It's basically the war on
drugs if drugs could be transferred over the internet. Worse, the majority of
the offenders are not in your jurisdiction to begin with, and they never have
been or will be.

It doesn't matter what law you pass, you will not stop this from happening.
But just because passing laws can't do any good doesn't mean it can't do any
bad. Bad laws can still do plenty of harm to the good guys.

The best thing the government could do in this context is to be the highest
bidder and then immediately disclose the vulnerabilities to the vendors.

~~~
comex
As much as I would like the government to make such bids, I don't agree
regulation is useless. Sure, no policy can completely prevent zero days from
being sold - in fact, this particular policy doesn't even try; it just limits
who you can sell them to. But if that means that organizations and individuals
who wish to remain respectable and avoid any trouble with the law, however
unlikely it is to be enforceable in practice, limit their trade to _quite_
nefarious actors rather than _extremely_ nefarious ones... it's better than
nothing.

edit: that is, it's better than nothing if it avoids harming the good guys too
much. As I said, I am skeptical of many of the critical comments that have
been made, but since I buy Google's, I hope the rule will be amended. Argh,
I'm too tired to express myself properly.

~~~
AnthonyMouse
> that is, it's better than nothing if it avoids harming the good guys too
> much

In theory there is an ideal rule with ideal enforcement that will cause less
trouble than it prevents. But as Yogi Berra once said, in theory there is no
difference between theory and practice; in practice there is.

Here's an example of a serious problem this actually causes. Suppose Nefaristan
is on the list of places nobody can sell to. The evil government of Nefaristan
will just send an operative to Jordan or Saudi Arabia or whatever nominally
less nefarious place didn't make the list, and buy their exploits there. So
either way the evil government of Nefaristan will have embargoed exploits to
use against their domestic dissidents. The dissidents need the
embargoed patch right away or they'll be found out and executed. But now the
stupid law prohibits anyone from giving it to them because they're in
Nefaristan.

It's difficult to imagine how a law could fail harder than "helps bad guys
send good guys to death camps" -- but here we are.

Causing serious harm is not better than doing nothing.

~~~
comex
Having read the definitions of what is controlled in the proposed rule, I'm
pretty confident a patch wouldn't come close. And in any case, since the rules
don't apply to public software, that only matters in the case of _private_
patches, which aren't really a thing, and would be a pretty big moral hazard
if they were.

~~~
AnthonyMouse
Most patches inherently reveal the vulnerability they fix. Patches not being
controlled would be a loophole big enough to fit a whole planet through.

And private patches are a thing. Vendors often distribute an early version of
the patch to major customers for validation testing.

Or if you like, replace "patch" with vulnerability information that enables a
workaround. You can defeat Heartbleed by turning off TLS heartbeat support,
but that information alone is enough to quickly reverse engineer the vulnerability.
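To make that concrete: the widely circulated Heartbleed mitigation for anyone
who couldn't upgrade to a fixed release was exactly this kind of workaround
information. A sketch of it (the `-DOPENSSL_NO_HEARTBEATS` flag comes from the
OpenSSL security advisory; the source directory name is illustrative):

```shell
# Rebuild OpenSSL with TLS heartbeat support compiled out,
# the documented workaround for Heartbleed (CVE-2014-0160).
# Source directory name is illustrative.
cd openssl-1.0.1f
./config -DOPENSSL_NO_HEARTBEATS
make
make test   # run the bundled self-tests before installing
```

Even this tiny build-flag change, shared with a colleague abroad, points
straight at where the vulnerability lives.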

~~~
comex
Out of curiosity, have you read the actual proposal?

~~~
AnthonyMouse
The actual proposal is dozens of pages of legalese that would take a team of
lawyers a week to decipher. I have no idea what it says because it is totally
incomprehensible.

That's half the problem. If you're AT&T or Google you can hire said team of
lawyers to tell you what it says, but what is an individual graduate student
or security consultant supposed to do?

The other half of the problem is that what it says doesn't change the outcome,
because the insolubility of the issue comes from economics rather than policy.
There is no policy that will keep vulnerability information out of the hands
of the bad guys _only_ , because there is no practical way for most people to
even identify who the bad guys are.

------
coldcode
Are these governments completely insane? Not allowing security research, or
even the reporting of bugs, without a license is the stupidest thing I've
heard in my lifetime. The internet cannot be regulated in this fashion without
destroying it completely. The world is not a collection of islands; we are all
in this together, and letting any government stand in the way of safety and
security is insane.

~~~
caf
I don't believe this was intentional. Wassenaar is an arms control agreement -
the intention was probably to regulate the sales of weaponised exploits, not
basic research. It's an overreach.

~~~
benmmurphy
A lot of research will be very close to weaponised. Just browse Google's bug
tracker and you will find PoCs for many of their vulnerabilities.

------
Natsu
The problem I have with this sort of arrangement is that the more rules they
put in place on lawful disclosure, the more they hamper the good guys who
follow the laws and the more of a comparative advantage that gives to the bad
guys who can simply ignore the laws.

We really don't want to hand the bad guys any more advantages than they
already have, no matter how good our intentions are.

------
tdullien
Hey there,

if anyone wants some more background on all the negative side-effects of the
current regulation, I wrote a lengthy blog post on the problems with the
current phrasing of the Wassenaar amendments here:

[http://addxorrol.blogspot.ch/2015/05/why-changes-to-wassenaar-make.html](http://addxorrol.blogspot.ch/2015/05/why-changes-to-wassenaar-make.html)

(Background: I am a security researcher who designed industry-standard patch
analysis algorithms / software, built algorithms & a startup for malware
reverse engineering that was acquired by Google, pioneered several
exploitation techniques, and worked heavily on the recent Rowhammer
vulnerability)

------
thephyber
Wired had a "call to arms" a few days ago over this comment period:
[http://www.wired.com/2015/07/moussouris-wassenaar-open-comment-period/](http://www.wired.com/2015/07/moussouris-wassenaar-open-comment-period/)

------
raythrop
> Global companies should be able to share information globally. If we have
> information about intrusion software, we should be able to share that with
> our engineers, no matter where they physically sit.

This statement goes through just as well when applied to missiles, nuclear
engineering knowledge, bio-weapons knowledge, etc.

Governments have decided that they wish to use commercial entities as a proxy
method for protecting the status quo. If Google wish to challenge that policy
then, well, OK. But there is no reasonable argument for making a special
exception for "cyber security" over other forms of security-related
engineering.

~~~
airza
The key difference between those technologies and this one is that industrial
production hardware (centrifuges, specialized biological equipment, etc.) is
needed in addition to specialized knowledge of the topic at hand. These
additional requirements make export controls far more enforceable, and they
are why you can, say, send regulators to the site of a nuclear engineering
facility and get a reasonable answer about whether they have sent nuclear
materials to another country.

With infosec this would be basically impossible, since the specialized
knowledge and the equipment are both non-physical and intimately entwined.

~~~
oracuk
This is not a practical distinction with regard to the legislation.

A Word document describing the specification for an export-controlled
technology is as prohibited from export as the implementation.

Having the word document on a laptop you take to another country is as much a
breach of the law as shipping a centrifuge.

------
keso_77
Global companies can go screw themselves, Google in particular. "People should
be able to share information globally." is the motto everyone, including
global companies, should rally behind. The crypto export ban is an apt
analogy: it hindered American companies and benefited everyone else. A good
thing too, once the NSA started monkeying around with crypto primitives and
national security letters. Anything that slows down American tech behemoths
and makes room for European alternatives is a good thing.

------
richardw
Previous export restrictions on crypto and certificates gave non-US
participants (Mark Shuttleworth) the opportunity to take up the slack and
build a business where there were no such restrictions. He simply had to
fulfill the demand that existed outside of the US in large part due to export
restrictions.

The Wassenaar Arrangement is likely to result in similar unintended effects
combined with a similar lack of intended effects.

------
tankenmate
The UK and Australia have already implemented part or all of these agreements.
Does Google already hold licenses in these jurisdictions? If we report a
security bug to a Googler in these (or other similarly restricted
jurisdictions) will they be able to share security bug details with their
overseas Googler colleagues?

~~~
gpvos
If I understand correctly, the UK and Australian implementations are much
narrower, and much less problematic than the proposed US implementation. So
presumably licenses will not be necessary in those other countries.

------
gioele
I find the "Summary of international crypto controls" at
[http://www.cryptolaw.org/cls-sum.htm](http://www.cryptolaw.org/cls-sum.htm)
very informative.

It shows that the _export_ controls have been in place for years, and that is
something that should worry us.

------
oracuk
It's probably worth remembering that there is an open source exemption. If a
piece of 'intrusion software' has been published, the export controls no
longer apply.

Publish your exploits to GitHub before you send them overseas or travel to a
conference to announce them.

------
themeekforgotpw
Aren't cyberarms arms?

Isn't the right to bear arms an 'inalienable right'?

I don't get it. And I don't get why this is a 'privacy' or 'free speech' issue
or why corporations, as Google argues, should be exceptions to the law.

~~~
tptacek
You should read the whole story before commenting; this is about foreign trade
in exploits.

~~~
themeekforgotpw
Inalienable rights. Not civil rights.

Inalienable rights are human rights - which extend (at least in theory) to
foreigners.

I read the story.

But anyway, if the Supreme Court ruling from the Zimmermann case holds, it
would apply equally well to everything in the article. Of course the
Zimmermann case was about foreign exports as well.

Try to be charitable.

~~~
themeekforgotpw
If the downvoter needs help understanding the historical analogy that makes
this above comment make sense:
[https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i...](https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_investigation)

------
ufydtdtdtfu
The solution is to ignore the law. Governments have no moral legitimacy and
therefore no legal legitimacy.

~~~
JoshTriplett
They tend to react poorly to being ignored, though. And as a rule, they're not
the ones that fare badly in such a conflict.

