
Google Workers Urge C.E.O. To Pull Out of Pentagon A.I. Project - s3r3nity
https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html
======
betolink
From the movie "Flash of Genius" about Robert Kearns, the inventor of the
intermittent windshield wiper:

"I can't think of a job or a career where the understanding of ethics is more
important than engineering," Dr. Kearns continues. "Who designed the
artificial aortic heart valve? An engineer did that. Who designed the gas
chambers at Auschwitz? An engineer did that, too. One man was responsible for
helping save tens of thousands of lives. Another man helped kill millions."

"Now, I don't know what any of you are going to end up doing in your lives,"
Dr. Kearns says, "but I can guarantee you that there will come a day when you
have a decision to make. And it won't be as easy as deciding between a heart
valve and a gas chamber."

To me this is incredibly relevant for Silicon Valley engineers these days.

~~~
cmontella
As a roboticist at the beginning of my career working on drones, I decided
then and there that I would never make "bombs", a metaphor I used to mean
anything that could be weaponized. I realized a lot of the work I was doing
was funded by DARPA, and I was very cognizant about my research being used in
this way. And like Dr. Kearns suggests, it's not entirely black and white.
Would my path planning algorithm be used to more efficiently deliver
scientific payloads to the atmosphere, or would it be used to route missiles
to maximize casualties? Hard to really say, but I've avoided overtly military
applications (even things like BigDog, designed to carry equipment for
troops).

Sometimes the distinction is even more insidious. I did work on perpetual
flight for drones, and Facebook had a perpetual flight project that had the
goal of bringing internet access to remote locations in Africa. Sounds
humanitarian, but I also didn't want to be responsible for subjecting poor
Africans to what I consider Facebook's panopticon. Or maybe it would have been
a net boon for the region? It's really hard to tell a priori, so the best I
have managed for myself is just to try and stay in theory, where developments
are further removed from direct consequence.

~~~
reacweb
If all the engineers who have a sense of ethics refuse to work on military
projects, it implies that military projects will be carried out by people who
are less ethical than average. That is frightening. IMHO, avoiding military
projects on ethical grounds is counterproductive.

~~~
kevinmchugh
Richard Gatling thought his most famous invention would cause wars to be
shorter and less brutal. More deaths faster would mean faster surrenders, he
thought. He is a weapons inventor who thought he was doing something ethical.

Instead, he made war and death cheaper. He made it more likely. In his
lifetime, machine guns were deployed asymmetrically, making it cheaper for
colonial empires to kill large numbers of the colonized.

~~~
supertrope
Alfred Nobel thought so too with dynamite.

"Perhaps my factories will put an end to war sooner than your congresses: on
the day that two army corps can mutually annihilate each other in a second,
all civilised nations will surely recoil with horror and disband their
troops."

~~~
spdionis
Isn't that what is kinda happening with nuclear weapons though? I would argue
we have less armed conflict nowadays exactly for the reasoning in that quote.
Dynamite did not really allow for two army corps to mutually annihilate each
other in a second, nuclear weapons actually do.

~~~
kevinmchugh
Nuclear weapons didn't stop wars, and they didn't stop major powers from
fighting each other. It just moved to proxy wars. Proxy wars have been common
for a long time. Maybe they've stopped world wars, but maybe the two World
Wars were exceptions and not a guaranteed feature. I don't know how to
quantify whether there are greater or fewer armed conflicts between states
since the invention of nuclear weapons.

------
abalone
I completely support the employees but it is criminal that this NYT article
does not mention the extensive roots of Silicon Valley in Pentagon funding. SV
as a whole is _substantially a creation of military spending._

Just recently in the AI space:

\- Siri was spun out of a Pentagon project -- look up SRI International and
CALO. Its purpose: a "soldier’s servant".[1]

\- Autonomous driving is a direct evolution of Pentagon-funded efforts -- see
DARPA Grand Challenge.

And it's not just funding the early research, it's procurement like this too.
Military procurement has also supported the development of technologies when
the commercial market couldn't.

Again, I support the employees and hate the fact that in order to develop
medical lasers we first have to figure out how to shoot down missiles with
them. It's hugely inefficient, could spell our doom, and if you think about
it, fundamentally undemocratic. (Gives elites more power to direct taxpayer
dollars under the rubric of defense.) But this should be a basic part of any
story on how Silicon Valley works.

And to think that SV has a large population of supposedly "small government"
Libertarian Capitalists... oh, the mental gymnastics in that.

[1] [https://en.wikipedia.org/wiki/CALO](https://en.wikipedia.org/wiki/CALO)

~~~
random4369
> SV as a whole is substantially a creation of military spending.

This is a national budget problem. If you're in the tech world long enough, it
becomes pretty clear that the only way to get Big funding is through military
affiliation.

This puts a huge selective bias on what kind of technology projects actually
get big funding, and further it prevents the benefits of those projects from
reaching the community for years, because the military overlords demand
secrecy and sole use of the technology until it is superseded.

We need to cut a huge chunk out of the military budget and give it directly to
the tech sector, so that big innovative projects are actually possible without
having to be military.

~~~
TuringNYC
Doesn't the recent (last 10 yrs) VC splash do that? It's hard to pin down the
numbers since military spending flows through so many routes, but I'd love to
see figures for the two channels.

~~~
abalone
No, they are not parallel channels. Generally speaking high tech develops over
a multi decade timeframe. Govt agencies like DARPA play a lead role in the
earliest phase (often measured in double digit years). VCs pick outputs that
have commercial potential and pour money into the sector.

That’s why it’s wrong to just compare the absolute amounts invested — it’s
_when_ it’s invested.

------
grellas
Where does one draw a line these days among the personal, the moral, the
legal, and the political?

The military application in question is legal and is approved by a duly
elected government that supports it politically. In earlier days, employees
generally would see this as just doing their jobs in developing technology
that their employers wanted developed and would not concern themselves about
ultimate uses and applications. In other words, doing your job is personal
and, as long as you do it honestly and work hard, you should not be faulted
for doing it as requested by your employer. That was always the standard. What
then is the new element from which this sort of employee-driven demand arises?
Is it morality? In other words, if I help develop A.I. that can be used for
all sorts of things, one of which happens to be military-related, is the
effort "evil" if the employer for whom I develop it agrees contractually to
provide it to the government for a wartime/military use that can kill people?
Do I really make a difference for the good if I convince my employer not to do
this if all this means is that the company down the street gets the contract
and the military gets the same results, albeit from a different vendor? If
this is so, then I assume that you as an employee can make no practical
difference in making the world better by insisting that your employer forego
this particular form of contracting opportunity. If you succeed, your employer
misses an opportunity but the evil you see being released into the world still
gets released. It just means that you do not personally contribute to the
development effort by which it is made possible.

Of course, it might theoretically be possible to persuade all persons working
in the field of A.I. to ban further work that directly helps the military. But
that would seem a practical impossibility. Many people in all countries
believe that military technology of all kinds is proper, legal, and
politically supportable for purposes of self-defense or for some other
overriding purpose they deem proper. And certainly, there are bad people
throughout the world who are eager to use any technology that comes their way
for overtly evil purposes such as misuse of an atomic bomb. Unless and until
human nature is fundamentally transformed, that will never change.

So, what is the answer in a country such as the United States where people and
companies have the freedom to develop A.I. for any lawful purpose and where
some inevitably will do so for a military purpose of which you disapprove?

You are then left only with a political solution: use political means to gain
control of the government and the military and apply the force of law to ban
the military use of which you disapprove.

So this is either a personal act of futility by the Google employees or it is
a case in which they cannot separate the personal from the political and
thereby insist that their employer sacrifice particular economic opportunities
to ensure that their personal actions do not support a political outcome of
which they disapprove.

Even then, does this mean that your employer should cease working on A.I.
altogether? For, just as cash is fungible, so too is technology. Every
improvement you make in A.I. might have an immediate use of x for your
employer but, as humans collectively do this for all sorts of improvements,
the results are there for the taking in the future for military applications
of all kinds. In other words, you cannot put your improvement in a box or
control it so as to limit its future uses (at least not in a free society).
The computing technology of recent decades undoubtedly has bettered many
aspects of life but it has also greatly magnified the lethality and utility of
military applications so as to make the world far less safe. And this was
inevitable unless a supervening agency were to have used forcible and
totalitarian means to suppress such technological development from inception.
Since no such supervening agency existed or even can exist in a free society,
does this mean that all engineers and technical developers have blood on their
hands because, ultimately, things they have done were used for applications of
which they disapproved? Of course it does not. Nor would people today working
on A.I. be held morally or legally responsible for ultimate downstream uses
made of their work of which they would not morally approve today.

But this brings us back full circle. In the long run, you cannot stop such
uses (or misuses) made from your technical development work. Nor can you be
held responsible for them even though you contributed to them in some remote
degree through your work efforts. Why then should it make a difference if your
direct work efforts for a company like Google are applied to a military
application of which you do not approve but which is legal, politically
approved by the governing authorities, and will happen anyway regardless of
whether Google is involved?

The puritans of old tried controlling the morality of others by shunning and
shaming and doing it to an extreme degree. They failed miserably in their
efforts because humanity is what it is and followed its own course without
regard to external religious constraints.

This sort of effort by Google employees is obviously different in that it is
not religiously driven but does it amount to anything more than a shunning-
and-shaming method for trying to impose one's sense of morality on others by
signaling that this way lies righteousness and everywhere else lies evil?

If this is what "don't be evil" now means, then Google will need lots of help
going forward because every cause under the sun can be used in the same way to
shun and shame. We then have management by a corporate board as may be swayed
to and fro by any organized protest of the moment.

Whatever this is, and however it might be defensible in "sending a message" or
whatever, it is a sure way to put a company at a competitive disadvantage
while accomplishing nothing practically. It may further _political_ goals but,
if those are the goals, better just to try to advance them directly and not by
attempting to shun and shame your employer (and your co-workers who may
disagree with you) into submission. The personal need not be political. If it
does become that way, a new form of puritanism will hold full sway to the
detriment of all.

~~~
monfrere
Your discussion of "legality" here is U.S.-centric. Google has offices in
Germany, France, Poland, Russia, Turkey, U.A.E., India, and China, to name a
few. How would you feel if Google worked on military technology for those
countries? Would you point out that it was inevitable that their militaries
would seek ways to use A.I.? Would you point out that the Turkish armed
forces' activities are permitted under Turkish law? Would you lament the
inability of Google employees to separate work from politics?

Google has users in almost all countries, and even our friends in other
liberal democracies do not see the U.S. military the same way we do. Perhaps
some Google users' family members have even been killed by the U.S. military.
This presents a perfectly reasonable business reason (one that has nothing to
do with "the personal, the moral, the legal, and the political") for Google to
turn down AI drone contracts.

~~~
nine_k
I bet neither China, nor Russia, nor the US would allow foreign citizens in
foreign countries to develop any serious military technology for their armies.
Even their own citizens would be checked in detail before being allowed to
work on it.

If a technology is not under that kind of scrutiny, chances are its military
applications are... far-fetched.

~~~
fineline
Lots of military technology is traded between countries all the time, tanks,
planes, guns, boats, missiles, etc. China just bought a bunch of Su-35 jet
fighters from Russia.

[http://nationalinterest.org/blog/the-buzz/chinas-air-force-m...](http://nationalinterest.org/blog/the-buzz/chinas-air-force-may-soon-get-more-russian-su-35-fighters-24823)

------
rahulmehta95
This may be an uncomfortable fact but people have surprisingly short memories:
the military funded the majority of the early advances in systems, networking,
and cryptography (and especially as a large part of the latter subject area,
invested heavily in fundamental, theoretical research). Not saying that I
disagree with the employees' opinion, just that DoD/Pentagon involvement in
artificial intelligence research shouldn't be viewed as a necessarily bad
thing. Many other major powers have invested heavily in AI across all fronts
(including military applications), and it would be foolish for the US not to
have one of its largest strategic assets be part of the process.

~~~
danesparza
My memory isn't short.

You made a point of talking about cryptography. The US government also
classified crypto as munitions in order to control its usage and export:
[https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...](https://en.wikipedia.org/wiki/Export_of_cryptography_from_the_United_States)

This had very negative effects on cryptography in general (see the FREAK
exploit:
[https://en.wikipedia.org/wiki/FREAK](https://en.wikipedia.org/wiki/FREAK) )

Also, why wouldn't this be employed to better monitor Stingray
([https://en.wikipedia.org/wiki/Stingray_phone_tracker](https://en.wikipedia.org/wiki/Stingray_phone_tracker))
systems?

I don't think the government (and specifically, the military) should be viewed
as a non-partisan actor when it comes to technology.

~~~
treebeard901
The classification of cryptography as munitions was an interesting choice. The
first amendment may protect domestic encryption capabilities as stated by the
wikipedia article. From a legal standpoint, does encryption being classified
as munitions also put it into second amendment territory? As information
becomes weaponized, it's an interesting thought. The first amendment is
arguably more vulnerable these days, so I wonder if perhaps a second amendment
argument could be made in support of domestic encryption. It certainly would
help with marketing the argument to a certain segment of the population.

~~~
tc313
IANAL but I’m pretty sure the second amendment only applies to guns; you can’t
walk around with an RPG on your shoulder.

~~~
rdtsc
It depends how you look at it. What was the purpose. Presumably after fighting
in the Revolutionary War the goal was not to give people hunting privileges,
they had something else in mind. From that point of view, maybe RPGs should be
allowed, but so should attack helicopters, surveillance capabilities, private
spy satellites, etc.

> IANAL but I’m pretty sure the second amendment only applies to guns; you
> can’t walk around with an RPG on your shoulder.

Surprisingly perhaps you can buy a military grade flamethrower complete with a
napalm package: [https://throwflame.com/](https://throwflame.com/) it's not
even classified as a firearm, and is illegal in only two states I think.

~~~
goldenkey
That flamethrower is awesome. What use does a flamethrower have though besides
burning buildings, that a normal gun wouldn't?

~~~
punchclockhero
Clearing brush. It's considered an agricultural tool IIRC.

~~~
reitanqild
Also for preburning (ahead of forest fires) IIRC.

~~~
ianamartin
Are you saying that literally fighting fire with fire is a thing?

I may have missed my calling . . . .

~~~
munk-a
I mean... yes?

That is where the saying comes from.

~~~
goldenkey
Never really thought about it. But it makes sense to cut a "gap" in the forest
using fire, to prevent a fire from crossing the gap. Once burned, something
ain't gonna burn again...

------
sseth
Do we really want to test a theory that we can have a repeat of the 1930s when
the democracies fell far behind an autocratic regime in the arms race, and
again the autocratic regimes will not win? Democracies are in retreat around
the world, and I only hope we wake up before it is too late.

The military-industrial complex may have become far bigger and perhaps in some
ways a burden, but the world is a dangerous place and becoming more so. In my
view, there is a perfectly valid argument that working with the military is
the right thing to do.

At the same time if individuals have pacifist leanings and do not want to work
for the military, i would hope corporations respect that.

~~~
titzer
> Democracies are in retreat around the world, and I only hope we wake up
> before it is too late.

What are you advocating? Arming governments with autonomous killer drones in
order to "protect democracy"? Because the world is a "dangerous place"? I
don't even...

~~~
sseth
Well "governments" already have nuclear bombs, aircraft carriers, drones and
what have you. It's not some new concept. You can take it as given that
regimes you will never want to live under will be putting their best people to
work using AI for military purposes.

It's dangerous to imagine one is sitting on some moral high ground by refusing
to work for the military. We all owe our freedom to previous generations of
scientists and engineers who worked for the military, sometimes in spite of
strong ethical concerns.

~~~
stale2002
If we already have nukes, why would we need anything else?

Nukes prevent any war from happening between any powerful nation.

The only use for the other types of weapons is to participate in some other
civil war in another country that we shouldn't be in.

So no, I don't care if we "fall behind" because we still have nukes and those
are enough to protect us in the conflicts that matter.

~~~
sseth
Nukes are an instrument of last resort, but you don't want to get to the point
where that is your only option. And it is possible that technical advances in
missile defence may blunt this advantage.

Being able to prevent friendly (and even neutral) countries from being bullied
or going over to the dark side is also important because ultimately the best
defence against rising autocratic forces is to ensure a broad alliance of
friendly, democratic countries who are secure against interference by
unfriendly regimes.

If a country like the USA can have its elections manipulated, how vulnerable
might smaller democracies be?

~~~
marnett
> If a country like the USA can have its elections manipulated, how vulnerable
> might smaller democracies be?

The CIA could most certainly help you answer that quite quantitatively.

------
evmar
I heard an interesting view on this. Engineers build technologies like
TensorFlow and demos of object recognition, which have obvious applications in
drone combat (just stuff your model into the missile targeting system). Yet
then when this tech is used for this purpose they're shocked, shocked -- and
as long as they're not specifically building the missile targeting systems
themselves they feel like their hands are clean?

I'm not even sure what the consequence of this argument is; pretty much
anything you build can indirectly contribute to the military industrial
complex, even something innocuous like dev infra. But I also don't think that
weak "everything is equivalent" argument means you're suddenly absolved of
responsibility. One thing I am pretty sure of is that it must feel awful to
waste your short time on earth on building tools specifically for killing.

~~~
frostwhale
It's nothing more than an attempt for Silicon Valley to pretend that they're a
bastion of pure innovation, where all advancements are only for good. It's
naive and extremely holier-than-thou.

~~~
Florin_Andrei
Well, that's one extreme.

The other extreme is that some folks actually have a conscience.

Reality is probably on a continuum in between.

~~~
gaius
_The other extreme is that some folks actually have a conscience_

Those people are not working in AdTech.

~~~
VikingCoder
You want to have that conversation with them? Because some of them are right
here. In fact some of them just got shot at yesterday.

Is it hyperbole, or do you honestly feel that trying to connect users with
products they might like is inherently unconscionable?

~~~
AndrewUnmuted
> trying to connect users with products they might like

It is delusional to suggest that this is the goal of ad-tech. The only people
thinking about the users and the products are the creative teams that make the
ads themselves.

Ad-tech, on the other hand, engages in dragnet data mining operations that
serve the creative/design teams at ad agencies. The way the ad-tech industry
has decided to go about things absolutely is inherently unconscionable. The
unrestrained nonsense they've unleashed on the web over the past two decades
has resulted in leaving nearly all internet users vulnerable to malicious
code, privacy loss, and other horrible things.

> some of them just got shot at yesterday

Nice appeal to emotion there. But how do you know what the people at YouTube
HQ did for a living? Do you have insider information? By all means, don't hold
back.

~~~
VikingCoder
Do you know anyone who works on ad-tech?

Or are you making uninformed hyperbolic statements?

> The unrestrained nonsense they've unleashed on the web over the past two
> decades

I feel sometimes like I'm the only one who remembers how awful, insulting, and
dangerous ads were before.

> But how do you know what the people at YouTube HQ did for a living? Do you
> have insider information? By all means, don't hold back.

I asked, "You want to have that conversation with them?"

I'll ask you the same thing.

You know some of them are here.

They're your peers.

But feel free to demonize them without talking to them.

~~~
AndrewUnmuted
I do know people who work in ad-tech. However, I don't see why that should
matter in the least. Who I know in the ad-tech industry is not pertinent to my
argument.

> how awful, insulting, and dangerous ads were before

You seem to be missing the point. Yes, advertisements are awful, insulting to
our intelligence, and dangerous psychologically. But that has nothing to do
with ad-tech. Your conflation of advertisement production with ad-tech is
especially curious, considering your strangely dogmatic defense of this
reprehensible industry.

It's even more curious that you didn't reference any of the things I actually
brought up in my post about the ad-tech industry. The content of an
advertisement is not nearly as important as the means by which people are
consuming the content. This is a fundamental premise of Marshall McLuhan, on
whose ideas much of the advertising industry has operated for decades now.
("The medium is the message.") The real problem with ad-tech is not the
advertising content that it helps propagate, but rather the techniques
employed by ad-tech to achieve its goals. Why don't you respond to those
problems I raised, rather than ones that are both wholly irrelevant to the
discussion and entirely absent from my comment? Ad-tech is what introduced
malicious code, unethical privacy breaches, and absurdly non-scientific
measurement practices to the advertisement industry as a whole. The ads
distributed on yesterday's television and radio broadcasts, and on
yesteryear's magazines and newspapers, could never have come close to the
destruction today's internet ads achieve - because those older media were not
capable of being leveraged as irresponsibly as ad-tech leverages the world
wide web.

~~~
VikingCoder
> However, I don't see why that should matter in the least. Who I know in the
> ad-tech industry is not pertinent to my argument.

Of course it should. You denied that any such people exist (people in ad-tech
who think about the users). I was hoping to demonstrate that the sample of
such people you know well enough to judge is too small to be meaningful. I say
that because I know many people in ad-tech, and almost all of them care very
much about users. Since our conclusions are different, I can only conclude
that our samples come from different populations, or that you're speaking in
hyperbole.

You said:

> The only people thinking about the users and the products are the creative
> teams that make the ads themselves. ... is inherently unconscionable.

If you know people in Ad-Tech, and you tell them, "You don't care about users,
and your business is inherently unconscionable" I'd like to hear what
responses you get. Genuinely. Would they agree with you? Or, more likely would
they say something like, "At my place, even when I think about the user, it
doesn't matter much," which seems much more likely the kind of human response
you'd get.

> It's even more curious

You opened with bald-faced hyperbole. I'm trying to get you to admit that your
most outrageous claims are based on nothing but lies. Once I do that, maybe
I'll dive into the rest of what you said.

My concession to you is that there are awful ways to do ads - exploitative,
manipulative, bad for the user. And there are ways to do ads that are not
those things. You claim there's no difference - it's all homogeneously bad.
That's hyperbole or delusion.

Moreover, none of it is particularly helpful. It's just spite. You don't have
any actionable proposals for making the world a better place. You're just
making "dead lawyer" jokes.

And I did respond directly, reminding you that ads used to be far worse. You
haven't responded to that at all.

------
robert_foss
As an engineer I have serious moral qualms about furthering the goals of any
military. Contributing means being complicit.

The fact that the military funds research doesn't change that.

~~~
criley2
As an engineer I have serious moral qualms about not furthering the goals of
our military.

After all, I would not be here, and my job would not exist, and the freedom I
have to do this work would not exist, if soldiers like my grandfather hadn't
stepped up to Hitler and Stalin and Mao and said "No, you're wrong, and we're
willing to use violence to protect our way of life". Just my 2c, but I
definitely respect the ultimate sacrifice that my grandparents' generation
gave to create this quite nifty Liberty bubble that these few generations get
to play in.

~~~
goatlover
They didn't step up to Stalin and Mao. Hitler was because of alliance with
Britain and the Japanese attack, not because of how awful the Nazis regime
was. The US for the most part didn't want to get dragged back into another
European conflict. Hitler was bad, yes. But what happened in Germany was
rooted in the outcome of WW1, which was rooted in all the alliances and
whatnot preceding it, which was rooted in yet prior conflicts. A lot of which
had to do with the colonial powers fighting over how to divide up the world,
or because of religious differences.

A pacifist might say Hitler, Stalin and Mao are just symptoms of the deeper
problem, which is the willingness of human beings to engage in violent
conflicts just because somebody tells them to.

~~~
ixtli
It's amazing the degree to which people are ignorant of what you've said here
and imagine the US was motivated by ideology.

~~~
fastball
The US _was_ motivated by ideology.

Why was the US an ally of GB/France/etc and not Germany/Italy? Contrary to
your belief, greed can be ideological -- striving for global free trade is an
example of this.

~~~
mijamo
Then why wasn't the US in the war from the beginning? Did it take 2 years to
discover that Hitler was bad?

~~~
Anon1096
The US was always on the allied side, through funding and supplies. The whole
reason the Lusitania was sunk was because the US was sending arms to support
the Allied war effort, even if it wasn't officially declared. Even if it
wasn't until Pearl Harbor that the US joined the war officially, there is
little doubt what side Roosevelt and much of the American people wanted to
win.

~~~
scarmig
...the Lusitania was two decades before Hitler even came to power.

~~~
Anon1096
You're right. I messed up WWI vs WWII for a second there. The rest of the
comment stands, though: the US provided major aid to the Allied powers.

------
vowelless
On one hand, I reluctantly appreciate Trump's election because it will force
Silicon Valley to think many times before readily giving up user privacy to
the US government. I felt like the attitude was very lax under Obama (despite
the Snowden revelations).

On the other hand, this bothers me a bit because it continues to allow people
in the valley to maintain a (sorry to use this word) delusion that what they
are doing is "moral". If Google stops working with the Pentagon after this
petition, people in the valley will pat their backs and enjoy how they are
making the world a better place. They will not have any incentive to rethink
the sale of user data to advertisers, creating highly addictive mentally
harmful products, etc.

Overall, it's good that at least people in the valley are somewhat mindful of
their actions and care about society (compared to my current industry,
finance).
I hope they can be successful at a deeper level.

~~~
IAmEveryone
> allow people in the valley to maintain a (sorry to use this word) delusion
> that what they are doing is "moral"

People refusing to work on military technology isn't "appearing moral". It's
just "moral", at least if you consider war to be generally amoral.

People here and elsewhere have been throwing the term "virtue signalling"
around a lot in the last year or so. This is a perfect counterexample: Google
employees speaking out against military tech are shouldering the risk of
appearing disloyal to their employer. If Google were to cut their military
ties, they would forgo whatever benefit they previously sought in that
relationship.

~~~
adsgjlnionio
Do you think the US should stop developing military technology? I doubt it,
and I doubt many Google employees do, either. I doubt they would be happy to
live in a world where the U.S. has a weaker military than countries like China
or Russia. I also doubt they would approve of throwing out our advanced
targeting technology in favor of Vietnam-style carpet bombing.

They are _uncomfortable_ with killing people even though a technologically-
advanced military is probably a good thing. They are _comfortable_ with
tracking and manipulating millions of people even though it's definitely a bad
thing. This is squeamishness, not morality.

~~~
teen
.

~~~
sureaboutthis
If that were ever to happen, we wouldn't be talking about this at all. Reality
check!

------
jcwayne
Serious question for those who agree with this petition. Which country's
military would you like to see be the most technologically advanced in 20
years?

~~~
IAmEveryone
I'd vote for Belgium.

More seriously: the US has nuclear weapons, rendering it safe from any
direct confrontation with enemies challenging it technologically.

They have a conventional military 10x the size needed for the sort of
intervention that actually succeeded in the past (the Balkans, the first Gulf
war). The rather unsuccessful wars (Afghanistan, Iraq) were lost not for
technological reasons or any other dimension of military might, but because
the US military sucks at convincing people of the benefits of democracy.

Want to win the next ten wars? Put the money into Hollywood blockbusters and
the world's best Halal fast food chain.

~~~
friedman23
>I'd vote for Belgium.

I always laugh when people suggest these meek tiny European states should be
in charge of running the world and how much more civilized they are. The
reality is that these countries aren't meek by choice and we already know from
history what a country like Belgium would do if given a tiny sliver of power.

~~~
givehimagun
I would focus on the word tiny in your statement. America has to solve
problems at scale. California has 3 million more people than Canada.

America is a federation of 50 states. The EU is trying to become a federation
of the European states.

------
linuxftw
> Google Workers Urge C.E.O. To Pull Out of Pentagon A.I. Project

That moment when you realize your mega-corp employer's "culture" is just
another tool to get you to do their bidding.

How do the shareholders feel? I'm sure they're betting that almost every
employee will stay no matter how their technology is used.

~~~
ocdtrekkie
Indeed, they know few to none of these petitioners will quit. 3,000 petition
signatures will do nothing. 3,000 resignations would, but Googlers aren't
going to pass up on the high paychecks, three five-star meals a day, and
massages in the office based on moral grounds.

(In the same vein, Google's NetPAC has been quietly donating the max to
Republicans like Paul Ryan and Steve Scalise, while most of their engineers
openly oppose them.)

~~~
magneticnorth
I know at least one Googler who really will quit if Google continues to
contribute to drone warfare research.

Yes, they like the high paychecks, five-star meals and massages, but the
market for ML and AI experts is very strong right now. The high paycheck can
be matched elsewhere, and the benefits of not contributing to drone warfare
research are worth spending a bit of that money on food and massages.

But as I said, I only know of one for sure. Whether or not there are enough to
sway Google is yet to be seen.

~~~
stevehawk
Since Google isn't going to stop this work, I'd love to hear when this person
actually leaves.

~~~
geodel
I would be more interested if 10% of the petitioners leave. A single
accomplished person can easily move on while defending their moral position.

------
nyxtom
I for one would rather have a drone strike program that actively avoids
civilian casualties as much as possible. There are certainly gray areas here
as far as the use of that program from a political standpoint (whether the
strikes are warranted or whether they are part of regime changes). On the
other hand, by not helping the military become more efficient, we also risk
losing existing lives (our own and civilian casualties) due to a lack of
efficient analysis. We already use statistical analysis and many other methods
(human and otherwise) to determine where to make military strikes, so we might
as well improve on this to make fewer mistakes where possible (as gray as that
may seem).

~~~
titzer
I am not able to understand because I'm stuck at "the government uses
autonomous robots to kill people extra-judicially with missiles from the sky
and nobody seems to care."

And you want to make the program "more efficient."

This is the gulf which I cannot seem to bridge.

Extra-judicial killings of "terrorists" in "bad places" using flying killer
robots is just batshit insane to me. I cannot for the life of me understand
how that conversation would go with, e.g. Thomas Jefferson or Alexander
Hamilton. Or really anyone who believes in human rights.

~~~
wilsonnb
Why do you refer to them as extra-judicial killings?

~~~
ocdtrekkie
A non-military entity (the CIA), which should not be operating military
hardware (but does), assassinates targets without due process (or really, any
hard evidence at all), in countries we aren't at war with (formally or
otherwise). Depending on our relationship with the country in question, we may
or may not bother to let that country know we're going to kill some of their
citizens ahead of time.

~~~
wilsonnb
Putting aside moral issues with the CIA, what's the legal reason that they
should not be operating military hardware?

As far as I know, CIA assassinations of foreign citizens aren't illegal as
long as they take place outside of the US.

There was at least one instance of a drone strike killing one or two American
citizens which is definitely extra-judicial. Most drone strikes do not kill
American citizens so referring to drone strikes in general as extra-judicial
doesn't seem accurate to me.

I would like to clarify that I don't necessarily think that the US's use of
drone strikes is morally correct. I just think it's not correct to refer to it
as extra-judicial.

~~~
ocdtrekkie
It certainly is illegal in the country in which they do it. As far as I know,
pretty much all foreign intelligence work is technically illegal where it's
conducted.

~~~
wilsonnb
That is technically true but I do not think that's what the original poster
meant by "extra-judicial".

------
ChuckMcM
While I appreciate the engineers speaking out, it isn't really practical for a
company the size of Google, with the resources it has, to _not_ have programs
that work with the military in one form or another.

If, as an engineer, it is against your moral code to do any work that supports
the military, your choices are limited to working in small companies where
everyone is focused on the commercial products and services you are
delivering. And even then, as some games companies found out, the CEO might do
some collaborative work with the military for training or something.

It should come as no surprise that Google teams up with the Federal government
on things.

~~~
Eridrus
> While I appreciate the engineers speaking out, it isn't really practical for
> a company the size of Google, with the resources it has, to not have
> programs that work with the military in one form or another.

This is nonsense, Google has plenty of revenue and other non-defense avenues
for growth.

Companies are not beholden to chase every last dollar, especially if it could
alienate their workforce.

I don't have the same anti-defense leanings as others around here, but to say
Google has no choice is ridiculous.

Google probably couldn't (and shouldn't IMO) stop the DoD using Android phones
if they want to, but they can certainly refuse to work with the defense
sector.

~~~
s3r3nity
I agree with you somewhat, but I imagine the shareholders would revolt against
upper management over the significant loss in revenue + the inevitable media
backlash ("Google hates the US government!" etc.)

As a comparison, imagine if Tim Cook took a stronger privacy stance in China
than we have in the States on moral grounds, and the result was China banning
all Apple sales there. He would be tossed immediately, even though most of the
comments here would be in support of the ethical position.

~~~
icebraining
1) The shareholders are part of Google.

2) A very small group of people control all the votes in Google, thanks to
different classes of shares. While this is not foolproof, courts rarely
overrule decisions.

3) You can incur a loss of revenue too if a not-insignificant part of your
workforce is resentful.

------
CharlesMerriam2
Business Ethics is the art of finding a company whose ethics are close enough
to your own.

Asking the business to reconsider before leaving is legitimate.

------
mtgx
This has been by far one of the most shameful things Google has done.

And their excuse that "they're only analyzing images" is a joke. Analyzing the
images and "identifying objects" (their words) is probably 95-99% of the job
in a drone strike. So if they're trying to make people think that their role
is minimal in drone strikes, they're failing hard at that.

I also think Eric Schmidt, who until last December was both Alphabet Chairman
_and_ working for the Pentagon, played a big role in this. Now he's still a
"technical advisor" but if he continues working for the Pentagon I'd prefer he
has no official affiliation with Alphabet.

It seems Schmidt would like Google/Alphabet and Pentagon to have a much deeper
relationship, if you can read between the lines in this post:

[http://www.defenseone.com/technology/2017/01/pentagon-
needs-...](http://www.defenseone.com/technology/2017/01/pentagon-needs-its-
own-google-all-its-data-says-eric-schmidt/134456/)

~~~
SmirkingRevenge
So is it simply per se shameful to develop technology for the military, or is
there a more nuanced moral argument that this sentiment comes from?

Or is there something specific to Google's Maven project that makes it
shameful, while other types of military commissioned projects might not be?

~~~
waisbrot
Slightly more nuanced. I think there is a clear difference between, say,
selling food to the military (the service you provide generally allows the
military to run better and the military is frequently responsible for killing
people) versus selling bullets to the military (the service you provide allows
the military to kill people).

Weapons-targeting systems clearly fall into the latter category. I've worked
on military-funded projects for machine translation and I don't feel guilty
about that. I would not work on projects (military-funded or otherwise) that
were used for targeting. I think that my position is internally-consistent.

~~~
SmirkingRevenge
Fair points. I think I agree with you that there is a distinction to be made
between say, selling food and making bullets - the latter being a direct tool
of violence.

It's perfectly reasonable for a person to decide they would rather not spend
their short time on earth developing such tools (especially if it would be
easy for bad actors to abuse them). I suppose that is the objection many
Googlers are making.

------
larkeith
> In an interview in November, Mr. Schmidt acknowledged “a general concern in
> the tech community of somehow the military-industrial complex using their
> stuff to kill people incorrectly, if you will.” He said he served on the
> board in part “to at least allow for communications to occur” and suggested
> that the military would “use this technology to help keep the country safe.”

I have a sneaking suspicion that Mr. Schmidt's views on what constitutes
"killing people incorrectly" and what is required to "keep this country safe"
both differ fairly significantly from those of the signatories of the letter.

~~~
samirillian
That is some menacing language. What exactly is the right way to kill a
person?

~~~
edanm
In self defense? In defense of others?

Those are considered cases where killing someone is legitimate by almost
everyone.

~~~
samirillian
America defends itself and others with Predators. Power loves rubbing your
face in these kinds of absurdities.

------
jcoffland
I am against war and violence, but even a cursory review of the last 10k years
of human history makes it abundantly clear that the threat Iran, North Korea or
Russia pose to the Western World is real. I hate many of the things the US has
done but I'm glad the world is not controlled by even worse actors. The sole
reason for US and UN dominance is military power. If we ignore this the West
will be overrun.

------
1ec32c71876c47
If you're an engineer working for Google, I think you can most likely afford
to be an idealist, and in my opinion that's a much better option than just
blindly following orders.

Don't most of us want less war? Does it really _drive you_ to enhance the
capabilities of those committing acts of violence? How does one start to
believe that by supporting the military you are somehow working towards a
better world?

If you're not going to be an idealist, then in all likelihood you're helping
to work towards the vision of an idealist with more power than you.

~~~
olleromam91
I think the problem can stem from the urge for everyone to try and find the
exact same ideal world. It's so obviously clear that this is impossible, and
so obvious that there is no civil way for anyone to enforce an ideology across
a massive population. IMO, the best hope is for each man to pursue their ideal
world for themselves. What else can we really control?

------
ehudla
The famous public letter from Norbert Wiener declaring he will not work for
the military: [http://lanl-the-back-
story.blogspot.com/2013/08/a-scientist-...](http://lanl-the-back-
story.blogspot.com/2013/08/a-scientist-rebels.html)

------
jryan49
Maybe if they were more accurate they would kill fewer people?

~~~
tclancy
Maybe if they're more accurate the marginal cost of killing people in other
parts of the world will go down even more so we'll be even less likely to
reflect on whether the fact we can easily use robots to kill people on the
other side of the world who have no way of responding symmetrically might come
back to bite us in the ass?

~~~
airstrike
Nobody cares how many dimes those drones cost. War isn’t fought using the
cheapest means available, but the most effective. Have you seen the size of
the U.S. military budget?

This country will continue to meddle in international affairs and, on
occasion, get involved in bloody battles. Such is life. War is an integral
part of our civilization and will continue to be so for the foreseeable
future. You can’t just wish it away.

~~~
davesque
Wrong attitude. It seems that, at this time in history, we have more tools at
our disposal than ever before to avoid war. We have more capacity to organize
and communicate now. Why should we continue to just assume that war is
inevitable? I'm sure people had the same attitude toward the black plague in
the middle ages. They probably thought "Oh well! All this bad air and holy
wrath is inevitable. No way to avoid the plague!"

~~~
MikkoFinell
You're right about everything. But consider ISIS: what kind of communication
should we perform that will make future groups like that stop beheading people
and raping children? There are situations where violence unfortunately is the
only option left, and in that case you will want it to be as targeted and
clean as possible to avoid collateral damage. Currently, drone strikes are the
best tech for that purpose, and we have an incentive to make it better for the
reasons previously mentioned.

~~~
opportune
If the US only were using drones against ISIS then that would be a different
story. The fact of the matter is that the US uses drones on much less morally
cut-and-dry groups than ISIS, and will likely continue to do so in the future.
It's like if a farmer wanted you to make them a shotgun so they could kill the
wolves terrorizing their flock but you knew they were also going to use that
shotgun to rob their neighbor for oil.

~~~
olleromam91
I see your analogy. But it's not totally perfect, because in a (hopefully
functioning) democracy, the farmer can be told what to do by his constituents.
Obviously this doesn't always happen, but do we trust the entity we voted for
(USA.gov), or the one that makes money off our existence (google.com)?

I don't always know.

------
ohiovr
This is not the same kind of technology as better materials science for gun
barrels, or hypergolic fuels for ICBM maneuvering thrusters. This is about
software that constitutes a decision-making process. Who dies is not decided
by you or a commander, but by an automatic software process. Debugging the
code means innocent people are probably going to die along the way.

------
hedgenetFT
Just as with a nuclear arsenal, what keeps the peace is a balance of power
between nations. If one nation has access to a military technology that
allows it to destroy its enemies and steal their resources without fear of
retaliation, it WILL use it.

Our well-meaning Google friends might believe they are doing the right thing,
but they are just doing what's fashionable instead of really thinking through
the consequences of these actions. Making ourselves weaker in the AI race
increases the probability of war instead of diminishing it.

------
timsim09
I support the Google workers. The hubris of the US military is unprecedented.
Our military is already trying to pick fights, mostly for no reason. The more
the cost of war drops for them (fewer personnel killed), the more devastation
they will create; look at the Middle East since the Iraq invasion.

~~~
gebeeson
The US Military has zero hubris in and of itself. The 'US Military' has so
many moving parts and sub-cultures, not including all the support systems,
contractors and so on. On a unit-by-unit basis, it is just a job for many,
many people. You might be talking about the big-picture 'US Military' - it
doesn't wield itself either.

------
wackspurt
Relevant here: "Why don't I take military funding?" by Prof. Benjamin Kuipers.

[https://web.eecs.umich.edu/~kuipers/opinions/no-military-
fun...](https://web.eecs.umich.edu/~kuipers/opinions/no-military-funding.html)

From the introduction: """ Mostly it's a testimony that it's possible to have
a successful career in computer science without taking military funding. My
position has its roots in the Vietnam War, when I was a conscientious
objector, did alternative service instead of submitting to the draft, and
joined the Society of Friends (Quakers). During the 1980s and 90s, the
position seemed to lose some of its urgency, so it became more of a testimony
about career paths.

Since September 11, 2001, all the urgency is back. The defense of our country
is at stake, so this testimony becomes critical. In short, I believe that non-
violent methods of conflict resolution provide the only methods for protecting
our country against the deadly threats we face in the long run. Military
action, with its inevitable consequences to civilian populations, creates and
fuels deadly threats, and therefore increases the danger that our country
faces. """

------
rasz
I'm confused: do people working at Google not know the origin of their company?
There would be no Google if not for Pentagon funded military research.

Look up "Highlands Forum", DLI, MDDS. Sergey Brin was literally funded by NSA
and CIA under MDDS.

~~~
dredmorbius
Sometimes it's knowing your origins intimately which gives the strongest cause
to oppose them. There's the case of Robert Oppenheimer, father, and opponent,
of the atomic bomb.

[http://nuclearfiles.org/menu/key-issues/nuclear-
weapons/hist...](http://nuclearfiles.org/menu/key-issues/nuclear-
weapons/history/cold-war/oppenheimer-affair/oppenheimer-affair-intro.htm)

From the Soviet side, Andrei Sakharov.

[https://en.wikipedia.org/wiki/Andrei_Sakharov](https://en.wikipedia.org/wiki/Andrei_Sakharov)

------
bigjimslade
Thank you, Google employees who signed this letter! Let us hope the, ummm,
yea, "do no evil" company listens to you (and to its very own motto).

------
cjhanks
So it is ethical to track hundreds of millions of people to manipulate them
for profit. But unethical to defend power plants in the United States?

~~~
mashedvikings
It's also unethical to solve world hunger by feeding the poor to the hungry.
If defending is done not by installing firewalls, but by spying on everyone in
case they do something bad, you have a huge problem.

------
KarishmaKapoor
I have been encouraging my friends and family to discourage their kids from
working for US Army, Pentagon and similar organisations.

For a country like USA it is stupid to waste human capital on fighting someone
else's wars.

------
marricks
I think it's very important to consider the increase in military spending over
the past decade and decrease in funding for universities.

More and more funding in the military means more funding for specifically
defense projects rather than straight up knowledge or public good.

What is the long-term effect of this? Perhaps research far more focused on
destruction instead of the public good. More and better drones, less general
knowledge, fewer cures for diseases?

It's a sad state of affairs, and workers standing up against military research
within their companies is a good first step.

---

One of the first sources I found on this is below, but I've specifically
heard about it being vexing from AI researchers, as many don't want to
directly support military applications but don't have many opportunities for
funding otherwise.

[https://www.bu.edu/research/articles/funding-for-
scientific-...](https://www.bu.edu/research/articles/funding-for-scientific-
research/)

------
yuhong
My favorite:

[https://www.nytimes.com/2017/08/30/us/politics/eric-
schmidt-...](https://www.nytimes.com/2017/08/30/us/politics/eric-schmidt-
google-new-america.html)

I wonder if Eric Schmidt left Google because of this.

~~~
mashedvikings
Not sure if the article claims Eric Schmidt was ousted. But Schmidt has worked
for the Pentagon since 2016: [http://money.cnn.com/2016/03/02/technology/eric-
schmidt-pent...](http://money.cnn.com/2016/03/02/technology/eric-schmidt-
pentagon/index.html)

~~~
yuhong
He left quite a bit after the article was posted. I was wondering
whether this article helped.

------
booblik
So it says it will improve drone strike accuracy. Isn't that a good thing? I
mean, there are currently a lot of civilian casualties from drone strikes, but
that doesn't stop anyone from using them anyway. Wouldn't at least improving
accuracy make things better in some regard?

~~~
kiliantics
Replace "drone strike accuracy" with something like "Zyklon B efficacy".

~~~
drak0n1c
Improving the accuracy of munitions targeting enemy military forces hiding in
a building that neighbors civilian buildings/shields is not the same as
genocide.

------
sqdbps
"As Google defends its contracts from internal dissent, its competitors have
not been shy about publicizing their own work on defense projects. Amazon
touts its image recognition work with the Department of Defense, and Microsoft
has promoted the fact that its cloud technology won a contract to handle
classified information for every branch of the military and defense agencies."

Google should stop hiring activists and start hiring pragmatists.

~~~
tzakrajs
"Google won't take a defense contract and has shed defense projects in the
past" was a common theme at Google. Recruiters and employees actively
advertised this as a feature of Google's not being evil.

~~~
sqdbps
They should stop doing that.

I mean, who's stopping China from using AI-powered military robots? They
should think about the bigger picture.

Also, I think they should invoke AI-powered military robots in recruiting;
what's cooler than that?!

~~~
wz1000
"Don't say that he's hypocritical

Rather say that he's apolitical

Once the rockets are up, who cares where they come down?

That's not my department, says Wernher von Braun"

[https://www.youtube.com/watch?v=QEJ9HrZq7Ro](https://www.youtube.com/watch?v=QEJ9HrZq7Ro)

------
pnw_hazor
Working with China and other despotic regimes is cool, though?

~~~
ionised
What military technology is Google currently developing for the Chinese
military?

~~~
pnw_hazor
Everything in China is for the Chinese military.

------
dqpb
Are the Google employees working on this project all American? Google's
workforce is heavily multinational, but I would imagine a Pentagon project
would require some kind of security clearance?

~~~
AndyNemmity
I think you'd be surprised; I've worked on a few government security-clearance
roles where 80-90% of the below-management staff were Indian.

------
tempz
Better headline:

"Google Workers Astonished: Found Out They Work In Capitalist System, Not Nerd
Commune"

------
jay-anderson
> several Google employees familiar with the letter would speak of it only on
> the condition of anonymity, saying they were concerned about retaliation

It's too bad that the fear of retaliation exists for a letter like this. Is
this actually the case? Or did the author incorrectly read into their wanting
to remain anonymous? Perhaps they didn't want to become spokesmen for this
issue outside the company, but are fine making their opinion known to
coworkers and managers.

------
AndrewKemendo
Oh boy. Being on the inside of Maven, I can't tell you how confused these
Googlers are.

~~~
otterley
If you're not going to explain for the rest of us, what was the point of this
comment?

~~~
AndrewKemendo
To communicate that the people on the inside of Google who are making these
protestations are grossly uninformed about the Google-Maven relationship.

Choose to believe me or not. My background is all out there for people to see.

~~~
Darthy
When it comes to life-or-death questions, we should make informed decisions.
"Just trust me" fundamentally lacks any valuable information that furthers the
discussion. In addition, "just trust me" also risks swaying less clear-thinking
people based on emotion instead of well-founded cold hard facts.

~~~
AndrewKemendo
The reality of Defense/Intelligence is that you do have to just trust some of
us.

~~~
notburnt
The reality of being a citizen of the US is that the intel community doesn't
make it easy to do so.

------
skyland
Everyone needs to fear the military-industrial complex. It's bad for every
business that is not in the business of war and killing. I'm sorry, but today
is unlike anything we have ever witnessed. While the military-industrial
complex was one thing in 1950, today it's a totally different beast, and our
society can't take much more beast.

~~~
skyland
We should all be in the business of peace and prosperity, not war and
destruction. If we aim to build a better world that is. What will it take, my
fellow geniuses, to build a peaceful world?

------
mrybczyn
Google+Alphabet yearly revenues are in the same ballpark as South Africa's
GDP.

Google/Alphabet is already a country in its own right, and libertarians would
already be sceptical of Google's role in society.

This is just another step down a long road of Technocracy...

Also, wasn't "Don't be evil" dropped as a motto? I thought I read that.

~~~
airstrike
Countries aren’t defined by the size of their economy but, at the very least,
the monopoly of violence. Google doesn’t hold such a monopoly and likely never
will...

~~~
jadedhacker
Power is more complex than the direct application of violence. Corporate
relationships with governments can induce, and often have induced, violent
reactions against people harming a company's interests.

~~~
airstrike
That does not a country make.

~~~
jadedhacker
Which do you care more about? Power and who has it or the label "country"?

------
ThomPete
Their entire career choice and their current sky-high yearly salaries, plus
the stock price, are based on what was fundamentally a military need, i.e. the
internet (or the TCP/IP protocol, to be exact).

The right course of action IMO would be to leave Google if you are unsatisfied
with working for the Pentagon.

But Google HAS to work for the Pentagon, and especially on AI, because the
knowledge that comes out of this isn't only going to be military but, to a
much higher degree and more relevant extent, for civilian use.

Technology is always a double-edged sword and we have to learn to deal with
the powers it gives us. But not being at the forefront of technology when you
can is a loser's game, and Google would lose much of its value if it isn't at
the forefront.

------
nell
This is a tricky problem and there is no easy position to take. On one hand
you don’t want anyone to have such lethal power. You could argue the atomic
bomb is the warning we should learn from.

On the other hand, this needs to be evaluated in the context of “balance of
power”. Can Alibaba really refuse to help the Chinese military? If not, the
Chinese are going to gain the power regardless of whether the US has it or not. Any
advanced country could gain such power.

These other entities may not follow the same ethical guidelines. Are we then
proposing the US military should be less powerful than its rivals?

------
mistahchris
> There’s a strong libertarian ethos among tech folks...

This does not ring true in my experience with "tech folks" in recent times. It
may have been true at one time, but I think libertarianism in tech has fallen
out of vogue.

Also, I take issue with the article's suggestion that distrusting how
government (or any powerful organization) makes use of technology puts someone
into the libertarian political philosophy. A healthy distrust of how
powerful groups use technology makes a lot of sense for anyone and is
definitely not something exclusive to libertarians.

------
mgpetkov
I want to remind those Google employees that they are able to work at Google
and enjoy a very high living standard and, obviously, freedom of speech just
because they are protected by the US Army.

------
75dvtwin
>"...The letter, which is circulating inside Google and has garnered more than
3,100 signatures, reflects a culture clash between Silicon Valley and the
federal government..."

If Google still continues to engage in the 'Business of war', would these
3,100 employees walk out?

Would they, then, also refuse to pay taxes to the US Fed Gov (which obviously
spends them on the business of war)?

------
daodedickinson
Seems like the two most likely consequences would be less accurate drone
strikes and a higher risk of further expansion of the territory controlled by
Communist Chinese hegemony. I don't think these Google employees want that so
I'd like to see their thoughts on why they think the US military having worse
AI would bring about a better universe.

------
myrandomcomment
Totally agree! We should stop trying to make a weapon more accurate and go
back to firebombing entire cities. Only way to be sure.

------
Jd
To me this illustrates a huge opportunity: big tech can't effectively engage
with public good projects (e.g. typical government territory). What's needed
is a tech gov superlayer that provides an umbrella for these services and can
also guarantee their internal integrity. The lack of guarantee over this last
point is one major problem here.

------
pmarreck
But wouldn’t drone strike accuracy improvement actually reduce civilian
casualties?

Given that these strikes will occur one way or the other, isn't this stance
actually adding more misery to the world?

(So do drone strikes in general. Much better to negotiate at a table with a
mediator or use a sporting event to resolve disputes, but I don't currently
run things here.)

~~~
titzer
> Given that these strikes will occur one way or the other,

Holy poopers, no. Drone strikes are not like premarital sex. The fact that
we're _here_ (that being autonomous killer robots firing missiles from the sky
to kill "bad guys") and no one bats an eyelash, means we're already in some
kind of massive fail scenario.

We could, for example, go back to the rule of law. You know, before we
believed that every bad guy out there was going to end civilization and
deprive us of our "freedom." Before we decided blowing up wedding parties in
Pakistan was such a great use of our time and money, and every Allahu-Akbar-
kaboom moment is a pervasive existential threat.

~~~
pmarreck
I kind of agree with you. This is dystopian at this point. And I don't see how
permitting this sort of thing wouldn't have massive consequences down the
road, but then again I'm the guy who thinks intense negotiation at a table
with a mediator, and/or deciding a dispute with some sort of sporting event,
is a superior solution to simply butchering people with explosives en masse.

~~~
jackaroe78
When I was little I asked my dad why people couldn't settle things with a
snowball fight or something instead of killing each other in wars, he said
because there's always somebody out there that will put a rock in the
snowball. That's always stuck with me.

~~~
pmarreck
They called those "loaded" snowballs

------
jwillmer
As someone who is hesitant to use (free) cloud services where you pay with
your information, I really like that there are some thoughtful and aware
people working at Google. If it turns out that they have the power to affect
decisions like this in their company, I might choose to use more Google
products.

------
HugoDaniel
Apple recently hired the Google A.I. Chief[1]. I wonder how much this is
related.

[1] [https://www.nytimes.com/2018/04/03/business/apple-hires-
goog...](https://www.nytimes.com/2018/04/03/business/apple-hires-googles-ai-
chief.html)

------
sabmalik
I keep seeing the argument "More accurate weapons would kill fewer people". By
that logic, it would be perfectly okay for Google to sell this technology to
the enemy states so those guys could kill fewer Americans. This sounds
ridiculous to me but it's likely I am missing something.

------
Dowwie
My condolences to the leaders who organized this. They'll find in their
management review that, because a fly farted in their work area, they haven't
achieved performance on par with their peers. At the next round of layoffs,
they'll be released.

------
oldpond
So 3100 out of 57000 employees signed the letter. That makes google only 95%
evil.

------
GiorgioG
Google is a huge company, turning down big money from the US Gov't for legal
purposes is probably not a realistic option (at what point would shareholders
sue if they turned down hundreds of millions of dollars in revenue?)

~~~
IAmEveryone
They can sue all they want. But they'll lose.

A petition by 3000 employees (or, to be honest, 30) is enough to claim that it
was in the company's interest to forgo this revenue to improve their standing
among potential hires. Remember that tech companies live and die by the
quality of their workforce.

Moreover, the idea that companies always have to act according to strict
profit motivations is a myth. Just look at the billions corporations spend
each year on charitable causes.

Yes, you could claim that these donations are done for PR purposes. But that
would render your argument meaningless, because I can reframe any such
altruistic actions in terms of PR. I have also personally witnessed many
decisions that cost money but served some greater good and were never
publicised.

~~~
AndyNemmity
It isn't a myth that companies have to act according to strict profit
motivations. It's the law.

The example you used, charitable causes, falls directly under marketing.

~~~
IAmEveryone
It's a law in Delaware only, and it is almost never litigated. You haven't
addressed my point that any altruistic motive can be reframed as marketing,
and that therefore the law is largely meaningless.

------
mattigames
I wrote a little article related to this matter; I submitted it to HN on:
[https://news.ycombinator.com/item?id=16760768](https://news.ycombinator.com/item?id=16760768)

------
kiliantics
And this is why tech workers need to unionise. This petition is a weak
statement that probably will have no effect. Unless these workers could pose a
real threat to Google by threatening a strike action, Google has no need to
change anything.

Furthermore, if the workers really stood by their moral convictions, they'd
use their collective power to tackle the issue more directly than by just
appealing to Google. Companies like Google are like the railroads of the 19th
century. They comprise the major infrastructure of modern society. A union at
Google could threaten to effectively halt all institutional operations of
governments or other companies in order to influence their actions. Is
Facebook manipulating voters? Okay, no more Gmail for Facebook till they stop.
Is the government going to bomb Yemen? Alright, we're shutting down government
accounts.

As others in the thread have also said, I'm doubtful many of these signatories
will quit their cozy jobs if Google doesn't back down. Without the group
solidarity and pressure from a union, most individual workers just don't have
any good incentive to put their money where their mouth is.

~~~
chrisseaton
> A union at Google could threaten to effectively halt all institutional
> operations of governments or other companies in order to influence their
> actions.

But this seems an immoral thing to do, to me! A small group of unelected,
unaccountable and non-diverse people abusing their relative power to try to
force a democratically elected government or private company to do something?
Why would anyone want that?

~~~
kiliantics
Currently, Google and other large companies already do have power and
influence over what the government does. But only a tiny portion of people at
these companies are the ones deciding what to do with that power. It would be
preferable if the entire company took part in deciding how it would influence
what the government does. It would be even more preferable if workers across
whole industries and whole sections of society together used their collective
power to influence what the government can do. Ideally, these large federated
unions would also act with the interests of the
unemployed/disabled/disenfranchised in mind. This is what the IWW does. Tech
workers should join the IWW.

~~~
chrisseaton
> It would be even more preferable if workers across whole industries and
> whole sections of society together used their collective power to influence
> what the government can do.

Everyone in society is already part of one big union that can influence
government - the electorate. Everyone gets one vote. If you try to force the
government to do something it wouldn't do otherwise you're trying to get power
beyond your one vote. That's why it's an abuse.

------
pawal
Anybody looking forward to the captchas asking us to identify drone targets?

------
wintom
During World War 2, fan-favorite companies like Mercedes-Benz were building
tanks for the Nazis. If there is a war today, don't you think the government
will seize everything and we will all be working for them? This is different
in some ways because Google is likely being paid for this but I don’t think
they can just refuse nor do I think they should be able to, frankly. That’s
not how national security works.

And just because employees at Google refuse, employees in one of the big
Chinese firms won’t be able to. If our government does not have this tech then
someone else will.

We live in this fantasy, like people aren’t dying everyday from war in so many
places around the world. We are not immune to that.

Edit: to all the folks down voting me it’s a good idea to get a different
perspective sometimes.

~~~
alanh
> if there is a war today

Wow, we really are insulated from our military today, aren’t we? Yes, the US
government is still at war in Afghanistan and with ISIS.

------
oh_sigh
From what I can tell, there are ~90k Google employees[1], so about 3% of them
signed this. I feel like I could find 3,000 people out of almost 100k who
believe in almost anything, especially if it is generically anti-military or
anti-USG.

I doubt this would happen, but I wish these online petitions would include
counter-petitions, which people could sign if they _don't_ agree with the
petition, so we could get an estimate of whether this is just a very vocal
minority of Googlers or whether viewpoints are split.

[1] [https://www.statista.com/statistics/273744/number-of-full-
ti...](https://www.statista.com/statistics/273744/number-of-full-time-google-
employees/)

------
no_wizard
The fact remains, regardless of historical anecdotes or past advances, that in
today's economic landscape one of the few steadily profitable and maintainable
businesses is doing work for the Department of Defense. Regardless of who is
president, this is an organization that regularly gets well over $200 billion
in funding, and much of what it does has little oversight. I've seen
businesses, large and small, that needed capital but couldn't raise it land a
long-term DoD contract, pivot to a well-funded DoD project, and then pour
those profits into their other businesses successfully.

It's a sad state really, but this is probably the main idea behind partnering
with them in the first place.

------
HacklesRaised
Well, this is it, n'est-ce pas? Is Google really about "do no evil", or will
capitalism melt the snowflakes? Either way, Google's fate is of its own making.

------
JabavuAdams
I wonder what Geoff Hinton's position on this is. He seems somewhat anti-
military from some side comments in his Neural Networks course.

~~~
YetAnotherNick
What does anti-military even mean? I can understand anti-war, but don't you
want the government to have the power to stop terrorists? A military is needed
for that.

~~~
geodel
Well, it would mean we can respectfully disagree with terrorists, ask them not
to blow things up, and engage in bilateral or multilateral talks with the
governments they have issues with.

------
eloop
These employees are right to be worried. If Google becomes a defense
contractor I'll be going out of my way to stop using their products. I'm not a
pacifist, but I'll always choose to work for and patronize peaceful
enterprises, it's as simple as that. I also influence lots of non-technical
friends and family in their IT purchases and habits and I'll definitely be
warning them off as well. So Google, be less greedy and make the world a
better place for us all.

------
Animats
The future of border security could be drones flying along the Mexican border,
taking out intruders with precisely aimed headshots.

------
palisade
Not helping them improve the precision of drone strikes reduces the precision
of those strikes, which leads to more innocent lives lost as collateral. An
argument could be made that using AI to improve surgical strike capability of
drones would be the lesser of the two evils. Seems misguided, no pun intended.

Also, Google dropped the slogan "Don't be evil" in favor of "Do the right
thing." Not that it matters because they barely followed it anyway.

~~~
titzer
I hate this line of reasoning, because in the end I could suggest that the
government install remote kill switches in every single human being's brain
and be able to precisely kill anyone with zero collateral damage at any point
in time. I mean, why would we settle for anything less but absolute
perfection? Let's get to it!

------
ElijahLynn
If I worked for Google and found out I was working for the Pentagon, my line
would be crossed and I would resign.

------
thetruthseeker1
I also think that disengaging is the wrong approach. Google once disengaged
from China because they didn't agree with China on some freedom-of-expression
laws, and management now thinks that was a bad decision. In a similar vein, if
you are not part of it, somebody else will shape the outcome of this. It
should rather be you, if you really care and could play a role in minimizing
the negative impact of such tech.

~~~
ForHackernews
This sounds like some next-level rationalization to me. You can justify
anything by theorizing that somebody else will do the same-thing-but-even-
worse:

"If we don't employ child labour, somebody else will! It should be us, so we
can at least make sure our child-slaves get fed once a day!"

The fact of the matter is that Google has rare talent in AI. There _isn't_
another company with their kind of expertise waiting in line behind them, and
even if there were, sometimes you should keep your hands clean.

~~~
thetruthseeker1
Your comparison is apples to oranges. You can't use child labor because there
are laws in the US to prevent it. If you lobbied to pass a law preventing AI
for defense purposes, then that would be similar. Here the US govt is a
client; if not Google, they will go to some other company for that tech, as
long as there is no law that prevents it.

Also, if they don't go to Google, sure, they may not see as rapid a growth in
AI tech, but if you think they will never get anywhere significant, you, sir,
are underestimating others, IMO.

------
downrightmike
AI is needed; the sooner we can create it to hold consciousness, the better.
We're constantly changing as a species, and what we are 10,000 years from now
will be far more different from us than we are from the ancient Egyptians. The
universe will go on without us, and if intelligence is rare and spaced far
apart, AI is really the only chance we have to communicate with or even find
other intelligences.

~~~
kiliantics
you seem to be in the wrong thread

~~~
goatlover
AI all the topics.

------
stanislavb
The beginning... we basically can't avoid this. If it isn't Google, it will be
someone else.

------
zitterbewegung
I think this is a good idea because it would be great optics for Google to
cancel this project.

------
RoutinePlayer
Qualifying rahulmehta95's comment below a bit. The government and military
ought to be accountable to the citizenry, not the other way around. Having
said that, the DOD will just find other companies to continue this research,
so it actually behooves Google to stay on and actually contribute to making
the technology more accurate.

------
sneak
So this is about data processing of video feeds used to target.

If you think about it, we (including Google employees) have known Google has
been doing this in a different way to assist in waging war ever since Snowden
showed us those slides. Google processes data for the government that allows
them to know who to target for war.

------
sova
Engineer brains must be allocated to benefiting humanity.

Ease the lives of your fellow travelers.

------
m3kw9
Their tech is already used in all kinds of war activities indirectly.

------
knowThySelfx
Google can't pull out. That'd be like being ungrateful.

------
crb002
A.I. is commodity compute. You can't pull out.

~~~
dredmorbius
The genies of chemical, biological, radiological, and nuclear warfare, as well
as genocide, have been reasonably well re-bottled.

Even Sun Tzu notes that warfare is not total.

------
wuxb
The CEO wants to match the evil of Mark Z.

------
zaroth
It boggles my mind, with the current state of affairs, the number of people
actively organizing to eliminate or curtail our fundamental human rights.

It seems to me that 1st and 2nd amendment rights are both under particularly
intense bombardment these days.

~~~
richmarr
I'm not sure if you meant to imply that the 2nd amendment is a fundamental
human right but if so I think that's a bit of a stretch.

'Fundamental rights' (under US law) is, I think, defensible, as 'fundamental'
is treated as an alias for 'constitutional', so guns are in.

'Fundamental _human_ rights' to me implies broad and fundamental
applicability, i.e. the UN declarations and covenants on human rights, which
don't enumerate guns as a right.

Case in point, when non-US countries talk about human rights, none of them are
talking about guns.

~~~
harimau777
I'd argue that fundamental human rights means rights that exist inherently
regardless of whether the UN or any given country recognizes them.

My feeling is that people have a right to self defense, which would require
them to have access to some effective weapon. In theory that could be a club
or blade rather than a firearm; however, currently firearms are the only
(somewhat) socially acceptable weapon to carry in the United States.

~~~
tolgerias
Self-defense is a human right only in a lawless society. The existence of the
state makes it necessary for the state to have a monopoly on the application
of force, and all matters should be resolved in a civil court of law. Only in
the USA is it normal to think that people should have the right to kill in
self-defense. Even more, many think they should be able to defend themselves
from the state... while that is the definition of a lawless society (a society
where the state has no way to impose the rule of law). I get it, guns are fun.
But a human right? Definitely not.

~~~
tolgerias
I can only reply to myself (low score?), but my point was not that
self-defense is not justifiable in extreme cases (though I made that point; I
didn't choose my words wisely). What I mean is that by no means is the citizen
expected to engage in vigilante justice or apply force against the state. That
leads to lawlessness and was kind of hinted at by the parent comment. The
state should have the monopoly of force _to apply in a lawful manner and
maintain the rule of law_.

~~~
lovich
The state decides what a lawful manner is, thats about as useful a statement
as "it's not illegal when the president does it". The states only rules as
long as a certain X% of the populace agress with it. The second amendment
tries to balance that X so that it doesn't become something like o ly 1% of
the populace needs to agree with the government for it to maintain power.

There's arguments to be made as to what percentage of the populace needs to
agree for a stable society to be formed. Both extremes of 100% and 0% are
obviously bad as you either get no agreement or total dictatorship.

Just saying that you should listen to the government though ignores the
majority of the history of governments

------
make3
Democracies need top-of-the-line militaries to exist. Plain and simple.

------
meri_dian
>"...that uses artificial intelligence to interpret video imagery and could be
used to improve the targeting of drone strikes."

These drone strikes have been happening for decades without interpretive AI,
and they will probably continue with or without. So let's make the strikes
more precise and save more civilians.

>"But improved analysis of drone video could be used to pick out human targets
for strikes, while also better identifying civilians to reduce the accidental
killing of innocent people."

~~~
shadofx
The need for human operators to identify targets serves as a hard logistical
limit to how many drone strikes can be delivered. Turning them into a machine
removes that limit.

Suppose you decrease error rate by 10% (optimistic), but increase volume by
20% (probably underestimated). You'll get an 8% increase in innocent deaths.
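
Treating innocent deaths as proportional to (strike volume) x (error rate),
as this estimate implicitly does, the arithmetic checks out; a minimal sketch
using the comment's own hypothetical figures:

```python
# Assumption (from the comment): innocent deaths scale with volume * error rate.
volume_factor = 1.20  # strike volume up 20%
error_factor = 0.90   # error rate down 10%

# Net change relative to the baseline of 1.0:
net_change = volume_factor * error_factor - 1.0
print(f"net change in innocent deaths: {net_change:+.0%}")  # prints "+8%"
```

So even a sizable accuracy gain is swamped by a modest increase in volume.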

~~~
walshemj
The hard logistical limit is the cost and the number of airframes.

~~~
titzer
No, it actually is the number of eyeballs available to analyze drone footage.
They
literally have so much footage feeding in that they can no longer review it
with humans. This tech is designed to make the surveillance dragnet feeding
into military decisions more automated and efficient.

------
mankash666
Well then, Google employees should start by moving out of the US, because they
seem to benefit from and enjoy the safety and security provided to them by the
Pentagon.

Grow the fuck up - part of living in a democracy is tolerating the
implementation of measures one disagrees with. Republicans might oppose birth
control, but companies continue to support it in their health plans. Likewise,
a state-of-the-art offensive military is democratically wanted in the US;
state your disagreement and tolerate its implementation.

~~~
tclancy
>Grow the fuck up - part of living in a democracy is tolerating the
implementation of measures one disagrees with

And tolerating the opinions of others.

~~~
mankash666
Are you mistaking a swear word for intolerance?

~~~
tclancy
No, the suggestion that any conflicting viewpoint is a sign of immaturity. No
one would ever suspect someone who can't speak to a fucking stranger without
cursing struggles with maturity as well.

~~~
titzer
Not to mention, it encourages them to leave the country, or at the very least
to state their objections and then put up with something they deeply morally
disagree with (and that basically violates international laws on human
rights). Oh, and it's an action that's probably self-defeating. Offensive
military capabilities indeed.

