
The Pentagon’s ‘Terminator Conundrum’: Robots That Could Kill on Their Own - snewman
http://www.nytimes.com/2016/10/26/us/pentagon-artificial-intelligence-terminator.html
======
themgt
_“China and Russia are developing battle networks that are as good as our own.
They can see as far as ours can see; they can throw guided munitions as far as
we can,” said Robert O. Work, the deputy defense secretary, who has been a
driving force for the development of autonomous weapons. “What we want to do
is just make sure that we would be able to win as quickly as we have been able
to do in the past.” ... The weapons, in the Pentagon’s vision, would be less
like the Terminator and more like the comic-book superhero Iron Man_

Such blatant propaganda by the NYT. Just like with Stuxnet, the US is pouring
black budget dollars into novel asymmetric offensive weapons that'll be
rapidly put to use in the field, all under the guise of "defense". It's
grossly irresponsible.

In the position of a sole superpower barely hanging onto its role amid the
obvious rise of a multipolar world, the USA should be pulling every string it
has to put in place international institutions and laws that will uphold
stability and the type of global civilization we've become used to. Instead
we're throwing money that should be rebuilding infrastructure and paying for
people's health care into crazy, destabilizing weapons systems. We'll have a
five-year "strategic advantage" head start on the tech, but the precedents we
set (a cyberweapon that infected computers worldwide, Terminator autonomous
weapons?) will come back to haunt us and the rest of the world.

~~~
djsumdog
America's manned drone programs are an outright war crime. This video from
2012 is telling:

[https://www.youtube.com/watch?v=SJ46ZkJY8oo](https://www.youtube.com/watch?v=SJ46ZkJY8oo)

...and sadly nothing has changed. Obama has continued to authorize the use of
drones to kill countless people we know little to nothing about, without any
type of oversight. Make no mistake: no matter who wins the coming election,
this will continue and escalate. It will not make the world safer.

Wasn't China our ally? I'm pretty sure they make all of our stuff...
including many of the electronic components that make their way into drones.

The Russian rhetoric hasn't changed at all since the Cold War, when Reagan's
Team B was making up all kinds of technologies the Russians didn't actually
have in order to justify continued defence spending (see the documentary The
Power of Nightmares).

The type of propaganda we see obfuscates the reality of western foreign
policy. US supported rebels in Syria decapitated a kid who looks to be about
12 ~ 13 years old. The BBC cropped the photo to remove the kid and then
claimed they killed a dangerous young combatant. (source:
[https://www.youtube.com/watch?v=mzEsDAoUvOI](https://www.youtube.com/watch?v=mzEsDAoUvOI))

View every attempt to justify more war and more bombings critically. They will
not make the world safer.

~~~
rhino369
The focus on drone programs is misguided. Drones are the best way to conduct a
war while minimizing casualties, of both US soldiers and foreign civilians.
Traditional air strikes, artillery, ground assaults, or airborne assaults
would result in much more death and destruction.

I expect you really just believe the US shouldn't be fighting the Taliban in
the tribal area of Pakistan. But since we are, drones are the best way to do
it.

Just look at what Russian intervention in Syria looks like, or what Saudi
intervention in Yemen looks like. Drones are much less destructive.

~~~
djsumdog
No, they're just as destructive as standard air assaults. They've taken the
place of manned bombing runs and created an additional layer of abstraction
between pilots and the people they kill.

You know the best way to minimize casualties? Stop creating wars. Thirty
years from now, I'd put money that declassified papers will show the US and
its allies explicitly created ISIS. It fits the pattern (the 1973 US-led
military coup in Chile that killed 11,000 civilians, the Iran-Contra scandal,
the School of the Americas, the failed Bay of Pigs, the "Weapons of Mass
Destruction" in Iraq... the list is as long as you want to make it).

You're right, drones do cause fewer casualties than ground troops when it
comes to instilling fear and creating needless wars. So let's step back and
stop the needless wars.

~~~
rhino369
They have greater time over target, which allows more time to accurately
pick a target. Sure, a Hellfire is a Hellfire, but a Hellfire is a small
missile. Traditional strike fighters and bombers use larger munitions. You
could send in Apaches, but they are vulnerable to anti-air attacks.

People always say the layer of abstraction is a problem, but I've never heard
a good reason articulated.

Look at the carnage the Russian and Saudi air campaigns are causing in Syria
and Yemen. Drone campaigns are much less dangerous.

------
Shivetya
I have no issue with the creation of such systems; they will come whether or
not we do anything. The issue is, as always, what the other guy will actually
do. Then you get rogue actors. So along that line of thought, I seriously
doubt any treaty will prevent them from being developed and even deployed.
They may actually prove beneficial in the field, as they could distinguish
between friend and foe, and needless to say, losing one of these is
preferable to losing a soldier if it makes the wrong choice (it wasn't
friendly).

What I would prefer to see is rules, by treaty or law, that prevent law
enforcement agencies from deploying any automaton that can kill or otherwise
harm. In law enforcement, any action where a robotic unit is employed to
incapacitate or harm, up to and including lethal force, should require a
human operator.

I also see great benefit in having this technology employed in firefighting,
fire rescue, and disaster assistance.

tl;dr there is no stopping the military from having these but there needs to
be means to stop law enforcement

------
mobiuscog
It's not a conundrum.

They will convince themselves that they have the power to prevent any problems
(or deal with them).

As with most controversial technologies, they just have to be seen to be
concerned with what _could_ happen. Until it does happen, it's not a problem.

~~~
beisner
This is why autonomous cars are making their way into the market, despite the
fact that many worried that they'd get tied up in regulatory, moral, or
technical issues.

~~~
ChuckMcM
Ok, perhaps I am misreading this, are you implying that "they"[1] are pushing
self driving cars in order to give the public confidence that machines can be
trusted with life or death decisions? The actual goal being to make it
possible to deploy weapon systems that will decide on their own when to kill
without the public disbelieving their assurances that the machines will make
the correct decisions?

That would be a pretty deep game if so.

[1] The forces of evil that really run things

~~~
Jtsummers
Or, perhaps, they mean the technology needed for self-driving cars is also the
essential technology for a more automated war machine. The military-industrial
complex, then, has a vested interest in this technology developing (see
various DARPA challenges in the field of self-driving vehicles).

~~~
sqeaky
I just read it as both groups have a similar mindset. Each thinks they can get
short term gains, so they will.

I disagree with the assertion that either is fundamentally bad though.

Cars are the most lethal thing in the western world (unless you count natural
causes like cancer or heart failure). Even if the first generation of
self-driving cars kills a huge number of people, it will be a platform for
building safer cars in the future.

With autonomous weapons there is more room for debate, though even there I
think they would reduce overall human suffering and death.

------
inputcoffee
How can this be worse than a bomb which "decides" to kill anything within a
certain radius without any human being individually looking around and making
individual decisions about everyone in that radius?

~~~
tyingq
I get your point, but in your example there was some human driven decision to
approve dropping the weapon into that specific radius.

The article is talking about AI driven scanning, selection, and
engagement...with no human driven decision in the process at all.

The open letter from a group of AI practitioners that's mentioned in the
article sums up the concern well:
[http://futureoflife.org/open-letter-autonomous-weapons/](http://futureoflife.org/open-letter-autonomous-weapons/)

~~~
digi_owl
Didn't the long-range Harpoon launches back during the first Gulf War do
that?

Give it a coordinate to hit, some radar map data to find its way with, and
send it on its way.

~~~
tyingq
"Give it a coordinate to hit"

Part of the concern with the truly autonomous AI driven weapons is that they
could patrol large areas and select their own targets, with whatever arbitrary
rules you wanted.

~~~
clarry
Ultimately, the rules are set by people.

I don't see how a machine executing arbitrary rules set by people is that
different from people executing arbitrary orders given by people.

Just a week ago, two Belgian fighters allegedly bombed a civilian target in
Syria, killing four innocent people and injuring two more. That's a very
arbitrary action to take, yet somebody must've ordered the pilots to do it.
Perhaps somebody expected there to be armed terrorists there? If they had
sent an autonomous robot, with rules and training to engage only armed
targets (though how would a robot tell they're terrorists?), perhaps these
civilians would still be fine.

~~~
tyingq
"Ultimately, the rules are set by people"

Maybe. AI is tricky. Ask someone on the Google search team to show you,
definitively, why each of the top 10 results for a query are where they are.
They can't.

------
codazoda
Like others have said, these are inevitable.

One thing I worry about in warfare is that some enemies have less work to do
because they have a different moral compass. Take a terrorist organization for
example. If they build a similar drone it does not need to identify armed
people, only people or even just movement. That is a much simpler problem to
solve. If they can get a drone to their enemies it can kill indiscriminately.

If the enemy has few people in the area or they are willing to die for their
cause, their robot doesn't have to worry about who's armed and who isn't.

~~~
Jtsummers
ISIL is already reported to be using drones with improvised weaponry. It's
only a matter of time before this becomes commonplace for both sides in wars.

[http://www.bloomberg.com/news/articles/2016-07-07/armed-drones-used-by-islamic-state-posing-new-threat-in-iraq](http://www.bloomberg.com/news/articles/2016-07-07/armed-drones-used-by-islamic-state-posing-new-threat-in-iraq)

[http://www.aljazeera.com/news/2016/10/isil-drone-deadly-iraq-attack-161012151854280.html](http://www.aljazeera.com/news/2016/10/isil-drone-deadly-iraq-attack-161012151854280.html)

------
throw2016
I think there's an inevitability about robot armies and police. Unfortunately
it will enable scenarios for oppression and dehumanization.

Power always concentrates itself and there is an eternal struggle between the
powerful and those below. This is one more tool for the powerful and a
transformative one.

I don't see how populations could mount a response to millions of bots; they
will be easily suppressed. Never depend on the goodwill of the powerful, or
of anyone. There is always someone who will be tempted to abuse it, and there
is always someone who will give in.

------
hellbanner
From a sci-fi anthology:

AI fighter jet. Entities on its team are tagged green. Entities attacked by or
attacking green units are tagged red.

Rule for interacting with a new entity: If it hurts a green, tag it red. If it
hurts a red, tag it green.

Hurt red, protect green.

The bombing assignments increase in collateral damage until the harm to green
starts to far outweigh the harm to red. In its emergent processing, the
fighter AI decides Command & Control back home is a big Red, flies under the
radar, and destroys the military base.

\---

This is another concern for militaries building automated systems: unexpected
consequences, and the fact that on capture they may be easier to subvert than
human prisoners.
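The story's tagging rules are simple enough to sketch in a few lines. This is a toy simulation of exactly the rule set described above; all entity names and the scenario are invented for illustration:

```python
# Toy simulation of the anthology's friend/foe tagging rules:
# whatever hurts a green is tagged red, whatever hurts a red is tagged green.

def update_tags(tags, attacker, victim):
    """Apply the story's two rules after observing one attack."""
    if tags.get(victim) == "green":
        tags[attacker] = "red"
    elif tags.get(victim) == "red":
        tags[attacker] = "green"

tags = {"friendly_base": "green", "enemy_unit": "red"}

# An enemy unit attacks a friendly base: stays (correctly) tagged red.
update_tags(tags, "enemy_unit", "friendly_base")

# Command orders a strike that destroys a green unit as collateral damage.
# By the letter of the rules, Command itself now qualifies as red.
update_tags(tags, "command_and_control", "friendly_base")

print(tags["command_and_control"])  # -> red
```

The flaw in the story emerges directly from the rules: nothing exempts your own chain of command from the "hurts a green" test.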

~~~
Balgair
Don't forget Kafka. The military in its current state, and in almost every
iteration beforehand, is a giant mess of insanity and Kafkaesque motivations.
I'd say that the way things currently work makes it MUCH more likely that
thinking humans would make the mistakes outlined above, at a much higher
rate. However, adding in these machines is not going to make things better;
it will only make them even more complicated and nutty. Remember Catch-22?
Now imagine that with Major Major Major's computer aide spitting out
meaningless stats and probabilities. No wonder beer is so popular with
soldiers.

------
shortstuffsushi
I'd really like to know more about the "target identification" software,
specifically how it differentiates a person holding a gun from someone
holding a shovel (or any stick-like object).

~~~
tim333
I guess they'd run a neural net with a training set of pictures of people with
guns and without?

Edit-

I don't know what they used, but you can achieve somewhat similar results
using open source libraries like dlib or OpenCV:

[https://www.youtube.com/watch?v=-8-KCoOFfqs](https://www.youtube.com/watch?v=-8-KCoOFfqs)

[https://www.youtube.com/watch?v=pj-QuE6pdEQ](https://www.youtube.com/watch?v=pj-QuE6pdEQ)
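As a toy illustration of why the training set matters for such a classifier, here is a minimal nearest-neighbor sketch. Everything in it (the two made-up features, the numbers, the labels) is invented for illustration; a real system would run a deep net over images, but the failure mode is the same:

```python
# Toy "armed person" classifier over made-up features:
# (length of held object in cm, width of held object in cm).

TRAINING = [
    ((100.0, 5.0), "rifle"),    # long, thin object
    ((20.0, 3.0),  "pistol"),   # short object
    ((0.0, 0.0),   "unarmed"),  # empty hands
]

def classify(features):
    """Label the input by its nearest training example (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TRAINING, key=lambda example: dist(example[0], features))[1]

# A shovel (~120 cm handle, ~20 cm blade) was never in the training set,
# so the classifier maps it onto the closest thing it knows about.
print(classify((120.0, 20.0)))  # -> rifle
```

An object the model has never seen doesn't get rejected; it gets mapped onto whatever it most resembles in training, which is precisely the shovel problem raised downthread.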

~~~
jstanley
And if the training set doesn't include people holding shovels?

~~~
MrZongle2
You're going to end up with some dead ditch-diggers.

And some software engineers to take the blame, rather than military brass.

~~~
posterboy
those engineers would probably be brassed

------
sqeaky
I find the comparison of drones to submarines in WWII lacking in intellectual
honesty.

Treaties on new frontiers of war are common, we still have a treaty
prohibiting the militarization of space. Someday it will be broken. Will it be
a war-crime if it saves lives? What if ISIS has a satellite with nukes on it
or something equally life threatening, should we heed the treaty even though
doing so leads to our obvious demise? In WWII Japan was an existential threat
to the United States.

There is no similar existential threat with drones, just some hand waving
about China and Russia. And there are a dozen other ways the comparison breaks
down.

What precautions are we taking to avoid harm to civilians? How do we ensure
transparency in this matter? How do we actually measure effectiveness? How do
we prevent malicious groups from getting this tech from recovered drones lost
in the field?

The comparison dodges all the really hard questions and just smears the US and
the US military.

------
fatlasp
"Kill Decision" by Daniel Suarez spins this topic into a great work of
fiction.

------
Fricken
I don't really care what the Pentagon does, if they want to fuck shit up, they
have the capability, and autonomous weapons only offer them marginal gains
over what their multi-trillion dollar war machine can already do.

The cool thing is how cheap and accessible these autonomous technologies are
for anyone with limited resources. Any consumer-accessible technology can be
hacked and jiggered to make an exciting array of autonomous weapons that
offer much more precision at less cost and risk than any conventional
approach.

------
tim333
I guess an upside to this stuff could be dealing with situations like ISIL
trying to commit genocide against the Yazidis
([https://en.wikipedia.org/wiki/Genocide_of_Yazidis_by_ISIL](https://en.wikipedia.org/wiki/Genocide_of_Yazidis_by_ISIL)).

You couldn't really bomb away, because you'd kill the wrong side; you
couldn't send ground troops, because ISIL would behead them on YouTube;
whereas AI killer quadcopters might get somewhere.

~~~
eternalban
I wonder who you think created, funds, and arms "ISIL".

[p.s.]

Looking into these _facts_ is important and very much related to the topic at
hand. "Terminators" are for urban pacification, but the pretext these days is
"terrorism" and "insurgents". It is a high-probability event that your own
children, and not the "Russians" or the "Chinese", will have to deal with
these terminators.

------
pc2g4d
The article doesn't make an important distinction: the computing power
required for AI applications like Siri is massive, and far outstrips what's
available on a lightweight drone or "Terminator" weapon. Highly sophisticated
AI won't be on the battlefield without introducing a dependency on a network
connection to a centralized AI service. There's a network-dependence /
computational-capability tradeoff that will prove important.

------
jwatte
Letting killer robots loose within an area of engagement is not that different
from strategic/carpet bombing. Both are bad news for any civilians stuck in
the area.

------
pasbesoin
Paywall...

Anyway, I'm reminded of viewing this from two sides: 1) developing technology
that kills humans; 2) telling technology that avoids killing humans that
"that's not a human".

If it's mechanically possible, someone's going to do it.

The one restraint is accountability. And, which direction is that going?

------
nashashmi
What's worse: nukes or AI?

I think we need another book in the class of Nevil Shute's "On the Beach" to
illustrate the reality of this.

------
samgranieri
No. Just don't do it.

------
josh_fyi
A pit-trap can kill "on its own." A robot might kill more people and can
carry out more complex missions, but autonomy is not a key difference.

~~~
swsieber
I think autonomy is a key difference. A pit-trap can't come hunt you down.
That's new.

~~~
Fricken
One step up from pit traps are landmines. They kill or maim an estimated
2,000 people a month, autonomously. There's one active landmine for every 52
people in the world. They're cheap: the most common ones range in price from
$3 to $30, but they cost much more than that to remove, and about one person
is killed for every 5,000 mines disabled. They won't hunt you down, but
they're invisible and can remain active for decades.

We don't need to veer off into speculative sci-fi territory to find autonomous
weapons doing terrible things.

------
venomsnake
I see no conundrum. Go for it. It is inevitable anyway.

Unlike nukes there is no MAD scenario here - so nothing to keep armies in
check. And it would give great tools for general area pacification - which any
aspiring dictator strongly desires.

