
The Case Against an Autonomous Military - dnetesn
http://nautil.us/blog/the-case-against-an-autonomous-military
======
vinceguidry
I don't see the 'autonomous' weapons being developed by the US military ever
being banned under international law. Let us examine weapons that have been
banned.

Chemical and biological weapons were banned under international law because
they aren't all that much more effective than conventional weapons yet cause
an undue amount of suffering in comparison. You get no benefits, only
downsides.

The other example, anti-personnel mines, is more instructive. International
law bans them because, historically, belligerents who employed them for
area-denial purposes rarely stuck around to de-mine the area after the
conflict was over, leaving the civilian population to suffer.

The United States refuses to sign a blanket ban on landmines; her position
is that her landmines are different: they self-destruct after a set period of
time, and therefore don't threaten civilians long after the war is over.

There are two takeaways here. First, weapons are not banned before they are
tested in a conflict. Second, effectiveness trumps morals: if a weapon of war
is effective, that can tip its acceptability status even in the face of
"collateral" damage. This is the case with nuclear weaponry, which imposes
costs on civilians but is considered worth it for its sheer power to end a
conflict in minutes.

This is, after all, the entire premise upon which war is waged: the good
of the many over the few who are harmed. AI-controlled weapons, unless they're
left in a battlefield for civilians to contend with, something that appears
fairly unlikely, at least at the moment, do not change this basic equation.

~~~
skybrian
Calling this an "equation" seems overly fatalistic, as if it were inevitable.
I'm pessimistic about this as well, but there certainly are attempts to
prevent nuclear weapons from spreading, and efforts to reduce the arsenals of
countries that have them.

These attempts often don't work, but they do matter because sometimes they do.

~~~
vinceguidry
People that I know in the military want strong states with strong weaponry to
act as a deterrent to conflict. They trust in the professionalism of other
states' armed forces and in the diplomatic process, which evolved over centuries
of war to prevent accidents and human failings. This was greatly tested during
the Cold War, and the policies and procedures held conflicts to a relative
minimum, the unfortunate proxy war in Vietnam aside.

Once it became clear to planners that MAD wasn't a real scenario and that the
most effective strategy for nuclear weaponry involves smaller, more numerous
weapons targeted at enemy military installations, focus shifted to ensuring
that enough weapons are available to fully end a conflict.

Disarmament agreements unfortunately ensure that there is strategic space for
a nuclear exchange to not fully end a conflict. This raises the probabilities
of major conflict rather than reduces them.

Great in theory, not in practice.

------
Eire_Banshee
We can argue about the implications all we want... but given the choice
between sending an American soldier overseas to die or a robot to get
destroyed... the choice is pretty obvious to me.

~~~
decebalus1
How about not sending anyone overseas? Why isn't this choice being presented?
Let's draw up the stakeholders of 'sending entities abroad to kill' and find
out why this choice isn't considered in the first place, instead of fighting
over the philosophy of who or what kills to push forward a sick foreign
policy.

I see this every day in American politics. A couple of days ago people were
fighting over what constitutes an 'assault rifle'. Mission accomplished: the
actual original debate (banning guns) was already abandoned.

~~~
pc86
"Banning guns" violates the US Constitution and simply is not going to happen
without a constitutional amendment, which politically will never happen to the 2A.

~~~
Retric
Banning gun clips is viable; even cartridges seem optional, as ball and
powder is still clearly a gun. Single-shot weapons are good enough for home
defense and would vastly reduce the impact of gun violence.

~~~
AllegedAlec
> home defense

But this is not what the Second Amendment is about. In the words of Hamilton:

> This will not only lessen the call for military establishments, but if
> circumstances should at any time oblige the Government to form an army of
> any magnitude, that army can never be formidable to the liberties of the
> People, while there is a large body of citizens, little, if at all, inferior
> to them in discipline and the use of arms, who stand ready to defend their
> own rights, and those of their fellow-citizens.

~~~
amalcon
People make this mistake a lot (anti-gun people make it, too). The point was
never that the American people would be able to repel an oppressive force
using only their personal weaponry. That was ludicrous even back in the
1700s.

The key point is that, when the rebels inevitably capture weaponry or receive
foreign assistance in that form, they would be generally familiar with its
use. They would have skills like marksmanship, trigger discipline, and
maintenance. They might need to learn the specifics, but these skills largely
transfer.

Edit: Now, I don't think banning breech-loaded firearms makes sense, but that
is for reasons unrelated to the original purpose of the second amendment.

~~~
AllegedAlec
Fair enough. Granted, the quote I threw at OP also included the words 'little,
if at all, inferior to them in discipline and the use of arms', which isn't
all that applicable either...

------
breakpointalpha
The arguments presented are fairly weak.

The two main suppositions are that "AI can be used to target racial groups"
and "AI weapons could end up in the hands of ISIS et al."

While both of these are true, and certainly cause for concern, the author
largely ignores the game-theory aspect of AI-powered weapons.

Namely, if our enemies are building them, we _must_ too.

I'm not trying to fear-monger, but the US and China are _already_ engaged in an
AI arms race. It's illogical for _either_ party to stop development at this
point.

[https://www.nytimes.com/2017/02/03/technology/artificial-int...](https://www.nytimes.com/2017/02/03/technology/artificial-intelligence-china-united-states.html)

~~~
pc86
So do you think the best solution is to return to a Cold War-era MAD
environment?

~~~
dantillberg
Does MAD apply to AI-centered weapons? I thought the Cold War's MAD
environment had more to do with the advantages that a nation would have if it
were to _preemptively_ attack, because the weapons act so fast over such long
distances.

~~~
Tyrek
MAD is predicated on the fact that ICBMs take time to arrive at their targets
and are effectively unstoppable. The idea is that if nuclear state A were to
target nuclear state B, B would be able to launch its own arsenal at A
before B was obliterated.

Additional methods of delivery, such as submarine-launched nukes, strategic
bombers, carriers with nukes, etc., further reinforce the concept, as B would
be able to retaliate even after its destruction.

Now, the modern context has weakened MAD somewhat, because it's strictly an
interpretation of nation-state-level nuclear conflict. With the proliferation
of (plausibly deniable) non-state actors in the 21st century, the concept
breaks down, as it becomes much harder to attribute an attack and pick a
target to blanket-nuke in response.

------
bhouston
Arguing against an autonomous military is like arguing against machine guns or
any other major technological innovation that makes it easier to kill people.

The only things we have generally outlawed are mass-killing devices such as
gases and nuclear weapons, and I guess firebombing in retrospect (e.g.
Dresden).

~~~
robotresearcher
Nuclear weapons are not outlawed. The US has thousands.

[https://en.wikipedia.org/wiki/Nuclear_weapons_of_the_United_...](https://en.wikipedia.org/wiki/Nuclear_weapons_of_the_United_States)

~~~
bhouston
They are at least controlled, and not used in practice at the moment. And most
people see them as lines not to be crossed.

~~~
robotresearcher
This is the relevant voluntary treaty:

[https://en.wikipedia.org/wiki/Treaty_on_the_Non-Proliferatio...](https://en.wikipedia.org/wiki/Treaty_on_the_Non-Proliferation_of_Nuclear_Weapons)

The treaty permits the US, Russia, UK, France and China to maintain nuclear
weapons. Israel, India and Pakistan never agreed, and North Korea acceded but
later withdrew.

Some of those countries can deliver a nuke to anywhere on the planet in an
hour.

edit: added Israel to the list of non-signatories.

------
throwaway2016a
Unfortunately, I don't know if it's the government we need to worry about.

We assume that the military will be the one to implement this... but drone
technology is very cheap now (relatively), and so are image recognition
systems.

A rogue / non-state organization could easily and relatively cheaply create
automated turrets and automated drones. Heck, YouTube has tons of videos of
Americans doing it in their backyards just for "fun".

Would they be as good as a state military's $10MM drone? Probably not. But
when you can build 10k of them for the price of one, who cares?

Really we should be worried about both.

------
vlehto
>“autonomous weapons are ideal for tasks such as assassinations, destabilizing
nations, subduing populations and selectively killing a particular ethnic
group.”

This is not the first time this has come up. Even remotely controlled weapons
are incredibly dangerous. I think the solution is not to ban autonomous
weapons; they will then just be used by non-state actors or "non-state actors".
Governments need to enable better protection of citizens' privacy. Currently,
any time I get mail, my physical address is combined with my name. My
phone number, current location and name are also bundled and probably visible
to all kinds of people for all kinds of purposes.

It would be terribly easy to find my home address, tie a small bomb to a cheap
drone and crash it into my bedroom window while I'm asleep. It's a perfect
crime. All this could be done because I said "I don't like Nazis" or "I
oppose Sharia in my country" in a Facebook post.

------
eaxitect
The arguments are fairly weak. I think saying "It’s certainly not far-fetched to
suppose that a group like ISIS, for example, could acquire A.I. capabilities."
is almost equivalent to saying they could acquire F-22 fighters or B-2 bombers
as well. Developing a sophisticated AI-based integrated weapon system is not
that easy.

------
tim333
On the other hand, a case for autonomy is that you could potentially kill fewer
people. Say the enemy has a bunch of tanks doing bad stuff; humans fighting
them are probably going to blow them up, killing the crews, before the tanks
can get them. With AI you could maybe use smaller drone-like things designed to
disable the tanks without killing the crews. And if the crews kill the drones,
who cares - they are only drones, send some more. I quite like the idea of AI
that could take out military hardware and perhaps injure a few fighters without
killing anyone. Maybe like T2
[https://www.youtube.com/watch?v=kEztkckjRtE](https://www.youtube.com/watch?v=kEztkckjRtE)

------
3pt14159
We already have autonomous militaries, and have for a long time. Nuclear
weapons don't have pilots, after all. I'm in favour of autonomous militaries
amongst non-pariah states. At least insomuch as I'm in favour of militaries in
general: temporary, necessary evils.

What is different is the plummeting costs of killbots. We should do everything
in our power to stop these from proliferating to lesser states or terrorist
groups. Just as we do with nuclear weapons. But at the end of the day
autonomous devices are going to kill people and terrorists are going to do
some of the killing. We need to accept that fact and design around it, not
plead that the reality be different.

~~~
crankylinuxuser
What's crazier is that I could (but WON'T) whip up a pretty effective device of
the kind the Geneva Conventions ban: a blinder. It's one of the most horrific
styles of weaponry - you don't even have the decency to kill them.

All you do is monitor motion via a webcam. Once you see motion, you turn on an
array of high-power IR lasers with lenses to make an arc of laser light.
There's no blink response, and the damage is already done.

It would cost what, $500 total for the platform, including batteries for
lengthy operation off grid.

To be honest, I'd rather build stuff that bolsters all of humanity, not
just one tribe over another. I wish we'd get out of that "us vs. them" mindset.
Maybe someday.

~~~
3pt14159
Yeah, I've quietly / privately worried about this as well. One-watt lasers are
easy to get. Though I think you're looking at way more than $500 for anything
reliable. Thankfully there are international agreements against using these
weapons on the battlefield.

------
astine
The best way to avoid killing people in warfare is to avoid going to war, I
think. Though, to some degree, conflict is always going to be inevitable, so
we've got to prepare for it.

I don't really see a situation in which autonomous weapons would be much worse
than humans. Humans can (and do) conduct genocides. If the concern is some
rogue actor like North Korea getting new, more powerful weapons, I don't see
the benefit for the West in avoiding their research and development, as other
countries could easily pursue autonomous weapons independently by applying
civilian AI techniques to warfare.

~~~
maxxxxx
Autonomous weapons are much worse. If the US had had autonomous weapons in
2003, it would probably still be occupying Iraq, because the occupation would
just cost money and no (American) lives. Unnecessary wars will be much easier
if they only cost money.

------
pjc50
There's nowhere near enough discussion of the point that ""AI"" or other
weapons with autonomous fire control are a huge target for "cyberwarfare",
including outside of a conventional war.

While ISIS aren't going to develop their own AI, they could conceivably develop
some zero-days for US systems. Turn missile defence systems against civilian
airliners, for example.

------
dfsegoat
Relevant side analysis of the Rule of Law / International legal precedents RE:
autonomous weapons systems, for those who are interested:

[https://breakingdefense.com/2018/04/a-treaty-to-ban-autonomo...](https://breakingdefense.com/2018/04/a-treaty-to-ban-autonomous-intelligence-weapons/)

------
mtgx
Relevant recent story. Now imagine if these were autonomous:

[https://www.nbcnews.com/news/military/russia-has-figured-out...](https://www.nbcnews.com/news/military/russia-has-figured-out-how-jam-u-s-drones-syria-n863931)

------
ashleyn
I would think James Cameron articulated quite a compelling case against one 35
years ago.

------
greedo
Has there been any technology that hasn't been used in a military conflict?

~~~
rdl
We walked back from chem/bio to a great degree, but only really because we had
something bigger/better (nuclear).

A bunch of relatively useless things (blinding lasers, stuff like that) have
been eliminated/banned, but again mainly because they're not militarily
particularly useful.

~~~
chopin
I believe that, unfortunately, only weapons with little military use are
banned. Take, for instance, chemical weapons. These are pretty much useless in
modern (highly mobile) warfare.

Take another example, mines. There is a treaty on banning them but it has not
been ratified by the worst actors [1].

[1]
[https://en.wikipedia.org/wiki/List_of_parties_to_the_Ottawa_...](https://en.wikipedia.org/wiki/List_of_parties_to_the_Ottawa_Treaty)

~~~
greedo
Mines are extremely useful in modern warfare. The ability to shape an
opponent's maneuverability is critical. Mines (especially stuff like DPICM)
allow defenses to attrit attackers.

------
ataturk
The Framers of the US Constitution were rightly concerned about standing
armies. Not that anyone cares about any of it at this point--we're all going
to go through hell again, probably nuclear hell, then people will understand.

Seriously, how much death and destruction does the US need to mete out on an
annual basis, regardless of drones or what have you? All this tech has done is
empower the worst, most brutal people.

