
Scientists call for ban on lethal, autonomous robots - pseudolus
https://www.theglobeandmail.com/business/technology/science/article-scientists-call-for-ban-on-lethal-autonomous-robots/
======
hannasanarion
Lethal robots are not a technology that can really be regulated by ban.

It's not like nuclear weapons, where you need a huge industrial operation with
tens of thousands of high-tech centrifuges running for years before you have
enough fissile material for a bomb. Even the rulebreakers, like Israel,
couldn't hide the fact that they were doing it and had to rely on America's
UNSC veto to save them from the consequences.

But lethal robots are a trivial step away from the current state of robotics.
Hobbyists have already mounted guns on quadcopters. Tesla and Waymo have made
robots that kill people accidentally. Any robot can be made a killer robot
with minimal change to its software and hardware.

~~~
ForHackernews
There's a big difference between a jury-rigged bomb strapped to a quadcopter
and a purpose-built killbot produced by a nation state. Strong international
sanctions against warbots are a good idea, even if it won't prevent every
terrorist from making their own low-rent version.

~~~
chrisseaton
> produced by a nation state

Why do people always say ‘nation state’ in these situations? It’s not a fancy
word for ‘major country’ - it means something specific. For example the UK is
definitely not a nation state. The US is arguably not a nation state either,
due to its cultural and linguistic heterogeneity and tribal sovereignty.

~~~
kkarakk
"nation state"

Dictionary result for nation state (noun): a sovereign state of which most of the
citizens or subjects are united also by factors which define a nation, such as
language or common descent.

so you can call anything a nation state as long as that thing issues a
passport i guess?

~~~
chrisseaton
> so you can call anything a nation state as long as that thing issues a
> passport i guess?

What? No - you just gave the argument against that!

> are united also by factors which define a nation, such as language or common
> descent

This means the UK isn't a nation state - we have four nations in the UK, with
different cultures, descents, and in some areas even languages.

It also means that the US is not a nation state - the people there have very
distinct descents. Today Americans literally still say they're 'Italian'
rather than 'American'. Many Americans don't speak English. The US literally
describes tribal lands as 'dependent nations', so it isn't one nation even by
their own federal definition.

Examples of nation states are places like Iceland and Portugal. These are the
major military players that the original comment was trying to refer to.

~~~
dannypgh
It's possible for a person to be a member of multiple nations. In the eyes of
many Americans (USian), American most certainly is a nation.

MAGA-style nationalism is quite popular and powerful at the moment...

~~~
chrisseaton
Yes, establishing your own nation as a state is a goal for many nationalists.
And you can see some nationalists pointing at Japan (a nation state) and
saying that they'd prefer their state to be more homogeneous, like Japan.

------
jsty
It's going to be interesting to see how the circle is squared with regard to
defending against autonomous weaponry. There's probably a large design space
of offensive autonomous weapons that could only effectively be countered by a
system with no human in the control loop. If the design / use of autonomous
defensive systems is also prohibited, that would give an overwhelming
advantage to any aggressor willing to defy such a ban on offensive
systems. If they're not banned, then the difference between a weapon designed
to auto-target other weapons and one designed to auto-target humans is so
blurry as to be virtually indistinguishable in terms of verifiable arms
control.

~~~
TeMPOraL
> _If they're not banned, then the difference between a weapon designed to
> auto-target other weapons and one designed to auto-target humans is so
> blurry as to be virtually indistinguishable in terms of verifiable arms
> control._

There's no such thing as a purely defensive weapon. I don't remember who said
it, but I recall a quote that went something like: "if your weapon can shoot enemy
planes over your cities, it can just as easily shoot enemy planes over _their_
cities".

~~~
falcor84
Well, what if your weapon is a fixed turret built on top of your city hall?

~~~
TeMPOraL
Put that turret on a trailer truck and drive to enemy town.

Or put your city hall on a trailer truck, if that's what it takes.

~~~
sergioj97
That's much easier to do if you develop a mobile turret from the beginning.
The idea is that some technologies are considerably harder to use
offensively than others.

------
arethuza
What about current weapons, such as the UK Brimstone missile that apparently:

"includes essentially the ability to find targets within a certain area (such
as those near friendly forces), and to self-destruct if it is unable to find a
target within the designated area."

[https://en.wikipedia.org/wiki/Brimstone_(missile)](https://en.wikipedia.org/wiki/Brimstone_\(missile\))

When does something become fully autonomous rather than temporarily
autonomous?

~~~
C1sc0cat
Exactly - once launched, many guided weapons are already autonomous, e.g. ATGMs
that do top attacks on MBTs.

------
ptero
IMO wide, blanket bans on dual-use technologies (and many things in our high
tech lives can be effectively weaponized with some effort) do more harm than
good. Especially where, as in this case, violators would be hard to detect and
classify into "OK" or "bad" bins.

This would deter some researchers working on general topics (easy to label /
threaten as lawbreakers) while having no impact on the actual bad actors.
Organized crime has already broken bigger laws and already has enough COTS
parts to build nasty things (and is often fine with 80% reliability, so it
needs no cutting-edge research). Many or most militaries would ignore such bans
and fund development quietly, justifying it on national security grounds.

IMO the only way to make this approach tractable is to choose a _narrow_ list
of technologies we do not want developed and see if there are technologies
that exist or can be developed that would effectively detect violations. And
choosing even a narrow list of technologies to ban would make for lots of
debates and unhappy citizens. My 2c.

------
sevensor
I'm not really clear on what the moral difference is between an artillery
shell, a cruise missile, a drone strike, and an autonomous robot that targets
the wrong individual. As far as human agency is concerned, all of them are
weapons of war that kill indiscriminately. How is the autonomous system worse?
Is it the illusion that we might be able to delegate moral agency to the
machine?

~~~
_i____ii_______
Makes me think of
[https://en.wikipedia.org/wiki/Vasili_Arkhipov](https://en.wikipedia.org/wiki/Vasili_Arkhipov)

Imagine an AI in such a scenario. Count me as one of those people who thinks
we are headed into a kind of hell.

~~~
sevensor
I think this comes down to my last sentence -- delegating the decision to kill
to a robot is not really different from deciding to kill. But we may do it
anyway, and wash our hands. I suppose this is the source of the unease about
autonomous drones. Not that they're worse than other killing machines, but
that we might deceive ourselves into thinking that they're better.

Maybe a better analogy than a cruise missile would be a land mine. A device
that kills, possibly much later than when it's deployed, and often an
unintended target.

------
nurettin
Is this a proposed ban on AAMs, SAMs and all torpedoes? Because they are all
basically exploding robots with advanced tactical systems to bypass defenses
and autonomous systems that track their targets and control their rotation.

~~~
Mirioron
Don't forget CIWS.

~~~
nurettin
That is a defensive weapon, but perhaps I forgot to mention ASMs, which are
normally released from stealth bombers and drones.

------
flippyhead
Currently reading a great book on this topic:

[https://www.amazon.com/Army-None-Autonomous-Weapons-
Future-e...](https://www.amazon.com/Army-None-Autonomous-Weapons-Future-
ebook/dp/B073VXYD5P/)

------
srazzaque
We often forget that by the time the tech world is playing with something, it
has perhaps already done the rounds in the military.

Automated sentry guns on the Korean DMZ have existed since around 2003,
developed by a company that was associated with Samsung (not sure if it still
is).

Therefore this isn't, quote, "at best, a few years away". Unless that quote
was from around the year 2000.

[https://en.m.wikipedia.org/wiki/SGR-A1](https://en.m.wikipedia.org/wiki/SGR-A1)

~~~
orzig
And of course land mines have existed for decades longer, and “covered pit of
spikes” centuries before that.

No situation is totally new, though sometimes the equilibrium shifts so much
that the top 3 driving factors in an environment are qualitatively different.

------
narrator
So this is why weapons have manual aim in sci-fi movies! You'd think that war
would be fully computerized in the future, but all sides must have decided
that was a bad idea.

~~~
nemof
I'm reminded of the prologue of Iain M. Banks's Excession, which describes how a
ship Mind (a vast, incredibly powerful AI) is overwhelmed and subverted, its
ship taken over by an excessive and overwhelming force, and only a solitary
drone is able to make it off the ship after running the gauntlet as the force
tries to stop it. Most of the attack, the attempt to defend against it,
and the final hail mary by the drone take place within thousandths of a second.

I'm also reminded of Battlestar Galactica and how deeply distrustful they are
of computers and tech because of everything that has happened to them.

~~~
EForEndeavour
Not to mention the Butlerian Jihad in the backstory of _Dune_, and the
enduring ban on thinking machines. "Thou shalt not make a machine in the
likeness of a human mind."

[https://en.wikipedia.org/wiki/Butlerian_Jihad](https://en.wikipedia.org/wiki/Butlerian_Jihad)

------
huffmsa
They'll have to pry my autotracking Nerf ADS off its hot, charred turret
mount. Better tell Aperture Science as well.

On a more serious note, where do unmanned turrets and gun platforms fall?
They're not making the kill/no kill decision (they're always in kill mode) and
don't have a human directly at the trigger.

~~~
samwhiteUK
I'd imagine they'd be fine. They're not deciding, as you say.

------
ada1981
“What’s the point of robotics then?”

------
manifestsilence
A lot of people on here are talking about radio-controlled drones, and there's a
huge difference between unmanned and autonomous, both in terms of difficulty
of execution (exception: landmines) and moral responsibility.

A radio-controlled drone bomb is equivalent to a gun with a really long range. An
autonomous drone bomb is equivalent to a gun with a motion detector. It's not
about the power of the weapon, it's about the relinquishing of human control
and accountability. This is why landmines are so evil, and why they are also
trying to ban lethal autonomous robots. It's not about capability of killing
at a distance, it's about who pulls the trigger.

------
jacquesm
A strict reading would include self driving cars.

~~~
superkuh
It doesn't even have to be strict. Driving vehicles is the most dangerous and
lethal thing most people do regularly. Software is the most unreliable system
most people interact with regularly. Autonomous cars will be lethal and will
cause deaths. It's crazy to me that they are being allowed to drive on public
roads in such a haphazard fashion. There's no way these cars would pass a
driving test.

Additionally, in a quarter of the USA for a quarter of the year driving
conditions are such (permanent snow cover) that there are _no_ visual
indicators of where to drive or park. Cars would need all roads pre-mapped
with ground penetrating radar at the least (a giant data set). And even with
that it would be difficult. I know this because it's difficult for me as a
human and I often have to rely on my contextual knowledge of how human society
works and how people behave to know where to drive on a surface, avoid getting
in crashes, and figure out how and where to park. None of this is possible for
current or near-term computers.

------
short_sells_poo
I think it is too late. The genie is out of the bottle and there is no way of
putting it back now. The tech is not some obscure, super complex thing where
the expertise can be gated, and it doesn't require controlled materials or
specialized production technology.

I mean, the sort of tech they are warning against can be put together by lone
operatives using commonly accessible electronics and conventional arms.

Why not call for banning mines or cluster munitions? Those remain functional
for decades after deployment and kill/maim indiscriminately.

I'm not trying to appeal to whataboutism here, I'm trying to point out the
futility.

Perhaps I'm too jaded, but I can't see any scenario where military robotics
can be prevented.

~~~
Gustomaximus
It doesn't matter if the genie is out of the bottle. Bio weapons were well out
of the bottle when countries largely agreed to put them away.

I think the problem is that major countries won't see the massive deaths as a
problem for them but as an asset, and the race is on to develop the best.

To stop this we essentially need to get the US, China and Russia to all agree to
ban these, and the world will mostly follow... but in today's political
environment I can't see that happening.

~~~
Mirioron
Bioweapons aren't effective weapons of war though.

------
imtringued
I wonder if there was similar opposition to the first autonomous weapon before
it was deployed. I'm talking about the humble landmine.

~~~
dTal
I would suggest that the first autonomous weapon was likely to be animal in
nature - like Caesar's "dogs of war". Train dogs to be as vicious as you can,
and let them loose on the battlefield.

I don't think "opposition" in the sense of an organized political movement was
as much of a feature of life back then. You did whatever you could to win, and
there was no "international community" to tut at you for it.

~~~
Mirioron
I think a hole in the ground is an even older "autonomous weapon".

------
sixstringtheory
This brings to mind a microwave beam developed by Raytheon to down drones [0]
and an article I read well over a decade ago about EMPs. Wonder how these are
doing on the consumer front?

[0]
[https://m.youtube.com/watch?v=hlmf032NmHU](https://m.youtube.com/watch?v=hlmf032NmHU)

------
OnlineCourage
How is an ICBM not already a lethal, autonomous robot? I think they may mean
affordable lethal autonomous robots.

~~~
shusson
I think they don't want algorithms to make high level ethical decisions like
whether to fire the ICBM. It's a thin line, though, between firing and correctly
navigating to a target.

------
ajuc
We've had lethal robots for decades. They were called rockets.

The only thing that changed is that now everybody can make their own lethal robot.

I don't see why guns/explosives on robots should be treated differently than
explosives/guns without robots.

~~~
Gustomaximus
Not really. A person fired that at a particular target. Those sci-fi bullets
with guidance that improves aim are not lethal autonomous robots. It's still
manual, just with aim assist.

~~~
ajuc
Modern rockets can automatically change target after a human has fired them.

There are even anti-tank rockets you fire when you don't even see whether there
is a target, just suspect one. Then, when the rocket flies over a hill and sees a
target, it locks on and destroys it.

There are also bombs which spread small explosives over a large area; each of
these small explosives has passive control surfaces and tries to target the
closest armored vehicle under it.

So you only decide to deploy the bomb over some area, and the small explosives
decide which targets to choose.

------
fouc
Gun control. Simple. Don't worry about the robots, worry about the guns.

~~~
dqpb
How about a ban on all weapons? If people want to go to war, they can fight
with their bare hands.

~~~
whatshisface
The incentive to break that rule would be overwhelming.

~~~
marviel
Reminds me of Scott Aaronson's Malthusianisms:
[https://www.scottaaronson.com/blog/?p=418](https://www.scottaaronson.com/blog/?p=418)

------
prepend
The problem with autonomous robots is that there are never cold dead hands to
pry them out of. Both: a robot's hands are always cold and dead, and they're
autonomous, so there are no hands at all.

~~~
mysterydip
Would that make robots the true zombie apocalypse?

------
sschueller
Chemical weapons are banned. The United States and others still make them.
There is no way to enforce such a rule, as nation states do not abide by it.

~~~
barrkel
That's needlessly reductive. Chemical weapons are banned, but Saddam didn't
use them against US troops in the US invasion.

Bans are probabilistic, not binary. Anyone violating a ban must weigh the
advantage from doing so against the consequences and probability of getting
caught. A high degree of revulsion and a consistent track record of
enforcement greatly increase consequences and probabilities, and make a ban
more effective. Just shrugging your shoulders and saying it's not always
possible is a sure way to increase the probability of unwanted action.
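
barrkel's point is essentially an expected-value calculation. Here is a toy
sketch (my own illustration with made-up numbers and function names, not from
the thread) of how enforcement strength flips a would-be violator's calculus:

    # Toy model of the trade-off described above: a would-be violator weighs
    # the advantage of breaking a ban against the probability of being caught
    # and the consequences if caught. Numbers are purely illustrative.

    def expected_payoff(advantage: float, p_caught: float, consequences: float) -> float:
        """Expected net payoff of violating a ban."""
        return advantage - p_caught * consequences

    # Weak enforcement: violating the ban still looks attractive.
    print(expected_payoff(advantage=10.0, p_caught=0.2, consequences=20.0))  # 6.0

    # Strong revulsion and consistent enforcement: expected payoff goes negative.
    print(expected_payoff(advantage=10.0, p_caught=0.8, consequences=50.0))  # -30.0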

------
ThomPete
Drones are already potentially lethal and autonomous - not sure how on earth a
ban would be enforced.

------
matthewfelgate
The Chinese and Russians will do it whether you ban it or not.

------
MrQuincle
Also ban lethal, autonomous humans, colloquially called soldiers, psychopaths,
or just the neighbor. Is this the way we treat our artificial children?

~~~
shusson
Well, we sort of do, with prisons and the death penalty.

------
Balgair
The Harvard Classics [0] (or Dr. Eliot's Five Foot Shelf[1]), is a 51-volume
anthology of 'the classics' compiled by Harvard president Charles W. Eliot and
published in 1909. It is _the_ WASP collection of literature and gives a very
interesting eye towards the first Gilded Age and into the minds of US leaders
during the first half of the 20th century.

If it were to be updated, nearly 110 years later, many works would be left out
(like, nearly all of the middle French plays) and MUCH would be included.
Without a doubt, one of the books to be included would be _The Making of the
Atomic Bomb_ by Richard Rhodes, winner of the 1987 Pulitzer[2]. This book goes
from start to finish on _how_ the A-bomb was made, the major and minor
players, and their thoughts about the bomb all along the way. Rhodes takes us
into the heads of the physicists and clears out all the calculus and gets to
the human parts of the endeavor.

All the players in the race for the nuke knew the bomb was unavoidable for
mankind. Leo Szilard was the first person to really conceive of the bomb while
waiting for the stoplight to change where Southampton Row passes Russell
Square, across from the British Museum in Bloomsbury[3].

There _were_ groups and people who entertained the idea that
they could form a group of 'priest-scientists' that would keep the uranium
safe from politicians and warlords. Keep the power flowing but stop the chain
reaction from going critical and taking out _billions_ of people. But as the
race progressed, all the real players for the nuke _knew_ that such a utopia
was impossible. That this particular genie was out of the bottle the second
that Leo, or any other physicist, had made it across Southampton Row.

These robots are a microcosm of the same issues that the atomic physicists
faced nearly 100 years ago. They know the damage that the ideas will have upon
us all, they know that the semi-bucolic world we now live in will fall away to
violence, and they know that they can't stop it. But, bless them, they are
trying to sound the alarms and maybe, just maybe, change the minds of some of
the politicians and warlords.

Looking back at the first Gilded Age, with all their knowledge, they had no
templates for the power of The Bomb or for finally facing the mortality of all
humans. Fortunately, we do have the templates, at least in terms of the use of
lethal robots. We know how this plays out, we know how to make an atomic bomb,
and we programmers need to learn from the physicists of the '30s and '40s.

[0]
[https://www.myharvardclassics.com/categories/20120212](https://www.myharvardclassics.com/categories/20120212)

[1]
[https://en.wikipedia.org/wiki/Harvard_Classics](https://en.wikipedia.org/wiki/Harvard_Classics)

[2]
[https://en.wikipedia.org/wiki/The_Making_of_the_Atomic_Bomb](https://en.wikipedia.org/wiki/The_Making_of_the_Atomic_Bomb)

[3]
[https://openlibrary.org/works/OL2617750W/The_making_of_the_a...](https://openlibrary.org/works/OL2617750W/The_making_of_the_atomic_bomb)

------
jillesvangurp
IMHO delaying the inevitable is not a plan. We need to find a way to survive
the inevitable use and deployment of technology by others. This stuff is no
longer rocket science. Any idiot with a Raspberry Pi and some basic tooling
can build some quite effective weaponry. Most of the AI you need for this is
OSS and getting easier to use by the month. This is not at all like nuclear
warfare where you need a lot of skills, knowledge, infrastructure, and capital
expenses to be able to put a weapon together. All this stuff will take is a
bit of ingenuity and access to commodity hardware and software.

IMHO hoping that others won't go there is not a plan because ultimately
somebody will. We need credible defensive capability against this stuff. Given
the adversary is going to be autonomous AIs with lightning-fast responses,
having humans in the loop when defending effectively means being defenseless.

Most current wars are asymmetric guerrilla wars where there is a powerful but
reluctant-to-engage party and some highly motivated individuals fighting an
unwinnable fight. E.g. the conflicts in the middle east are basically premised
on men with primitive weaponry moving around the country trying to stay hidden
from air surveillance, satellites, or simply hiding among civilians. Countries
like the US are reluctant to go in and fight on the ground because things get
ugly in terms of casualties and 'collateral' damage. You can bet drones will
be popular on both sides in such conflicts.

Drone-based warfare would make that a lot more one-sided than it already is
and would probably make this type of warfare a lot less attractive, potentially
putting an end to a lot of long-running conflicts. This does not have
to turn into the cliche dystopian mess; for that, look no further than the
current state of affairs in e.g. Congo, Yemen, Afghanistan, or Syria.

War is not about chivalry but about winning at any cost. When one side is
fighting with their hands behind their backs thus preventing them from winning
and the other side can't win, you have a stalemate. It used to be the case
that the winner would execute/enslave surviving enemies. Brutal but effective.
When the war was over, there was little chance of a comeback by the other
side. These days we do a lot of damage to each other but it is rarely
decisive. WWII was one of the last conflicts where the other side literally
had no choice but to surrender unconditionally. I live in Berlin. The effects
of that war are very visible still. It's also a very peaceful city these days.
That war was really over when it was over.

Wars between nation states are subject to all sorts of international rules.
The problem is that most wars these days do not necessarily involve nation
states and nation states instead engage each other via proxy wars. E.g. the
US, Iran, and Russia are not formally at war but actively engaging each other
in a plausibly deniable way nevertheless. These wars are dirty, brutal, and
cause a lot of misery precisely because they rarely are fought to a conclusion
and fought using any means possible.

------
liberabaci
What happened to Asimov's Three Laws of Robotics?

~~~
krapp
Asimov's Three Laws were a plot device, not an actual attempt to codify ethics
for artificial intelligence. They were never intended to be taken seriously
outside of their fictional universe.

Their purpose was to set up conflict through an apparent paradox (robots can't
harm humans but then robots _harm humans_ ) by showing the Laws _failing
spectacularly_ and exposing humanity's hubris in the face of unintended
consequences.

Here's a big, long HN thread about it[0].

[0][https://news.ycombinator.com/item?id=19044387](https://news.ycombinator.com/item?id=19044387)

------
adriveatrain
Why is "scientists" the name of the group?

That seems to be an appeal to authority. This group is making an ethical
argument, not a scientific one.

Which is probably why “without supervision or meaningful human control” is their
criterion.

Is a security guard with 20 camera-controlled turrets using facial recognition
to choose targets "supervised" ?

~~~
dalbasal
Scientists, as a group, _are_ a moral authority. They're not the only
authority, but they represent a generally respected tribe of intellectuals.

The title doesn't say "science calls for."

I know I'm interested in hearing what a broad group of scientists think about
non-scientific questions.

------
no1youknowz
We need robots for several reasons:

First, special forces are having a hard time recruiting, and in reality it's
not just them [0].

Secondly, why not purely for defensive purposes? Want to eliminate the
effectiveness of nuclear weapons? If we can have self-landing rockets, why not
a missile that breaks into multiple rockets in space and destroys all nukes
whilst they are inbound?

Thirdly, why should billionaires and those who can afford them have their own
private security? Imagine having your own inside the home: 24-hour protection,
only calling the police to clean up the situation afterwards? Could be a new
startup right there.

Finally, we should be encouraging DARPA and the military to be spending
billions on robots. That technology will filter down eventually to the
consumer. That's better for all of us.

[0]:
[https://www.youtube.com/watch?v=4OFevGcLHTU](https://www.youtube.com/watch?v=4OFevGcLHTU)

~~~
black_puppydog
> why should billionaires and those who can afford them have their own private
> security?

Has it ever occurred to you that banning killer robots _not_ just for the poor
and the public sector might be on the table?

