
Killer robots: Experts warn of 'third revolution in warfare' - pmoriarty
http://www.bbc.co.uk/news/technology-40995835
======
cbanek
I feel like I have to mention the story of the gatling gun.

"Gatling wrote that he created it to reduce the size of armies and so reduce
the number of deaths by combat and disease, and to show how futile war is."

[https://en.wikipedia.org/wiki/Gatling_gun](https://en.wikipedia.org/wiki/Gatling_gun)

And yet, by making one person more powerful, I'm not sure that we've reduced
the size of armies (certainly not in terms of support and logistics), but we
have made it possible for one random person to do a lot of damage.

Theoretically, by only having robots fight each other, we could eliminate
human warfare and human losses. Morality would be some kind of cold equation
that could absolve people of guilt. In reality, this would just never happen.
People are the source of conflict, and the source of killer robots.

I, for one, welcome the Butlerian Jihad.

[https://en.wikipedia.org/wiki/Butlerian_Jihad](https://en.wikipedia.org/wiki/Butlerian_Jihad)

~~~
slackingoff2017
If war can be waged without loss of life, the biggest incentive to avoid war is
gone.

We have seen countries start attacking each other on the internet in peacetime
because attribution is hard and the costs are abstract. I fear the same will be
true for war with robots.

~~~
abhi3
>If war can be waged without loss of life the biggest incentive to avoid war
is gone.

There will be a loss of life in any war. Note that they are called 'Killer'
robots. It's just that the loss of life will be of noncombatants.

My concern is not countries waging wars against each other, because that can
always be deterred, but governments using it on their own people. Right now
authoritarian governments need to worry about soldiers rebelling, but the
computer code powering the robots isn't gonna rebel when you ask it to kill a
city square full of protesters.

------
Pinckney
>It is a commonplace that the history of civilisation is largely the history
of weapons. In particular, the connection between the discovery of gunpowder
and the overthrow of feudalism by the bourgeoisie has been pointed out over
and over again. And though I have no doubt exceptions can be brought forward,
I think the following rule would be found generally true: that ages in which
the dominant weapon is expensive or difficult to make will tend to be ages of
despotism, whereas when the dominant weapon is cheap and simple, the common
people have a chance. Thus, for example, tanks, battleships and bombing planes
are inherently tyrannical weapons, while rifles, muskets, long-bows and hand-
grenades are inherently democratic weapons. A complex weapon makes the strong
stronger, while a simple weapon — so long as there is no answer to it — gives
claws to the weak.

>The great age of democracy and of national self-determination was the age of
the musket and the rifle. After the invention of the flintlock, and before the
invention of the percussion cap, the musket was a fairly efficient weapon, and
at the same time so simple that it could be produced almost anywhere. Its
combination of qualities made possible the success of the American and French
revolutions, and made a popular insurrection a more serious business than it
could be in our own day. After the musket came the breech-loading rifle. This
was a comparatively complex thing, but it could still be produced in scores of
countries, and it was cheap, easily smuggled and economical of ammunition.
Even the most backward nation could always get hold of rifles from one source
or another, so that Boers, Bulgars, Abyssinians, Moroccans — even Tibetans —
could put up a fight for their independence, sometimes with success. But
thereafter every development in military technique has favoured the State as
against the individual, and the industrialised country as against the backward
one. There are fewer and fewer foci of power. Already, in 1939, there were
only five states capable of waging war on the grand scale, and now there are
only three — ultimately, perhaps, only two.

George Orwell, You and the Atom Bomb.

[http://orwell.ru/library/articles/ABomb/english/e_abomb](http://orwell.ru/library/articles/ABomb/english/e_abomb)

~~~
visarga
I think it won't be long until we have a self-replicating factory - a self-
replicator, like a living thing, capable of making copies of itself and
producing anything we need. Is that even possible? I believe so - a combination
of robotic assembly and 3D printing, coupled with extensive industrial design
libraries, would do the trick. We could "compile" any design into physical
form, even a replica of the factory itself.

Once it exists, we can produce any number of war robots for the cost of
materials and raw energy. Even a single replicator can bootstrap an army. Then
war becomes democratic again. /s

What I want to say is that even the current advantage of the superpowers is
temporary. Self-replicating factories will make economics a thing of the past.
We just need one of them in open source. Software itself is already a self-
replicating technology.

Even hardware is becoming more and more accessible. Drones, Raspberry Pis,
sensors - they are converging towards cheap, easy integration. That means the
automation field is opening up and the entry barrier is going down.

~~~
lazyjones
> _I think soon enough we're going to have a self replicating factory_

You can build that factory, but it can't produce without resources - and those
are still gathered and transported in a very inefficient, low-tech way. It's
an interesting problem to solve.

~~~
AstralStorm
Moreover, the resources that are cheap to access are already in the hands of
major players. You'd either have to find new deposits or new places that
haven't been tapped yet (in space?).

------
samcheng
How can the article mention the 'third revolution' without describing the
first two?

What are they, gunpowder and nuclear weapons? Airpower? Bronze armor? The
professional army? IEDs?

~~~
LogicalBorg
Your first guess was correct. "...autonomous weapons have been described as
the third revolution in warfare, after gunpowder and nuclear arms."
([https://futureoflife.org/open-letter-autonomous-weapons](https://futureoflife.org/open-letter-autonomous-weapons))

------
throw22817
How does this square with US drone fleets? And the drone fleets of other
countries?

They are controlled by humans now, but automatic control is likely in
development.

Also being discussed in another post right now:
[https://news.ycombinator.com/item?id=15069178](https://news.ycombinator.com/item?id=15069178)

This is already a bit late. Hope we can find a way to enforce the ban if it
passes. Drones are a lot easier to develop in secret than nuclear weapons.
These countries already have weaponized drones: the U.S., the U.K., China,
Israel, Pakistan, Iran, Iraq, Nigeria, Somalia, and South Africa.

~~~
skybrian
The idea is to ban weapons that don't have a human in the loop. As you say,
that's hard to do, particularly for remotely controlled weapons where autonomy
is just a matter of software.

~~~
fapjacks
Particularly when it's been demonstrated that human beings will just rubber-
stamp whatever the machine gives them.

------
georgeecollins
The US Air Force is in denial about this. The controls of a modern aircraft
are mediated by computer through fly-by-wire. The mechanisms that detect other
aircraft are mediated by computers. The targeting of guns and missiles depends
on computers, not humans. In the latest aircraft, the F-35, the pilot's
obscured vision is compensated for by cameras that present a computationally
created augmented-reality view of the environment. The plane is programmed to
keep flying if the pilot passes out from high Gs. Etc.

Even if a human pilot were really somehow better (better judgement maybe, or
harder to fool in theory), that advantage will not outweigh the fact that a
pilotless plane is cheaper, more maneuverable, able to react faster, and
expendable.

Submersibles are like planes in that making them manned adds enormously to the
cost, and human senses are not very useful there.

In those areas, humans are going to leave the loop by necessity.

------
imh
There's an interesting point lower in the thread. Is a landmine an autonomous
weapon? They've been pretty awful for humanity.

~~~
robotresearcher
Yes, which is why their use has been debated and controlled by treaty for
decades.

------
relyks
Their description of a "killer robot" ["A killer robot is a fully autonomous
weapon that can select and engage targets without human intervention."] sounds
awfully similar to the terminators from the film franchise. Seems like it's
only a matter of time before something like Skynet comes online and films like
Terminator and The Matrix become the new dystopian reality. :\

Maybe the UN can create something similar to the Universal Declaration of
Human Rights, but effectuate Isaac Asimov's "Three Laws of Robotics"
([https://www.auburn.edu/~vestmon/robotics.html](https://www.auburn.edu/~vestmon/robotics.html)).

Computers (by their current architecture) don't have the ability to become
conscious and self-aware, so the world doesn't have to worry about robots
_consciously_ choosing to eliminate us or conquer society anytime soon.

~~~
IanCal
> Maybe the UN can create something similar to the Universal Declaration of
> Human Rights, but effectuate Isaac Asimov's "Three Laws of Robotics"
> ([https://www.auburn.edu/~vestmon/robotics.html](https://www.auburn.edu/~vestmon/robotics.html))

There's a story around those three laws that's quite important.

> Computers (by their current architecture) don't have the ability to become
> conscious and self-aware

I see absolutely no reason to think this is the case.

------
mc32
The big problem is proliferation as these new war bots become cheaper.

China, RU, the US and the EU won't go around doing wanton destruction just
because. They will have predictable calculus behind their decisions (for
example, none of them has taken the opportunity to take out the NK leader,
though any one of them could do so remotely with little repercussion --aside
from some international grandstanding).

On the other hand, if this gets into the hands of dictators, such as the
aforementioned, or Maduro or Castro, or al-Baghdadi, who knows what they would
unleash internally or against regional rivals.

For that reason, I'd support a complete and enforced ban, with the possible
exception that we might battle extraterrestrial aliens if they are the ungood
kind.

~~~
0xbear
A ban only works if everyone agrees to it. Technology that does not require
rare radioactive isotopes tends to trickle down really quickly. Today's
unachievable technical capability will cost five dollars 50 years from now.
I'm not sure any of this can be banned per se. The solution seems to be to
have robots that are better by an order of magnitude.

~~~
mc32
To clarify, I mean ban it and enforce the ban (enforced by CN, RU, EU and US).
Non-compliance results in severe economic penalties/blockades. Use the UN in
all possible ways. The big four might agree to something like this lest they
repeat the nuclear proliferation problem.

~~~
0xbear
Just don't expect it to be anywhere near as effective as nuclear
nonproliferation treaties. It's mostly software, and any idiot can copy
software.

------
stretchwithme
An enemy doesn't need its robots to kill you to defeat you. It just needs to
disable your infrastructure and break all your weapons, including any killer
robots you have. All the taxpayers can keep living. The new rules will be in
the email.

~~~
lazyjones
> _All the taxpayers can keep living. The new rules will be in the email._

Why would you need taxpayers when you can just take anything you want,
including everything those taxpayers would consume?

You would be redundant - a harder-to-control, unreliable, technically inferior
and more costly minion. So yes, you would be killed, unless needed for organ
harvesting or pleasure.

~~~
tzakrajs
What if the rules of the killer robots are benevolent? Maybe we have a day in
the shade.

~~~
lazyjones
> _What if the rules of the killer robots are benevolent? Maybe_

The owner of those robots would likely lose in a war against someone with more
efficient robots, so the outcome would be the same in the long run.

------
ldp01
As someone who lives in the Oceanic region, my thoughts/fears jump to how this
will affect the balance of power between the US and China, which I believe
helps ASEAN and Aus/NZ live in a state of relative self-determination at the
moment.

Are armies of cheap robots likely to shift the balance of power in favour of
the large established superpowers or of smaller nations? Or would they have an
effect such as rendering the US Pacific Fleet redundant? It will be
interesting and scary to watch it all play out.

------
Animats
A very likely development is swarms of small drones that kill anybody with a
gun.

~~~
nradov
It doesn't even have to be a swarm. Defense contractors are pretty close to
being able to build the hardware for a "Terminator": not a bipedal humanoid
robot, but a small unmanned combat ground vehicle equipped with millimeter-
wave radar and optical and IR sensors. Use pattern-recognition software to
detect anyone carrying something that looks like a weapon or bomb and put a
bullet into him with total accuracy. Mass-produce them and station one on
every street corner in a conflict zone.

I'm not particularly looking forward to that future but the technological
trends are inevitable and won't be stopped by any UN treaty.

~~~
Animats
One contractor in Israel already has a small 8-rotor drone with a machine gun.
A teenager in Connecticut built a 4-rotor drone with a handgun and made a
video. (Recoil pushes it backwards about a foot when it fires, but it remains
level.) On the hardware front, we're there.

------
DrJaws
A robot war could end with the losing country just going for a nuclear EMP,
which would shut down not only its own country but a whole continent.

I agree that the sooner we put laws on what we can create with true AIs, the
better.

------
crmd
It is an unequivocal war crime to create a computer program that autonomously
kills humans.

~~~
nine_k
How about autonomous guided weapons? A cruise missile, having been given a
target, autonomously chooses the optimal path to it. The level of autonomy of
this kind will only grow.

A weapon that may elect to kill a particular human but not another looks to me
like an improvement over a weapon that just kills everyone around, as a
typical bomb does.

~~~
randcraw
I think the point is, only a human should: 1) authorize the kill, 2) choose
the target, and 3) specify the limit of acceptable collateral damage. The
missile's AI should never decide 1, 2, or 3, but only _how_ to execute its
orders in compliance with the standing rules of engagement.

