
Slaughterbots: Stop Autonomous Weapons [video] - georgecmu
https://www.youtube.com/watch?v=9CO6M2HsoIA&t=
======
narrator
It's like WWI all over again. In WWI, people went happily off to war thinking
it would be nice and heroic and adventurous like the good old days, and they
all got slaughtered in a pathetic, pointless way in the blink of an eye by
newly developed automated war machinery.

The first AI War is going to be a really brutal wakeup call. The part in sci-
fi where anyone actually has to aim to kill an enemy was a big romantic lie.
The robots don't miss and they draw and shoot before the humans even know
they're there. William Gibson, to his credit, did anticipate this with the
Slamhound assassin robots in Count Zero.

~~~
YeGoblynQueenne
"Heroic" and "adventurous" OK, but- "nice"? I can't believe anyone ever
thought that going to war -to kill and risk getting killed- could be "nice",
except perhaps for murderous psychopaths.

~~~
jblazevic
It's a figure of speech meant to convey irony.

------
XorNot
The message of this film feels extremely confused - probably because it's
trying to single-mindedly push an agenda rather than explore the problem
space. In a world where drone swarms are apparently this cheap... what, no one
ever developed counter-drones? It's the reason grey goo isn't a realistic
scenario for nanotechnology: the goo has to replicate, whereas all the
counter-goo has to do is kill it.

Same issue: a world where the technology to build an incredibly cheap drone
that size which can hunt humans with facial recognition based on specific
parameters exists...is also a world in which an anti-drone can use the exact
same technology and just ram into unidentified drone vehicles to disable them.
If the swarms are cheap enough for terrorists, then the counter-swarms are
monumentally cheaper.

EDIT: It's worth noting you could argue other scenarios - like only
governments building these, but that's what I mean by the message being
confused - it's not clear what the threat is supposed to be.
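The ram-into-it idea above is easy to sketch. Below is a toy 2D pure-pursuit simulation (all numbers, names, and speeds are illustrative assumptions, not anything from the film): the counter-drone simply flies at the intruder's current position each tick, and if it is faster, it closes the gap without needing a warhead or any targeting payload beyond what the attacker already has.

```python
import math

def intercept(target_pos, target_vel, chaser_pos, chaser_speed,
              dt=0.1, kill_radius=0.5, max_steps=10_000):
    """Return the number of steps until ramming range, or None if never."""
    tx, ty = target_pos
    cx, cy = chaser_pos
    for step in range(max_steps):
        dx, dy = tx - cx, ty - cy
        dist = math.hypot(dx, dy)
        if dist <= kill_radius:
            return step
        # Pure pursuit: head straight at the target's current position.
        cx += chaser_speed * dt * dx / dist
        cy += chaser_speed * dt * dy / dist
        # Target flies in a straight line.
        tx += target_vel[0] * dt
        ty += target_vel[1] * dt
    return None

# Intruder at 20 m/s, interceptor at 30 m/s starting 100 m away.
steps = intercept((100, 0), (0, 20), (0, 0), 30)
print(steps is not None)  # True: a faster pure-pursuit rammer catches up
```

The point of the toy model is the asymmetry: the chaser needs only speed and the same detection hardware, nothing more.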

~~~
soVeryTired
I don't think "people will invent countermeasures" is a particularly good
argument. We have countermeasures against chemical weapons - but not everyone
has them all the time. We would still be better off if we hadn't opened that
pandora's box.

For an eight minute video, it does a decent job of exploring the consequences
of a weapon that can kill remotely, with minimal collateral damage, and with
no supervision. They touch on potential use against terrorists, political
opponents, and activists. The main point being that once you release such a
thing into the wild, you have little control over how it will be used.

~~~
folknor
> We would still be better off if we hadn't opened that pandora's box.

I don't think that's a reasonable assumption to make. I have no evidence
either way, but I can easily theorize that lots of medical advances came from
the same people who got their feet wet in chemical weapon technology earlier
in their careers.

Again, I don't really have an opinion, and haven't really thought it through -
I'm just saying your blanket statement needs a citation.

------
neuro_imager
Unfortunately I don't see a way this can be avoided. There has never been a
weapon humanity imagined, designed, and built that it has not used. AI will be
no different.

A ban would be impossible to implement. What exactly would you ban? AI?
Drones? Facial recognition? Once you have a certain level of capability in
these technologies, weaponising these things is the easy part.

~~~
jrochkind1
We can make it considered unacceptable and against international law. Like
chemical weapons. Which still happen, but not as much as they would if the U.S.
were, say, planning on using them all over the place, doing heavy R&D into
them, and selling them to other people. Like, you know, robot kill bots.

~~~
mikeash
Most of the reason chemical weapons don't get used is because they're not very
effective. It's hard to disperse them on the battlefield, changing weather has
a nasty habit of blowing the stuff back onto your own troops, and protective
gear is cheap enough to equip everyone in an army. I'm pretty sure that
everyone was willing to ban them internationally not just because they're
horrible but also because they're not very useful.

Compare with nuclear weapons, which are way worse and have a lot of support
for a ban, but there's no serious hope of that ever happening, because they
are really good at killing people and destroying stuff.

Seems to me that a ban is only realistic if the stuff doesn't work well.

~~~
YeGoblynQueenne
>> Most of the reason chemical weapons don't get used is because they're not
very effective.

Against a modern military equipped and prepared to withstand them, perhaps.
Against a civilian population or a less well-funded force, not so much. This
is doubly the case when the attacker is a non-state actor (like a terrorist
group). Remember for instance the sarin gas attack on the Tokyo metro by Aum
Shinrikyo [1]. An air force bombing a major city with chemical weapons would also
cause vast casualties.

I think what you're trying to say is that chemical weapons are primarily
designed to be terrifying with lethality considerations coming next. However,
you could argue the same about nuclear weapons. After all, the attacks on
Hiroshima and Nagasaki were not actually meant to wipe out the Japanese,
only to scare them into submission. And nuclear deterrence works on the
principle that an enemy _won't dare attack_ a nuclear power.

In general, you could think of all weapons ever as aimed primarily at the
morale of the enemy rather than their life and that thought may have some
merit. After all, an army that breaks and runs is beaten faster and with fewer
losses to the other side than one that fights to the last [2].

However, the important point to keep in mind is that weapons, even when
they're primarily designed to be terrifying, are designed to be terrifying _in
the manner they kill_. So I think you'll find that most people would agree
that chemical weapons should be banned because they kill in a horrible manner,
not relative to the number of casualties they may or may not inflict.

I do think you underestimate their potency btw.

__________________________

[1]
[https://en.wikipedia.org/wiki/Tokyo_subway_sarin_attack](https://en.wikipedia.org/wiki/Tokyo_subway_sarin_attack)

[2] Cough cough. In war games, certainly.

------
jrochkind1
I for real find it terrifying. If we were sane, we'd be pushing for
international treaties against this same as chemical weapons, instead we're
making it happen and selling the tech to people.

~~~
int_19h
International treaties around weapons are hard to enact lately - it seems that
most of the useful stuff we have was created in the WW1-WW2 era. For more
recent treaties, the usual pattern is that one is enacted, but the US, Russia,
and China (and sometimes a few others who happen to be major users) simply
refuse to be a party. Most other countries sign up, but it mostly doesn't
matter, because they either don't do it on a large scale anyway for other
reasons, or they ignore it when it's suddenly convenient. Examples:

[https://en.wikipedia.org/wiki/Ottawa_Treaty](https://en.wikipedia.org/wiki/Ottawa_Treaty)

[https://en.wikipedia.org/wiki/Convention_on_Cluster_Munition...](https://en.wikipedia.org/wiki/Convention_on_Cluster_Munitions)

------
kirykl
Software eating society. Suddenly our entire criminal justice paradigm breaks
down without corporeal law breakers. Makes the 'corporation as a person'
question seem sophomoric

------
l0b0
Just the availability of this kind of tech would scare anybody sane shitless.
From that day, you'd better hope you don't ever piss off anybody within a 1000
km radius enough to find your photo and address online (easy in most
countries) and drive to within drone distance of your house. Want to
obliterate anyone's family completely? Done. Kill anyone in session at the
government of a country - any country? Done.

~~~
anonymous5133
That technology already exists and is being tested in a military setting
[https://www.youtube.com/watch?v=CGAk5gRD-t0](https://www.youtube.com/watch?v=CGAk5gRD-t0).
Not sure if those ones have explosives on them, but I'm sure it isn't hard to
add them.

------
nopinsight
People who can should apply to work on one of the AI Safety Research projects:

[https://futureoflife.org/ai-safety-research/](https://futureoflife.org/ai-safety-research/)

The list covers a number of professors in top graduate programs, and several
angles should be tackled simultaneously to maximize our chances.

~~~
Pinckney
This seems to be mostly about skynet scenarios, which is an entirely different
problem.

~~~
nopinsight
Yes, I am aware. But weaponized drones and robots will give much more power to
a future skynet, or a lesser version that acts autonomously with minimal
guidance and may unintentionally diverge from human goals. This greatly
increases the priority of AI Safety research.

Some people I know kept arguing we can simply unplug the skynet and be mostly
safe (despite some damage). These AI weapons make it clear that we will not be
able to limit the damage largely to the virtual world.

~~~
rl3
> _But weaponized drones and robots will give much more power to a future
> skynet ..._

Nah, not really. In the face of a proper hard-takeoff superintelligence,
Skynet will look positively incompetent by comparison.

There's enough manufacturing equipment connected to the internet today that a
superintelligence could physically manifest whatever it wishes. That's not to
mention all the human labor it could coerce, trick or pay.

Put simply: as soon as the thing copied itself into the internet, that's game
over. Better hope it likes us.

------
speedplane
Best and most telling part of the video: "trust me, these were all bad guys"

------
posnet
Daniel Suarez, the author of Daemon, has another book called Kill Decision,
which explores a lot of these ideas.

~~~
mtgx
He's also had a great video on this topic for a few years:

[https://www.youtube.com/watch?v=pMYYx_im5QI](https://www.youtube.com/watch?v=pMYYx_im5QI)

------
hitekker
Terrifying and fascinating, as all good wake-up calls should be.

In the future, miniature exploding drones could indeed autonomously
distinguish between cranium and body, plot a flight route, and divebomb
their victim.

But, I don't think these skull-crackers will be _fully_ autonomous, even with
significant advances and development. Given the chance of collateral damage
and the "fuzziness" of algorithms, the kill command or sequence will still be
left to a human operator, like how America does so in the skies of Pakistan or
with the machine-gun robots of the Middle East[1].

In my mind, the computer would display potential targets via live capture,
and the human in the chair would cycle through them and, after filtering out
the mistakes (a human dummy, a picture on the wall), type "y".

Militaries would have this check less out of "consider the ethics and
morality!" and more out of rationally avoiding friendly fire and maximizing
the accuracy of the payload.
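That "type y" flow can be sketched in a few lines. Everything here (the function names, the `score` field, the prompt format) is hypothetical illustration, not any real military interface:

```python
def review_targets(candidates, confirm=input):
    """Show each machine-proposed target; engage only on an explicit 'y'."""
    approved = []
    for c in candidates:
        answer = confirm(f"Engage {c['label']} (score {c['score']:.2f})? [y/N] ")
        if answer.strip().lower() == "y":
            approved.append(c)
    return approved

# Simulated operator approves the real target, rejects the false positive.
proposals = [
    {"label": "person, rooftop", "score": 0.97},
    {"label": "mannequin?",      "score": 0.61},
]
answers = iter(["y", "n"])
cleared = review_targets(proposals, confirm=lambda prompt: next(answers))
print([c["label"] for c in cleared])  # ['person, rooftop']
```

The default-deny design (anything other than "y" is a rejection) is the whole point of such a gate: the machine proposes, the human disposes.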

One exception to this approach could be indiscriminate killing, where the
designers intentionally program the drones to lobotomize any target within a
geographic range or time window, as long as that target exceeds a certain
threshold. This approach is guaranteed to result in a lot of false positives,
but would still be remarkably efficient for the purpose of terror or localized
warfare. [2]

On a side note, YouTube recommended a follow up video, depicting a real-life
swarm, howling and circling around a target.
[https://www.youtube.com/watch?v=PYLP0pAGbE0](https://www.youtube.com/watch?v=PYLP0pAGbE0)

[1]: [https://en.wikipedia.org/wiki/Foster-Miller_TALON](https://en.wikipedia.org/wiki/Foster-Miller_TALON)

[2]: Even remote controlled drone strikes are pretty iffy on who they kill,
so it's not unlikely that less savory actors won't mind blanketing swaths
of countries with skull-crackers.
[https://www.nytimes.com/2015/04/24/world/asia/drone-strikes-reveal-uncomfortable-truth-us-is-often-unsure-about-who-will-die.html](https://www.nytimes.com/2015/04/24/world/asia/drone-strikes-reveal-uncomfortable-truth-us-is-often-unsure-about-who-will-die.html)

~~~
mtgx
I think it's naive to think their primary focus will be collateral damage. Is
that really the focus in _any war at all_?

[https://www.vox.com/world/2017/11/16/16666628/iraq-nyt-casualties-civilian](https://www.vox.com/world/2017/11/16/16666628/iraq-nyt-casualties-civilian)

And that's just the targeting of the drones themselves. But how accurate is
the intelligence targeting? Post-Snowden we found out that they were killing
people mainly based on what SIM card the targets carried with them.

Plus, they consider _any male above 16 years old that is killed_ a
terrorist. It's hard not to be "accurate" when that's your definition of a
target...

Also, you're forgetting one aspect of this. The cheaper the tech becomes, the
more easily it will be used. Just like we now kill 100x more people with drone
strikes than we did with airstrikes, in the future we may kill 100-1,000x
more people with automated drones, because it's so much easier to "get rid of
the baddies".

Do you actually think they'll hire 100x more people to operate those machines
and "have final say"? Yeah, no way that's how they'll think about it - _unless
we make them_ think differently about it and never allow the automated kill
machines to be built/sent.

I mean, given the type of politicians the U.S. government tends to have these
days, which option do you think they'll choose? A 10-year dragged out war
against a group like ISIS, or "just sending 10,000 automated drones" into
multiple "hot areas" and "finish the job within a month"?

Can you not see how their logic would go, and that we'll actually need a lot
of people to oppose that type of thinking to ensure it will not happen? But
will there be a lot of people to ensure that doesn't happen, if say a
terrorist group takes out the White House? Or will everyone think "those
100,000 people all killed automatically by the drones _deserved it_ for
destroying our White House!!".

Heck, Trump already asked the military "for the biggest bomb they can send" -
a bomb that was built but _never used before_ - and it wasn't even a
situation of the US being under huge immediate threat at the time. It was
mainly done for PR purposes for Trump to show how macho he is and how he "gets
things done." The worst part about it is that the media seemed gleeful about
it, rather than condemning him.

I think it would be best to ensure the machines are not built in the first
place. There are too many trigger-happy people that would send such drones
out.

~~~
builditand
Theoretically we can make building and using them illegal, but there's no
practical way to prevent it.

------
p1xelHer0
I, too, look forward to season 4 of Black Mirror.

~~~
dag11
Interestingly, this was almost the exact plot of the latest season finale.

~~~
tialaramex
I've observed elsewhere that the reason this is different is also the reason
they couldn't use footage from "Hated In The Nation" (the Black Mirror episode
you're thinking of).

Hated In The Nation hypothesises that a lone nutter repurposes existing non-
lethal technology to kill people. No explosives, just drill your way into the
brain through the ear because the robots are replacements for flying insects.

A law saying "Don't make killer robots" is useless in the Black Mirror
scenario, nobody made killer robots except the lone nutjob they're already
hunting for murder. So for this campaign it's an unhelpful message, it
suggests their campaign would be futile.

------
mnemotronic
Thanks for the requirements and use case. How about a GoFundMe to start on
development? I also see a group providing drone attack services along the
lines of booter/stresser DDoS (distributed denial of service) providers.

On the other hand, I think it will be a long time before visual face
recognition gets this good, especially out in the real world with bad
lighting, hats, eyeglasses, etc. But let's say the world of the video authors
comes true. What do we do?

Defenses:

    
    
      1. Microwaves can fry electronics.  A system wouldn't need fancy targeting; just blast powerful microwaves in all directions.
        Downside: 
          A. Microwaves this strong are bad for people too.
          B. All it takes is 1 anti-microwave drone to slip through and disable the system.
          C. Asymmetrical cost of offense vs. defense
    
      2. Lasers can fry the optical sensors used by the drones
        Downside: 
          A. Probably would need a fancy targeting system?
          B. Same problems as microwave defense 1B and 1C.
    
      3. Something along the lines of WW2 barrage balloons.  A network of suspended lines designed to entangle the drones.
      The lines wouldn't have to be very big; a piece of strong thread could get caught in the rotor of a drone and disable it.
        Downside:
          A. Wind gusts
          B. How do you suspend the thread?
          C. Do you carry one with you when you go out to Starbucks?
          D. The thread would have to be small enough to not be detected by the drone's camera but big enough to cripple it.
    
      4. Ski mask.  If the drone is programmed to attack specific individuals using facial recognition, take away its targeting requirement.
        Downside:
          A. Maybe drone deployers don't care about a specific target -- just body count.
    
      5. Play dead.  It's assumed the drones wouldn't waste resources on targets that have already been attacked, so just emulate a victim.
        Downside:
          A. Big assumption
    
      6. Wax museum defense.  Have a mannequin with target's facial features.  Drone attacks wax dummy.
        Downside:
          A. See 3C, 4A
    
      7. Obscure the target.  Throw a blanket over the target.
        Downside:
          A. Time to deploy.
          B. Darn.  Left my blanket in the back in the ....POP!

------
basicplus2
I find it strange that nobody generally comments on this, including all the
previous postings.

Is it because people think it is inevitable and there is nothing they can do
to stop it?

Or is it because people think it is fantasy and could never happen?

~~~
daxorid
Another option is that some of us really look forward to real-life sentry gun
turrets; we actually _want_ this future.

~~~
sorokod
Systems such as Phalanx CIWS with a fully automatic mode have been available
for ages.

~~~
gaius
_Systems such as Phalanx CIWS with a fully automatic mode have been available
for ages_

Phalanx doesn't work tho' :-) In 1991 Iraqi forces launched a Silkworm missile
against USS Missouri. The ship launched decoys... and its Phalanx guns
promptly targeted those decoys. Fortunately there was a Royal Navy ship, HMS
Gloucester, in the vicinity which dealt with the problem with a good British
weapon, the Sea Dart missile. This was the first real "drone on drone" combat,
if you will.

~~~
sorokod
I expect the software has been upgraded several times since. Also, I expect
drone countermeasures to coevolve with drones.

------
jstewartmobile
In a world of fighter jets, satellites, and nukes, under the right
circumstances this is a development which could cut both ways.

From Wikipedia: " _Quigley concludes that the characteristics of weapons are
the main predictor of democracy.[0] Democracy tends to emerge only when the
best weapons available are easy for individuals to buy and use._ "

[0]
[http://www.carrollquigley.net/pdf/Weapons%20Systems%20and%20...](http://www.carrollquigley.net/pdf/Weapons%20Systems%20and%20Political%20Stability.pdf)

~~~
georgecmu
Do Somalia, Afghanistan, and Iraq qualify as natural experiments that test
this theory?

~~~
jstewartmobile
I doubt it. Jets, drones, and satellites vs peasants with automatics is a
fairly stark disparity.

I'm making more of a long-term guess here. Unlike uranium enrichment or jet
construction, I _suspect_ that state-of-the-art drone manufacturing may
_eventually_ reach a level of affordability to where drone-facilitated hits,
feuds, and drive-bys become a _thing_ on domestic soil.

edit: And key part of the wiki quote is " _the best weapons available_ ". If
the best weapon available is a six-shooter, and it has a price that almost
everyone can afford, you've got an "equalizer". Democracy! If the best weapon
available is a nuke, and it takes billions in supply chains and hundreds of
millions in materials and know-how to make one, that is a lot of extra
leverage for the guys at the top.

------
bawana
Cannot be stopped. Whatever the human mind imagines can be made. The only
defense is counter-drones. Everyone will buy one with their next cell phone.

------
hans_mueller
Technically there is nothing new in this clip. People have been singled out
and murdered for as long as we have existed, and sometimes nobody is singled
out but everybody is killed with a bomb. Nonetheless there is something
undeniably creepy and unsettling about being killed by a soulless machine.

------
free_everybody
This is terrifying!! AHH!!!!

------
EGreg
I have been warning about this for years:

[https://www.schneier.com/blog/archives/2015/08/shooting_down...](https://www.schneier.com/blog/archives/2015/08/shooting_down_d.html#c6702487)

------
mickduprez
personal EMP??

------
boznz
I thought it was real for a few seconds... Wow, scary!

------
greggman
Is this a leak of Black Mirror Season 4?

------
alexS
Maybe we can petition.

~~~
petre
Or start building anti drone capabilities.

------
nathantross
crazy

------
dr0l3
Magnificent piece of technology! Except it gets defeated by a $2 ski mask.
This kind of fearmongering isn't really useful as a way of creating political
change. If you want people to participate in a reasonable and constructive
dialog, then convince them to do so with reasonable and constructive
arguments.

~~~
jononor
Do you know that person identification based on gait detection is actively
developed and getting decent results? Same with voice.

Also, in a really grim scenario, it could threaten/kill anyone that hides
their face, be it with a ski mask, burka, or whatever. Or even require
positive identification, i.e. that your face is found in the database.
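To see why a ski mask doesn't end the discussion: here is an entirely synthetic toy sketch of nearest-neighbour gait matching. The feature choice (stride length, cadence) and the enrolled numbers are made up for illustration; real systems work on video sequences with learned features:

```python
import math

# Hypothetical enrolled gait profiles: (stride length in m, cadence in steps/s).
ENROLLED = {
    "alice": (0.72, 1.95),
    "bob":   (0.81, 1.70),
}

def identify(sample, database=ENROLLED):
    """Return the enrolled identity whose gait profile is closest to the sample."""
    return min(database,
               key=lambda name: math.dist(sample, database[name]))

print(identify((0.80, 1.72)))  # closest to bob's profile
```

No face is involved at all, which is the commenter's point: covering your face only removes one of several biometric channels.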

