
Autonomous Weapons Need Autonomous Lawyers - mhb
https://reporter.dodlive.mil/2019/04/autonomous-weapons_law/
======
ajnin
I disagree with the premise. There are no autonomous robots acting on their own;
there are autonomous robots that someone decided to deploy and target at
something. It's just too easy to say that oh, the robots are autonomous, so
nobody's really responsible for them. "Only robots can judge other robots" ...
no, humans judge the actions of other humans. If we don't hold the decision-
makers accountable, the descent towards global totalitarian rule seems
inevitable.

------
yboris
Or better yet, autonomous weapons need to be prevented:

[https://www.stopkillerrobots.org](https://www.stopkillerrobots.org)

~~~
cooljacob204
Unfortunately that is impossible, as much as I wish it were possible. Russia,
the US, and China will never risk falling behind in the field.

Maybe we are about to hit the great filter...

~~~
wayoutthere
It would also be trivial to build one; you can find plenty of homebuilds for
airsoft versions of autonomous weapons from the last decade or so.

Pretty sure the day the terrorists (specifically the domestic ones) start
building this kind of crap is the day public spaces stop being a thing.

~~~
jstanley
The reason public spaces are safe is not because it's hard to randomly kill
people in public spaces, but because most people don't want to randomly kill
people in public spaces.

This is unlikely to change much, even as autonomous weapons get easier and
cheaper to make.

~~~
Teever
That's not correct.

The reason public spaces are safe is because it's hard to randomly kill people
in public spaces _and get away with it_.

While you are correct that most people don't want to randomly kill people in
public spaces, a non-trivial number of people do.

It's only the thin slice of that small group, the ones who want to kill people
and don't care whether they survive afterwards, that we've had to worry about
to date, and we see them in the form of suicide bombers or the Las Vegas
shooter from last year.

We will have much bigger problems when it's possible for people to kill large
swathes of innocent people and have a high probability of getting away with
it.

~~~
imtringued
>We will have much bigger problems when it's possible for people to kill large
swathes of innocent people and have a high probability of getting away with
it.

You have just described every war and oppressive dictatorship in human
history.

~~~
Teever
In case it wasn't obvious from my comment when I used the term 'people' there
I was referring to individual non-state actors.

A drone or a drone swarm is an incredible force multiplier that will allow
individuals to commit atrocities and have a significant chance of not getting
caught which will make enforcement and prevention exceedingly difficult if not
impossible.

------
nexuist
There is a bit of a positive side to this - if our side uses murderbots, and
their side uses murderbots...why bother with humans?

I hope, far in the future but not too far, wars are held among the stars with
remotely operated or even autonomous starships battling over resources such as
meteors or comets. It is in every nation state’s best interest to reduce the
human and property cost of every battle.

The need for conflict will likely be forever a part of humanity. The need for
death or injury, however, is quickly withering away.

~~~
TeMPOraL
Because the first party that cheats and starts killing the other side's people
wins. Settling wars by game (whether a robot conflict or a videogame) is an
unstable equilibrium, because it requires all participants to obey the rules.
"War" is generally what we call it when a faction throws out the rulebook.

~~~
TheRealPomax
...no? Modern warfare very much has rules and laws governing what is permitted
in international conflict violence, and what constitutes a war crime, and how
severe that war crime is. War is absolutely not "one or more parties throwing
out the rulebook". As easy as it is to just murder a city in civ 6, in real
life any power that wishes to still be sat at any relevant table during future
treaties and negotiations will observe those rules and laws unless they are
truly desperate. I.e. they have already lost the war, and now it's a reign of
terror on their way out.

~~~
bostik
> _Modern warfare very much has rules and laws_

I have said this before, and this looks like a prime spot to say it again:

The only people who play fair are those who don't know how to cheat well
enough.

~~~
TeMPOraL
Or those who care more about the game than about winning. Which, in war, is
only the case for the side that's so sure of its victory it doesn't matter
whether they cheat or not, so they can play the virtuous party for good
internal and external PR.

~~~
bostik
Funny thing, that.

If you are in a war and are in the lucky situation that you can _prolong_ the
conflict without many negative effects, "keeping the game going" is in fact
waging a war of attrition.

------
paultopia
This is interesting, but there are difficult conceptual problems with
imagining that supervised learning can just be plunked into legal reasoning. I
have a draft chapter on this, if folks are curious (or have feedback!):
[https://osf.io/preprints/lawarxiv/gk2ms](https://osf.io/preprints/lawarxiv/gk2ms)

------
cameronbrown
If we can't keep up with our killing machine's actions, maybe we shouldn't
have them?

------
fwip
How dystopian - our murderbots will be held accountable by our lawyerbots.

~~~
nostrademons
It's sorta like AdTech, where machines carefully optimize the ads shown to
other machines and blocked by humans, with each layer charging humans for the
privilege.

~~~
cameronbrown
Most humans don't block ads though.

------
bloody-crow
Interestingly enough, dangerous technology has a negative bias towards
morality, meaning that an actor less concerned with moral aspects of advancing
such technology incurs less resistance in doing so.

If the US military succumbs to public pressure and stops developing autonomous
weapons (purely hypothetical, never gonna happen), Russia and China will not,
and we end up in a situation where the least moral actor has the most powerful
and advanced drones.

There's a similar situation in eugenics/genetics and other morally dubious
research areas.

------
alottafunchata
Man, the references to AWS threw me off

