Autonomous Weapons Need Autonomous Lawyers (dodlive.mil)
62 points by mhb 21 days ago | 57 comments



I disagree with the premise. There are no autonomous robots acting entirely on their own; there are autonomous robots that someone decided to deploy and target at something. It's just too easy to say "oh, the robots are autonomous, so nobody's really responsible for them." "Only robots can judge other robots" ... no, humans judge the actions of other humans. If we don't hold the decision-makers accountable, the descent towards global totalitarian rule seems inevitable.


Or better yet, autonomous weapons need to be prevented:

https://www.stopkillerrobots.org


Unfortunately that is impossible, as much as I wish it were possible. Russia, the US, and China will never risk falling behind in the field.

Maybe we are about to hit the great filter...


It would also be trivial to build one; you can find plenty of homebuilds for airsoft versions of autonomous weapons from the last decade or so.

Pretty sure the day the terrorists (specifically the domestic ones) start building this kind of crap is the day public spaces stop being a thing.


The reason public spaces are safe is not because it's hard to randomly kill people in public spaces, but because most people don't want to randomly kill people in public spaces.

This is unlikely to change much, even as autonomous weapons get easier and cheaper to make.


That's not correct.

The reason public spaces are safe is that it's hard to randomly kill people in public spaces and get away with it.

While you are correct that most people don't want to randomly kill people in public spaces, a non-trivial number of people do.

To date, we've only had to worry about the thin slice of that small group who want to kill people and don't care if they live afterwards, and we see them in the form of suicide bombers or the Las Vegas shooter from last year.

We will have much bigger problems when it's possible for people to kill large swathes of innocent people and have a high probability of getting away with it.


>We will have much bigger problems when it's possible for people to kill large swathes of innocent people and have a high probability of getting away with it.

You have just described every war and oppressive dictatorship in human history.


In case it wasn't obvious from my comment: when I used the term 'people' there, I was referring to individual non-state actors.

A drone or a drone swarm is an incredible force multiplier that will allow individuals to commit atrocities with a significant chance of not getting caught, which will make enforcement and prevention exceedingly difficult, if not impossible.


Sure...in peacetime.


The limiting factor on terrorist attacks is not a lack of young men willing to die perpetrating them.


I think we already did. It will just take time to materialize.


It’s not impossible. We always have a choice.


It's naive to say that. We as individuals do, but you cannot effectively control every individual in the world. It's best to assume that it will happen and try to put safeguards in place.


But I can't choose for you, and vice versa. Agreements and treaties do a good job, but a cursory reading of history indicates that they can also be violated.


There is a bit of a positive side to this - if our side uses murderbots, and their side uses murderbots...why bother with humans?

I hope, far in the future but not too far, wars are held among the stars with remotely operated or even autonomous starships battling over resources such as meteors or comets. It is in every nation state’s best interest to reduce the human and property cost of every battle.

The need for conflict will likely be forever a part of humanity. The need for death or injury, however, is quickly withering away.


Because the first party that cheats and starts killing the other side's people wins. Settling wars by a game (whether a robot conflict or a video game) is an unstable equilibrium, because it requires all participants to obey the rules. "War" is generally what we call it when a faction throws out the rulebook.


...no? Modern warfare very much has rules and laws governing what violence is permitted in international conflict, what constitutes a war crime, and how severe that war crime is. War is absolutely not "one or more parties throwing out the rulebook". As easy as it is to just murder a city in Civ 6, in real life any power that wishes to still be seated at any relevant table during future treaties and negotiations will abide by those rules and laws unless they are truly desperate, i.e. they have already lost the war, and now it's a reign of terror on their way out.


That works when one side has an overwhelming power advantage, or when two sides are fighting a proxy war with little skin in the game. The dominant side(s) can afford to enforce rules that minimize its losses and the chance for escalation, and enforce those rules on the other party because, well, it's dominant. Basically all wars since WW2 have been fought on this template.

It goes out the window when there is actual uncertainty about who might win. Unrestricted submarine warfare was considered a war crime in peacetime, but as soon as it became a crucial strategic advantage that might turn the tide of the war, both sides started using it. The atom bomb would probably have been a war crime if it had been the losing side using it; as it is, it remains controversial.

Once one side wins the war, they become the dominant power again, and can enforce a definition of war crimes favorable to themselves. Such an event actually occurred with unrestricted submarine warfare: Admiral Nimitz testified for Karl Dönitz at the latter's war crimes trial, because had unrestricted submarine warfare remained a war crime, the U.S. would have been guilty of it too.


'enemy combatant'

'disposition matrix'

'gitmo'

'enhanced interrogation'

These are all gross violations of both the spirit and the letter of international 'law', and they were committed by the most powerful country in the world.

Who can possibly believe war has rules that aren't completely ignored? Do people just ignore the horrific crimes committed by their own governments?


what do these clear rules and laws say about the following?

- US blows up various hospitals because "may contain insurgents"

- US blew up US citizen + his child without due process because "terrorism"

- US sanctions countries it doesn't like, causing civilians to starve/suffer/lack medicine

seems evident to me that one party doesn't really follow "the rules". how come they have a permanent seat at the UN security council? what punishment is prescribed for rule breakers? because it clearly isn't any loss of influence/prestige/sanctions/etc.


Tell that to the allies after WW2.

https://en.wikipedia.org/wiki/Disarmed_Enemy_Forces


> Modern warfare very much has rules and laws

I have said this before, and this looks like a prime spot to say it again:

The only people who play fair are those who don't know how to cheat well enough.


Or those who care more about the game than about winning. Which, in war, is only the case for the side that's so sure of its victory it doesn't matter whether they cheat or not, so they can play the virtuous party for good internal and external PR.


Funny thing, that.

If you are in a war and in the lucky situation that you can prolong the conflict without many negative effects, "keeping the game going" is in fact waging a war of attrition.


there has been no real warfare since WW2.


That's true in theory, but rules of war do have a long history of limited but very real success. The key is to keep a multipolar civilization so that the cost of defecting from the rules (for example, the Geneva Convention) is higher than the benefit of getting an edge in the current conflict.

Even in a bi-polar conflict, the threat of mutually assured destruction helps to enforce official and unofficial rules of engagement. The Cold War demonstrated that quite effectively.


AFAIK this history is less than 100 years old, per what 'nostrademons elaborated on in the thread. From what I remember from my history lessons, most wars - including both World Wars - were free-for-alls, with (what we consider today) war crimes committed left and right. And it's only to be expected: any party that believes it's fighting for its very survival will not stick to the rulebook unless the consequences are greater than the benefits, and they usually aren't.

I mean, the proposal here is equivalent to saying "let's solve all our disputes by a deathmatch in Quake 3 Arena" - stated this way, it's more obvious that the only thing stopping a party from cheating is threat of consequences in the real world. And once you have threats and counterthreats, you may as well do away with the Q3A game.

Hell, it seems to me that in a way, we're already doing "conflict resolution through games". Geopolitics is the name of the game, and diplomats and politicians are the players. There's full fluidity from purely virtual threats, through sanctions, up to trade wars.


the only reason people were killed en masse was that they were in the way of strategic interests; the strategic interests were in densely populated areas, accessed by land and by crude aerial targeting; and the only way to defend these things was to stem the advance using other people.

many of these things are no longer true.


The side that can destroy the opponent’s ability to build, deploy, and develop murderbots will have a big advantage. So they will target the murderbot economy and harm people in the process.


>So they will target the murderbot economy and harm people in the process.

Yeah, you can kind of see the escalation tree. Inevitably, there's a point at which some book-smart people come up with the idea of using AI techniques to have the murderbots protect their own reproduction capacity.

I hope everyone who decides to manufacture murderbots is wise enough to keep an "Off" switch of some sort.


Furthermore, while human life was "nasty, brutish and short", dying "in a blaze of glory" made some sense, trading away just a few years. But as we live longer and happier lives, that trade will be viewed as pathologically short-sighted.


In the Middle Ages, I understand that if you survived childhood, then adult life expectancy was largely what it is today. It's just that today, more people survive to adulthood?

So it never made 'sense' to die in a blaze of glory. Except for the immortality part.


>It is in every nation state’s best interest to reduce the human and property cost of every battle.

It won't happen; drones killing drones will never be as effective a means of coercion as drones killing people. Human suffering is what makes war effective.


What if one community has money to spend on drones while the other does not? What if both communities have enough money and value human life, so they just send endless drones at each other, smashing said drones into pieces, until one community still has money to spend on drones and the other does not? It could be about power, and from what I've seen, the people that come from that don't like to not come from that. Complacency is a word that comes to mind.


Imagine people build super weapons and super defenses. With the push of a button you can kill 1000 people. With another you can defend the lives of 1000 people. As the arms race progresses the impact of the offense and defense tech grows. One day the weapons are strong enough to kill a billion people and defend a billion people.

One side runs out of defensive drones. How good is their negotiation position? They would have zero power.


Counter-hypothetical: Nations have drone-centric militaries. Nation X moves to attack nation Y's major cities. Y notices and sends its drones to intercept and defend in locations away from the cities. If X or Y loses the ability to fight back, why wouldn't it surrender?

I don't see how the decisions are different from our current wars where humans carry the weapons.


Nation Y is still defending human beings and human territory, so the incentives remain the same.

What's not going to happen is X and Y agreeing to send their drones out into the desert or somewhere away from people to fight things out. War is the external application of a nation's monopoly on force - it just doesn't work unless that force is applied, directly or indirectly, against human beings.


I still don't see the difference between sending humans to attack cities with lethal force versus sending drones to do the same. It's how war's been waged for centuries. World War II took place in cities and fields. No reason to think drones would change the strategies we've used since combat planes.


Wars happen due to material and ideological differences. If country A engages country B in a robot war and all the robots die, the humans and the differences will remain.


A big part of war is morale. People have to want to send their 20-year-old children to die in a foreign country. No war happens without support from the people fighting it. But what happens to war when that's not a thing anymore? What will happen when we just have a bunch of robots killing each other? Nobody will have any personal reason to protest or speak out against a war. It's the ultimate abstraction.

You could call this a good thing or a bad thing, but I'm not smart enough to figure out which side is more likely to be true.


If all belligerents had access to robots for fighting each other; if only robots could be used in warfare; and if the outcome were to have no effect on the welfare of the people, then the people would be impartial as to the outcome. If the outcome didn’t matter, there would be no reason to even manufacture the robots, let alone send them to fight each other. Because only robots could fight, war would be moot.

My point was that unless people change, the causes of war will remain.

“No fate but what we make.” -Terminator 2: Judgment Day


Most "modern warfare" is asymmetric in nature where one side does almost all the murdering and the other side mostly just gets murdered. It may seem to you that the, "need for death or injury" is quickly withering away but I invite you to go ask someone from Iraq, Libya, Syria, Afghanistan or any other country that has been on the other end of it what they think about death and injury.


You don't see a situation where one side can mass-produce bots and completely dominate (zerg...)? Such a side would likely also force the losing side into slavery or starvation.



Historically, weren't some wars decided via a duel between two champions? Or is that just fiction?


David vs Goliath, Achilles vs Hector (the war did not end); that's about it. My understanding is that this occurred only in mythical/very ancient contexts.


why not just do it in a video game instead


This is interesting, but there are difficult conceptual problems with imagining that supervised learning can just be plunked into legal reasoning. I have a draft chapter on this, if folks are curious (or have feedback!): https://osf.io/preprints/lawarxiv/gk2ms


If we can't keep up with our killing machines' actions, maybe we shouldn't have them?


How dystopian - our murderbots will be held accountable by our lawyerbots.


All we need now is a simulation to plug ourselves into. And this is how The Matrix was made.


It's sorta like AdTech, where machines carefully optimize the ads shown to other machines and blocked by humans, with each layer charging humans for the privilege.


Most humans don't block ads though.


Then the lawyerbots will start moving too fast, too, so we'll need laws written and approved by congressbots.

You know, maybe the Unabomber had a point there...


Horrifying.


Interestingly enough, dangerous technology is biased against morality, in the sense that an actor less concerned with the moral aspects of advancing such technology incurs less resistance in doing so.

If the US military succumbs to public pressure and stops developing autonomous weapons (purely hypothetical, never gonna happen), Russia or China will not, and we'll end up in a situation where the least moral actor has the most powerful and advanced drones.

There's a similar situation in eugenics/genetics and other morally dubious research areas.


Man, the references to AWS threw me off



