Call for ethical debate surrounding use of "killer robots" (bbc.co.uk)
18 points by theblackbox on Aug 4, 2009 | hide | past | favorite | 30 comments


No one should consider themselves informed on the subject of "killer robots" without having read this:

http://vinay.howtolivewiki.com/blog/global/the-second-amendm...


"What this effort will do, if it is successful, is defang ... all populations, overthrowing the protective effect individual firearms ownership, destroying the intent and effectiveness of the Second Amendment, and unbalancing the Constitution permanently through evolving technology which end-runs around the original intent of the Framers."

Hasn't this already happened? Iraq very clearly demonstrates that a massive force will overrun you no matter how many arms your citizens possess. Citizens usually don't possess missiles, anti-tank weaponry, large-caliber arms, etc.

You don't need significantly advanced technology; you just need slightly better weapons, and slightly more of them.

The flipside, though, is that we could reinterpret software and hardware as arms; if robots are standing on the street corners, the ability to fight back electronically becomes the only option.


Seconding that. It's a long piece and takes a while to read, but it's well worth it.


Seconded. Although it focuses mainly on the US, consider too the implications in a society like the UK where surveillance cameras are extremely thick on the ground.


"The problem, he said, was that robots could not fulfil two of the basic tenets of warfare: discriminating friend from foe, and "proportionality", determining a reasonable amount of force to gain a given military advantage."

That is true for autonomous robots, but we are not there yet. And if the choice is between a UAV overhead, with cameras and sensors on the target, firing guided missiles, and a UAV overhead with a camera while someone fires artillery at the target from a few miles away, the first option is more accurate and less likely to cause collateral damage.

He also talks about how they further distance soldiers from battle. Not really: if you are a regular pilot, you drop your weapons on the target and get out of there, and you hardly see any of it except on film later, because you are moving so fast. UAV operators keep the UAV overhead and watch before, during, and after the attack far more closely than pilots do. It turns out this can be more emotionally damaging than manned flight, in part because the absence of personal danger triggers different emotions.


The US had better get out in front of this, engage in the debate, and get something it can sign. Otherwise the result will be something impractical, and both 'good' and 'bad' guys will get a pass.

Don't pretend this will stay unaffordable: robots will eventually be cheap enough for both sides to go this automated.

Engagement is the better option.


The lower the price of something the higher the volume.

Risking yourself while killing somebody else increases the price on your side. If that risk disappears the 'cost' gets so low that the volume might increase dramatically.


On the flip side, the cost of caution also decreases.

Scenario: a person is approaching a checkpoint and does not obey verbal commands. No civilians are present near the checkpoint.

Human soldier: "Oh shit, shoot him, I don't want to get blown up!"

Robot soldier: "Direct extra scrutiny towards target. Do not allow him to approach civilians. Allow him to approach the checkpoint. I only cost $100,000, no reason to kill a human to save my metallic skin."


Wrong. The manufacturing cost of the robot may be a mere $100,000. But besides the argument that human lives are often valued at less than that, there's also the security argument: 'we can't allow our technology to be stolen and exploited by the enemy'. I can pretty much guarantee you that one will get trotted out. It's not very logical, but that has never mattered in war.

I mean, let's not be naive here. When the fighting in Iraq was intense a few years ago, the biggest threat to coalition (and especially US) soldiers was 'IEDs' - improvised explosive devices, which in turn was a fancy name for 'homemade mines'. Generally these were triggered by cellphone and there were many comments about how cowardly and evil this was (because it was happening to us). Imagine the reaction if these had been mobile devices - or more accurately, when they are. Because while there are many engineering challenges involved, there's nothing fundamentally difficult about it.


The tragedy is, to the people calling the shots (ie. the invading military or government), a foreign civilian's life would probably be worth far less than $100,000.

As a previous comment notes, civilians killed are thought of as 'collateral'. That's just not acceptable. The burden lies with the occupying/invading forces to do everything possible to make sure civilians aren't killed or harmed in any way. Even actions that would lead to rations being withheld or basic needs not being met could be considered a breach of international humanitarian law and the fourth Geneva Convention.


I'd support a Geneva Convention-style ban on robotic warfare.

On second thought, not an outright ban, but there should be some serious limitations put in place before we completely dehumanize the process.


Honestly, I don't think this will make any difference. The Geneva Conventions seem to be left on the shelf in cases of asymmetric war, not least because the weaker side is highly unlikely to be in a position to prosecute violations afterwards.


Professor Sharkey shows a great deal of social and political insight for someone whose academic specialization is in neither of those domains. Let's hope he continues to make these points heard.


"While 14 al-Qaeda were killed, some 687 civilian deaths also occurred…"

That is profoundly disturbing. How can governments justify the continued use of drones with statistics like that? There needs to be an ethical debate, yes, and urgently.


It's not the drones, it's what they consider a fair exchange. Guerrilla wars are like that.


Ask the families of the 687 civilians if they thought it was a fair exchange...

It's the dehumanizing use of terms like 'fair exchange' and 'collateral damage' that makes this kind of atrocity possible in the first place. If we just called it by its name (cold-blooded murder by remote control), it would be looked at quite a bit differently by the general public.


I think his point is that the only thing new here is drones. The numbers themselves are old and that is what we should be correctly outraged about. The exact type of weapon used is not relevant. (Except for things like white phosphorus, obviously.)


I think drones are a major game changer. They increase the distance between the humans on the receiving end and the humans on the 'giving' end to such an extent that there is now a major disconnect. Regular fighter jocks at least put their own lives on the line as well, drone pilots could literally do their jobs from the other side of the planet while eating donuts.

It doesn't matter much to a victim what weapon they get killed by; dead is dead. That doesn't mean we should not outlaw certain types of weapons (white phosphorus is an excellent example, but drones would qualify as well, imo).

My guess is the parties that 'have' the drones would never agree to give them up on the insistence of the 'have nots'; it's too much of an advantage. War becomes a simple decision, with none of those pesky losses to explain to the home front. Only some energy and steel are lost, and there's plenty of that to go around before the home front realizes there is a cost to war.


The military's job is (essentially) to save some people's lives by killing other people. It is a distasteful job, but (most people agree) a necessary one. Therefore what the military needs to be good at is killing targeted people with minimum loss of life to others. "Others" includes members of the military.

If drones result in no more loss of civilian life than other methods of attack, but with less risk to the operators' lives, they make the military better at its job.

If there are too many civilian casualties, the solution should not be "make the military less good at its job". It should be "make the military's job less necessary".


There is a difference between fighter pilots and drone pilots, but not between drone pilots and whoever's job it is to push the launch button on any one of the many missile types we've had much, much longer than we've had drones. So war = bad, drones = red herring.


I disagree. Almost always, missiles are targeted against fixed military installations. Of course there are tactical missiles fired at aerial and ground targets, but again these are optimized for, and usually deployed against, hardened military targets like tanks or gunships.

The problem here is not one of drones firing at columns of tanks, but the fact that they are being deployed as anti-personnel devices targeted at guerrilla leaders. I am OK with targeting such people (and as it happens, I support the idea of fighting and defeating the Taliban in Afghanistan, but I'm very disappointed with the implementation). The kind of drones we use now are basically lightweight planes, and given the physics of fixed-wing flight, that means fly-bys and high-yield single-use weapons.

If your intel is good and you have found an isolated Taliban training camp, then OK. But if it's poor, you're throwing a lot of destructive power at the wrong target. Whereas a sniper team might employ a scope or long-range microphone and observe the presence of many women and children or singing and dancing (conclusion: might be a wedding party), a drone on flyby can identify the existence of a target, but is poorly equipped to identify the nature of the target...which is one reason we've blown up a lot of wedding parties in the last few years.

Realistically, we can get away with it to an extent because the US is a big powerful country that can throw its weight around (and is allied with other relatively big and powerful countries). But a 50:1 kill ratio for civilians:bad guys is piss-poor - even if you assume a degree of dishonesty and propaganda on the other side, a ratio of 25:1 or even 10:1 is still piss-poor and exactly the sort of thing held up as an example of moral failure in history. By depending so heavily on tools which do not allow easy discrimination of military and civilian targets, we weaken our freedom to act effectively and early in other contexts.

I hope it's clear that my argument here is economic rather than political.


In fact, the 50:1 ratio will probably create two new fighters for every one killed. Most people have families and don't like seeing their relatives blown up. Do not put people in a position where you've taken away everything they live for.


Excellent point, thank you.


"The exact type of weapon used is not relevant. (Except for things like white phosphorus, obviously.)"

It isn't obvious to me why incendiaries (such as phosphorus or molotov cocktails) fall into some special class of "bad" weapons. Could you explain?


For one thing, the use among civilian populations of white phosphorus as a weapon (rather than as part of a smoke screen) is of dubious legality; see http://en.wikipedia.org/wiki/White_phosphorus


I think the legality is pretty clear. Protocol III of the Convention on Certain Conventional Weapons bans incendiaries under many circumstances, but the US has not adopted it.

http://en.wikipedia.org/wiki/Convention_on_Certain_Conventio...

http://www.icrc.org/ihl.nsf/NORM/C1C4BFCF736BF820C1256402003...

Some activists also wish to call it a chemical weapon, but by that logic lead bullets are also a chemical weapon (since lead is toxic).


Britain has. Using or taking part in an action which uses American WP munitions could (theoretically) be a court-martial offence.


Bombs don't do a good job of discriminating between soldiers and civilians, so that's bad.

Bombs which don't so much explode as burn you alive are even worse.


That's the wrong question to ask. A drone is just a regular combat aircraft whose pilot operates it by remote. If you get bombed by an F16 or bombed by a Predator drone, it's a human that flew it there, it's a human who gave the order to attack, it's a human whose finger pressed the trigger. There is NO autonomy in any of the hardware in either case.


This seems like a highly questionable conclusion. 'No autonomy' means (to me) no kind of technological assist whatsoever, not just no executive power.

Consider the humble digital camera. You likely have one close by. My old point-and-shoot has 'intelligent' control of focus and aperture, and is 'smart' enough to look for shapes resembling human faces and prefer them as focus targets. Newer models from various makers can recognize babies (as distinct from adults), track moving objects, and so forth. Now, if you have any experience with a manual (film) camera you can't help but be aware of how useful these new tools are, even if you don't depend on them. I chose cameras as an example because they're cheap (almost commoditized, in fact), the degree of 'assist' offered by the technology is quite startling, and it seems to be accelerating with every new generation of camera.

Back in the military context, consider the hierarchical nature of military command. Say you are a submarine commander in wartime. Your job is to sink enemy ships, but you don't want to accidentally sink any ships belonging to allies or neutrals, and (to a lesser degree) you want to avoid sinking even the enemy's passenger ships, where the military gain would be marginal but the propaganda cost to your side would be significant.

You, the commander, are the executive who makes the decision to fire a torpedo. But to do so, you rely on a mix of incoming intelligence about shipping movements, the judgment of sonar operators about the signature and identity of passing ships, a larger staff who orient your craft and ready your weapons, and finally a small officer staff whose functions are to implement your orders and also to draw your attention to possible error. You, as an executive, are replaceable, mainly by necessity but sometimes by choice; the existence of a tiered officer class, in combination with a military code, serves as a useful check on excess, though not a foolproof one.

Now, as far as drones go, the operator is equivalent to the executive. However, the operator's decision is only as good as the quality of information relayed to him or her by subordinates. In this case, the subordinates are increasingly heading towards being technological systems. I would really be surprised if these UAVs did not include some degree of auto-targeting based on, say, high-amplitude infrared signatures. We know (from our experience with cameras, and many other contexts you can think up) that real-time signal processing is both affordable and ubiquitous, and we also know that good systems design involves leveraging technology to increase efficiency wherever possible.
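To make the "affordable real-time signal processing" point concrete, here's a toy sketch (not any real targeting system, and the function name and numbers are invented for illustration): flagging "hotspots" in a 2D grid of infrared intensity readings takes only a threshold and a scan.

```python
# Toy illustration only: simple thresholding over a 2D grid of
# hypothetical infrared intensity readings. Cells hotter than the
# threshold get flagged as candidate targets.

def find_hotspots(frame, threshold):
    """Return (row, col) of cells whose intensity exceeds the threshold."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value > threshold]

# A 4x4 "IR frame": cool background with two hot cells.
frame = [
    [20, 22, 21, 19],
    [21, 95, 23, 20],
    [19, 22, 24, 88],
    [20, 21, 20, 19],
]

print(find_hotspots(frame, threshold=80))  # [(1, 1), (2, 3)]
```

The point is not that real systems are this crude, but that the basic capability is cheap, which is exactly why it will be leveraged.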

So the point (finally...sorry) is that the human autonomy you cite is probably somewhat less than total, insofar as it relies on technological assists which aid in the selection and tracking of targets. This is a very marginal loss of autonomy, to be sure - one that may seem so slight as to be negligible, and indeed one which is often positive insofar as it may reduce human error. But:

a) as we all know, technology frequently displaces skills to the extent that such skills can be systematized, and the military is all about doing things systematically

b) it's relatively easy to program a machine to listen/look for the signature of (say) automatic gunfire, but we are a long way from being able to explain to one that in some cultures, this is considered a good way to celebrate a wedding.
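Point (b) can be illustrated with a toy sketch (no real acoustics, and the function and numbers are invented): picking out a regular impulse train, the rough signature of automatic fire, is a few lines of code, while the question of whether those shots mark an ambush or a wedding celebration is simply not in the data.

```python
# Toy sketch: detect a regular impulse train in a sampled signal by
# checking that loud peaks occur at (nearly) constant intervals.
# Nothing here can tell celebratory fire from hostile fire.

def is_regular_impulse_train(samples, loud=0.8, tolerance=2):
    """True if loud peaks are evenly spaced (within tolerance samples)."""
    peaks = [i for i, s in enumerate(samples) if s >= loud]
    if len(peaks) < 3:
        return False
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    return max(gaps) - min(gaps) <= tolerance

# Quiet background with a spike every 10 samples, like cyclic fire.
signal = [0.05] * 50
for i in range(5, 50, 10):
    signal[i] = 1.0

print(is_regular_impulse_train(signal))  # True
```

Detecting the pattern is the easy, systemizable half; interpreting it is the half that stays human.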



