
Ethics in Lethal Robots - cstejerean
http://lemonodor.com/archives/2008/02/ethics_in_lethal_robots.html
======
ivankirigin
Robots are tools, like land mines and cruise missiles are tools. There is
nothing more complicated than that. They are amoral.

If you want to have the Kurzweilian discussion about robots as smart as
humans, then you can't program the ethics, just as you can't program ethics
into humans.

Papers like this are just completely premature. A discussion about practical &
intelligent safety systems for robotics would be a better way to describe the
intention.

~~~
cedsav
I have just read the introduction, and it does seem that the author is looking
far ahead, but I don't think it's too early to reflect on the ethical aspects
of lethal robots.

Science moves faster than the field of ethics (e.g. cloning), and the US
already has lethal robots on the battlefield. They are not autonomous yet, but
we're certainly heading there. When the robot 'assists' the decision by
providing additional data (say a sensor detects a hidden weapon on an
otherwise civilian-looking guy), it already skews the decision process. As the
author mentions, there will be a point when the military will start to think
that their robots are better at making the shoot/not shoot decisions...

~~~
ivankirigin
That's not how the military works.

At least for the next decade, humans will always be making the decisions. I
know. I've met with people in the military about weaponizing robots.

The real safety addition comes from software-aided aiming and added
situational awareness. Imagine trying to manipulate a gun remotely with lag,
vs. just saying "fire" to an auto-tracking gun. It's a world of difference.

But this is why I call it a "safety system". It has nothing to do with ethics.
It's like that IR beam that makes sure your garage door doesn't close on
someone. Useful, important, and nothing to do with ethics.

