Hacker News
UN doc: military drones may have autonomously ‘hunted down’ and attacked troops [pdf] (un.org)
34 points by giuliomagnifico 24 days ago | hide | past | favorite | 19 comments



Page 7 of the linked pdf report:

> Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (see annex 30) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true “fire, forget and find” capability. The unmanned combat aerial vehicles and the small drone intelligence, surveillance and reconnaissance capability of HAF were neutralized by electronic jamming from the Koral electronic warfare system.47


Don't Air-to-Air missiles already do this? (fire and forget)


What does fire and forget even mean? That the missile follows its target automatically and you don't have to think about it anymore?

Interesting, then the only difference between an "autonomous killer drone" supposedly uncovered by the UN and a plain and simple fighter jet missile is... the duration of the flight of the drone vs the missile?

But then perhaps a missile like an ICBM would take longer than the drone here took? So perhaps even that differentiator does not work.


Fire and forget means it requires no more input/information from whatever fired it. The benefit is that the platform that launched it is free to do other things.


Perhaps a notable difference would be the civilians that could come into proximity of the target from the point of deploying to the point of executing.


Beyond visual range engagements require a powerful radar that cannot fit in the missile itself. The missile can only navigate the last leg of its journey without relying on external information.
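The mid-course/terminal split described above can be sketched as a toy decision function. Everything here (the seeker range, the function name, the phase labels) is an illustrative assumption, not any real missile's logic:

```python
# Toy sketch of a beyond-visual-range engagement: the missile steers on
# external (launch-platform) radar updates until its own small onboard
# seeker can acquire the target, then finishes the intercept on its own.
# All names and numbers are made up for illustration.

SEEKER_RANGE_KM = 20  # assumed acquisition range of the missile's own seeker

def guidance_source(distance_to_target_km, datalink_alive):
    """Pick where steering data comes from at a given moment of flight."""
    if distance_to_target_km <= SEEKER_RANGE_KM:
        return "own seeker (terminal phase, fully autonomous)"
    if datalink_alive:
        return "launch platform radar (mid-course updates)"
    return "inertial dead reckoning (no updates at all)"

# Early in flight, the launching platform's large radar does the work;
# only the last leg is flown on the missile's own sensor.
print(guidance_source(80, datalink_alive=True))
print(guidance_source(15, datalink_alive=False))
```

The point of the sketch: "fire and forget" only means the last branch or two can run without the launch platform, not that the whole flight is independent of external information.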


I’m a million miles from this area, so I’m interested: [why] is a machine killing people more immoral than people killing people?


So because people kill people, machines should be allowed to kill people? That's not even logical. I can see a valid argument for why machines should be allowed to kill other machines, but why do you think machines should be allowed to kill people?


> So because people kill people, machines should be allowed to kill people? That's not even logical. I can see a valid argument for why machines should be allowed to kill other machines, but why do you think machines should be allowed to kill people?

This argument is facetious and irrational. You can argue over whether it's acceptable to kill at all. Beyond that, it makes strictly no difference whether you kill someone with your bare hands or by pressing a button.


Why do you think they shouldn't be allowed? We have been using machines to kill for a very long time, and this is the first I'm hearing of someone having an ethical problem with the fact that the killing is not done with bare hands.

There may be ethical questions in war, but I don't see why the mechanisms should matter. Perhaps they should, but you haven't even come close to making that case.


No offence, but that’s such a back-to-front interpretation of my simple question, it’s almost like you were trying to be contrary.

Maybe my bad for assuming that any reasonable person reading wouldn't need it stated that killing people is, yes, immoral. And given this shared understanding, the root of my question was why a machine killing people is more immoral.


Well, I guess for one: who's held accountable if it kills the wrong people? The engineers? Probably not. The higher-ups? I'm sure they've mastered the art of weaseling their way out of consequences by now.


We flew B-17s over Germany and B-29s over Japan and dropped unguided munitions from far above. Britain flew bombing raids at night. Not sure about who was held accountable for killing the wrong people then.

Generally speaking, over time we have become much more accurate at targeting specific people to kill, but we are still a very long way off from 100% accuracy, and none of the issues you raised are particularly novel to drones.


> We flew B-17s over Germany and B-29s over Japan and dropped unguided munitions from far above. Britain flew bombing raids at night. Not sure about who was held accountable for killing the wrong people then.

I would go a step further and point out that while bombs are at least aimed when fired, landmines are designed specifically to kill and/or maim indiscriminately.


Bombs are "aimed" at a general region that could be a disk several miles in diameter. Similar issues exist with artillery. Landmines are "aimed" at whatever happens to cross a particular small area. In one sense, they are much more accurate than bombs dropped from a high-altitude bomber, but as we all know, they are more indiscriminate in another sense.

Point being, all these tools are quite variable in who they kill, and whatever we are doing now with drones is going to be much more accurate than either of these earlier technologies. (I am not arguing that modern warfare is more "ethical" than older warfare, just that the parent poster seems to be a bit late to the party in observing that tools can be indiscriminate in who they kill.)


Who decides who the "right people" are and what makes them correct?


> [why] is a machine killing people more immoral than people killing people?

By detaching the moral choice from a person it makes killing more acceptable.

At least that's my take.


Exactly. We never needed machines before to kill each other or commit genocides or kill babies. We even vote murderers into office.


We already use "machines" to kill people. Guns, grenades, and flamethrowers are all machines, each far more advanced than its predecessors, and people raised objections at the advent of those weapons too.

People will raise objections at the advent of these weapons as well. That is not going to stop the Pentagon, China, or Russia from using them.

One issue could be that they are not as reliable at seeking targets as they are effective at destroying them. Then again, plenty of mistakes have been committed by the humans in charge.

Whether autonomous weapons systems will be better at seeking and killing enemy combatants rather than innocent people is something to observe.

Right now, autonomous systems are very unreliable at this.



