Hacker News

The only sensible thing to do is to pass laws that guarantee that the people who implement autonomous weapons are blamed for anyone who is wrongfully killed by them.

Because if you have someone pulling the trigger, you know who's to blame. But if a computer's doing it, it's oh-so-easy to shift blame.

Honestly, with that requirement in place, either AI weapons will never be deployed (because implementers can't guarantee against wrongful deaths) or they'll be implemented near-perfectly (possibly making the world safer).

Passing a law like that could possibly lead to a win-win situation.



Which will punish researchers in countries that uphold such laws, but won't do anything about countries that either refuse to agree or simply cheat. So the countries that sign and uphold such treaties will find themselves lacking a very important and advanced technology that their adversaries have.

So, the question is: do you want countries that either don't care about humanitarian values, or only pretend to care, to have the upper hand over countries that can not only pass such laws but also enforce them?


Countries that refuse to listen to others currently tend to have technology programs far from the leading edge, as well as low GDP, making them basically unable to implement any sort of AI weapon.

We have nothing to worry about. This is the same kind of reasoning many use to claim that terrorism is a threat to the United States. Really? http://www.state.gov/j/ct/rls/crt/2014/239418.htm


This was true at the end of the twentieth century, but it is changing now. China, India, Iran, and Russia are examples of such countries. Each has serious domestic problems and is, individually, small in comparison with the US/EU economies, but they nevertheless have enough resources for AI research, which is significantly cheaper than nuclear or rocket research, for example.


I concur with your thoughts; the big difficulty with enforcement here (beyond nukes) is figuring out how to detect when someone breaks the rules. The Fourier Transform was developed to help determine whether nuclear weapons tests were taking place (https://www.youtube.com/watch?v=daZ7IQFqPyA); it's very hard to see how you'd detect AI testing in order to enforce any treaties...
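(As an aside, the kind of frequency-domain check this refers to can be sketched in a few lines: take a sampled signal, compute its spectrum, and look for a dominant frequency. This is only an illustration of spectral analysis; the sampling rate, frequency, and noise level below are made up, not real seismometer values.)

```python
import numpy as np

# Sketch: find the dominant frequency in a noisy sampled signal via FFT,
# the basic operation behind spectral analysis of seismic recordings.
fs = 100.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)      # 10 seconds of samples
signal = np.sin(2 * np.pi * 5.0 * t) + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal))      # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)   # frequency of each bin
dominant = freqs[np.argmax(spectrum)]       # peak bin -> ~5.0 Hz here
print(f"dominant frequency: {dominant:.1f} Hz")
```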


> The Fourier Transform was developed to help determine if nuclear weapons tests were taking place

{{citation-needed}}

http://math.stackexchange.com/questions/310301/how-was-the-f..., https://en.wikipedia.org/wiki/Fourier_analysis#History


> The only sensible thing to do is to pass laws that guarantee that the people who implement autonomous weapons are blamed for anyone who is wrongfully killed by them.

This only works if you can ensure that everybody obeys the law. You can't. The primary threat here is not some rogue militia in the US building autonomous weapons; it's a country like Iran or North Korea, that doesn't give a rat's ass about laws, building autonomous weapons.


So what will an open letter do to stop it?

My assumption is that open letters only work on people who are willing to rationally discuss alternatives. The countries you mentioned probably don't care about anything.

I'd like to see change at least in the US. I'm not worried about a "rogue" militia either; I'm worried about the Army et al.


> So what will an open letter do to stop it?

Probably not much.


Just like all the drone operators who hit civilian targets and were court-martialed.


I'd rather the blame go to the implementers of the policy than to the executors, because the implementers often have the most power. But yes, I admit you are correct, and I wish it were not this way.


Why stop with the implementers? Why not also blame the janitors who clean the office they work in? Why not blame the project managers who oversee the project? Or the politicians who approved funding for the technology? Or the voters who voted those politicians into office?

The human need to assign blame to a single source is an evolutionary remnant that we should be aware of and try to correct for, rather than embracing it. Just because something is terrible doesn't mean you can put all the blame for it on one person or group of people.


Blame whoever has the power to stop it. I wouldn't blame a janitor because they likely have nothing to do with what's happening. An interesting form of slippery slope.



