Hacker News

That is one place AI should 100% be regulated. There should be no weapons development where AI makes the kill decision. This is as important as the bans on chemical weapons, biological weapons, and landmines.

When the weapon has the ability to kill indiscriminately (as in the above categories) it should be banned. AI should never be considered sufficiently able to discriminate to make a kill decision.

If there is no kill decision, only wait-and-fire behavior, then it is equivalent to a mine.

100% should regulate / should ban.




Unfortunately, there are countries that don't care about your regulations. I'm not even sure a regulation like that could pass in most Western countries.


Fortunately it wouldn't be a regulation passed by individual countries. It would be a treaty on war crimes:

https://www.unog.ch/80256EE600585943/(httpPages)/F027DAA4966...

It would likely pass and be agreed to easily, just like the treaties on mines and on chemical and biological weapons.


Just like mines... https://en.wikipedia.org/wiki/List_of_parties_to_the_Ottawa_...

Seeing as the US has yet to sign the Ottawa Treaty, I wouldn't call it easily agreed upon by any means.


OK, but what if the human makes the decision to kill, and the AI just does the aiming?

We already have computer-controlled missiles, right?



