It's not like nuclear weapons, where you need a huge industrial operation with tens of thousands of high-tech centrifuges running for years before you have enough fissile material for a bomb. Even the rulebreakers, like Israel, couldn't hide the fact that they were doing it and had to rely on America's UNSC veto to save them from the consequences.
But lethal robots are a trivial step away from the current state of robotics. Hobbyists have already mounted guns on quadcopters. Tesla and Waymo have made robots that kill people accidentally. Any robot can be made a killer robot with minimal change to its software and hardware.
The ban isn't a flip-switch that says: "hey, no more of these", it's the start of a formal dialogue over our relationship to these new technologies and how we would like to see them used on a worldwide basis.
If a country were to opt to drop thousands of flying drones with guns onto a military base, is that different from chemical weapons? A fuel-air bomb? Conventional explosives? Sending special forces in?
Yeah, they're all different with different repercussions, long term impacts, potential civilian casualties, etc. etc.
I think we can pretty much count on every group with a gripe reaching for autonomous drone swarms first in the future.
They do run into the problem you described. Most militaries have them. Paramilitaries and sub-state forces (even criminals) have them. They sometimes get used, even by relatively law abiding forces.
Despite all that, those treaties do significantly impact their use in various ways. Diplomatic concerns do affect choices in conflict. These treaties put a diplomatic price on the use of certain weapons. They limit trade in these weapons.
Weapons treaties are not about perfectly enforceable, internally consistent rules. No rules of war are. That doesn't mean they are entirely worthless.
Not much point engineering something you can't sell.
It's not a perfect solution, but it's a positive outcome.
It also puts diplomatic and PR pressure on countries that use generally prohibited weapons, which is also good.
The problem generally is that technology continues to improve and the cost of doing things continues to drop.
I've no idea what it would cost to build a cruise missile using off-the-shelf components and props instead of jets (and I'm not going to Google it), but on YouTube there are people operating remote-control aircraft at long ranges with live video feedback. That's a payload away from a smart bomb already.
In reality targets are soft enough that you don't need to do those kinds of things to attack a civilian area but that doesn't mean they couldn't.
The recent drone at Gatwick shows the problems.
I don't think Israel (or India, Pakistan and North Korea) broke the NPT rules. The treaty doesn't regulate non-signatories and allows countries to withdraw from the agreement. North Korea is the recent example: they gave their three-month notice, withdrew, and tested nukes without violating the treaty.
Ultimately, I don't really think the treaty was the biggest direct factor limiting proliferation.
When it was signed, peaceful nuclear tech was expected to be the most important technology of the century. The treaty was supposed to exclude non-members from accessing peaceful nuclear technology. It did that quite effectively (Israel never had nuclear power for this reason).
But unexpectedly, it's 2019 and nuclear energy isn't that important.
The NPT did help form a consensus though, and police that consensus outside of the treaty. Helping anyone nuclearize is taboo. Five more countries (possibly six) did develop nuclear weapons after the NPT, but the barriers were pretty huge, diplomatically and technically... because they got very little help.
Only because most of the nations that haven't got them but are seeking them are poor nations that don't have a lot of close friends among the nuclear nations. It's kind of a "got mine, screw you" deal. Iran and the like have just as much right to develop and hold nuclear weapons as everyone else.
The thought of "regional troublemaker" countries having nuclear weapons bothers nations like the US, Russia, France, etc. for the same reason some people at the top of the social hierarchy don't want poor people to have easy access to firearms. Those already at the top generally don't like seeing the playing field leveled, and that's true all the way from the individual level to the national level.
In a democratic society, if the party you want to push around is well enough armed to hurt you back, that limits your ability to push them around to cases where you are actually considered to be "in the right" by the court of public opinion; otherwise the people will promptly decide that it's not worth it.
China, UK & France are in. Japan & Germany are out, because they were occupied and demilitarised at the time. Both large, rich countries. China was large, but poor at that time.
Everyone turns a blind eye to Israel (they're in the in group) but if any middle eastern, African or South American nation were to seriously pursue nuclear weapons the hand wringing would be immense (and is, in the case of Iran).
The effectiveness of enforcement is no argument.
Inability to enforce is exactly why prohibition has been a colossal failure, so how is the ability to enforce irrelevant to a law's success?
We collectively agree to ban certain things (e.g. murder), because there are multiple reasons to do so. Some of these reasons are humanitarian, some rational, some economic etc. Yet we see proof each day that neither laws nor punishment will stop murder from happening.
The idea of a ban is not to stop things entirely, but to increase the (social, economic, ...) cost of certain behaviour.
Any actor who fears that cost will abstain from chemical weapons, for example. But just like with murder, you will always have actors that either decide it is a price worth paying, or never thought about it at all.
Hard to say how effective those bans were, but they certainly helped to nudge some actors into adopting higher standards.
And once there is a standard the majority agrees on, it is hard to go back to a lower standard.
Politically costly, as more and more people pressed for more humane ways of war.
Because mines are a very effective and cheap military strategy (where mines are, you don't need as many troops). I'd say it was the tabloid pictures of children without legs that did it. And maybe it will happen again after the first killbots go rogue.
Because I doubt any big military would pass up the opportunity to at least be able to flip the switch and let them operate and shoot autonomously, when you have more cheap mass-produced robots than operators and need them, or the enemy is jamming you and the situation is critical.
Land mines are by far the cheapest modern method for area denial unless we're considering dirty bombs.
We’re hoping society agrees that making autonomous lethal robots should be Bad and deserve punitive consequences. And hopefully, those who commit this crime will be caught and suffer the consequences.
Laws should be clear and non-burdensome for the majority. And violations should be caught at a high rate. My 2c.
In the first case they may bury the bodies and set up an armed patrol of some sort against roving bandits, but the murderer would never be punished per se.
However, if nearly everyone drinks and alcohol is illegal but bars operate openly, then yeah, that law is bad; it is only being used as a pretext.
No human agreement will ever be able to stop a 100% of anything. But that doesn’t imply it isn’t worth increasing the cost (financial, social and otherwise) to certain types of behaviour.
People seem to agree quite uniformly with banning and punishing murder, for example, although these laws and punishments don't seem to stop murder, as centuries have proven. Few would murder their neighbour over a small dispute, because the cost of doing so is extremely high. Far more people would, say, kill the rapist of their child, because the notion of cost might be irrelevant under such circumstances and it is much more socially acceptable to do so.
Of course there is more than just cost. It can also be a matter of culture. If you live in a culture where getting shot for minuscule reasons is not something that happens, you are also far less likely to see shooting somebody for minuscule reasons as an option.
Just like the world can mostly agree it was evil to arrange a civil war in Eastern Ukraine or invade Iraq, etc.:
There will always be people who more or less genuinely argue that the last invasion of Iraq was necessary or that a certain superpower next to Ukraine has nothing to do with the well supplied rebels in their back yard.
For the rest of us we can all agree to despise both of these.
I think hacker news should be above inflammatory posts like these.
I posted something I was sure most HN-ers could agree on, but as someone who frequently disagrees with many of you I probably should have known better.
Some people are scared of that and don't do the forbidden thing.
Take the act of stealing, for example. It is ridiculously easy to do, and indeed it is committed a lot. Yet most societies prefer to ban it, hoping that fear of punishment will prevent some people from doing it.
ISIS has actually already used (remote controlled) quadcopters to conduct grenade strikes.
You can make a distinction between human-in-the-loop weapons versus human-on-the-loop supervision versus robots killing with no supervision. But of course landmines have been completely autonomous weapons since the US civil war. It's a place where it's really hard to make bright line distinctions.
It is not a bad comparison. I can even imagine a robot/drone sent in with orders to kill everyone in a certain area. It gets lost or whatever and runs out of power. Years later some kid finds it, charges it up and turns it on...
I think the USA has a huge pile of banned weapons, like bio and chemical weapons.
The USSR put anthrax warheads on a lot of their ICBMs, mostly because it was cheaper and they weren't as enamored with the idea that it might be possible to win a nuclear war by destroying one's opponent's weapons in a first strike, something bio weapons are useless for.
This is not necessarily true. We are disposing of them and last time I checked there are only two sites now that have weapons and their numbers are decreasing daily.
Sites that have since disposed of all weapons that I know about.
1. Johnston Island
2. Umatilla, Oregon
3. Little Rock, Arkansas
source: my father has been doing this for 29 years
Hypothetically, being able to digitize and compile it would allow incineration of every bioweapon, but that comes with its own dilemma. The one advantage is that if it comes out of bits and into pathogens, it was definitely deliberate. The disadvantage, of course, is that it can be printed with the right equipment.
Bans aren't merely about regulating technology, they're about regulating human behavior by introducing consequences. They're about a society making a choice to introduce legal sanctions against persons who do something unambiguously reckless and destructive, such as handing an autonomous device the power to kill.
Responses that boil down to a shrug of the shoulders underscore an important observation: enthusiasts for technology are precisely the wrong people to ask about the importance of recklessness and risk. It's on par with asking someone in software sales for a warts-and-all perspective on their product.
There is a huge qualitative difference between "anyone could put a gun on a quadcopter" and "everyone can buy a quadcopter with a gun on it".
Agree. These are exactly the enormously important subtleties that are lost when education omits training in the humanities.
Killbots built by nation-states (essentially) are already available for purchase. How does an autonomous helicopter drone armed with an AK-47 sound? ;-)
>The Blowfish A2 “autonomously performs complex combat missions, including fixed-point timing detection and fixed-range reconnaissance, and targeted precision strikes.”
>Depending on customer preferences, Chinese military drone manufacturer Ziyan offers to equip Blowfish A2 with either missiles or machine guns.
>Mr Allen wrote: “Though many current generation drones are primarily remotely operated, Chinese officials generally expect drones and military robotics to feature ever more extensive AI and autonomous capabilities in the future.
>“Chinese weapons manufacturers already are selling armed drones with significant amounts of combat autonomy.”
>China is also interested in using AI for military command decision-making.
China and Russia will ignore any treaties on this, just as Russia has already ignored many including the INF Treaty. As in many other cases, the West will essentially tie its own hands while allowing others to literally get away with murder.
Anyone care to explain?
It should also be mentioned that the drone I linked to above was being shown at an arms show, and is essentially being offered for sale to all comers. I'm sure there'll be healthy sales to some of the more autocratic/aggressive nations out there...
I believe the scientists in question would not be calling for this kind of ban if their sons and daughters were sent into battle against nations equipped with such weapons - in fact, they would be cooperating to develop this very technology if it meant protecting their loved ones.
It's easy to be an idealist, until you've truly got something to lose. I think a lot of people need to be reminded of this.
Do you think it should be legal for citizens to carry RPGs, machine guns, grenades, ..., so they never have to go in to a room in which someone else might have more firepower?
And yes, I think most people agree it should be legal for soldiers to carry machine guns and grenades when in warzones during wars.
The point the GP made was seemingly that we shouldn't let potential enemies have better weapons (autonomous drones), denying them for ourselves.
Yes, I assumed they were considering the context of state sanctioned aggression.
I wondered if they only found that reasoning appropriate for states, or in general.
One could ask "should we always seek for the most terrible weapons to be legal in any context in order to ensure no enemy can outmatch us due to restraints of law?".
In the civilian context, enforcement agencies in most first-world countries are able to restrict weapons and violence reasonably well - most people won't see the need to arm themselves and will prefer to be law abiding. But in countries rife with violence where law enforcement is weak or corrupt, I imagine people would try to arm themselves with what they can regardless of legal constraints.
I don't think GP was making an argument that I should carry an AK-47 on me so that any muggers I come across are less likely to out-arm me.
Also, the distances between different countries vs the spread of information is also vastly different. It is harder to commit an international act of violence with no witnesses than it is an interpersonal one, and the threat of retaliation keeps violence at bay, for the most part at least.
Why do people always say ‘nation state’ in these situations? It’s not a fancy word for ‘major country’ - it means something specific. For example the UK is definitely not a nation state. The US is arguably not a nation state either, due to its cultural and linguistic heterogeneity and tribal sovereignty.
Jargon nitpicking is my second-least-favourite kind of HN nitpicking, coming in close ahead of "Boy there was an advert on that article!/Why U no adblock?" nitpicking.
A nation state is a nation that is also a state. The UK clearly is a union of four nations into one kingdom. I mean - the clue's in the name.
Additionally "nation" can refer to the collective people rather than a centralized government in action.
Unless you're going to separate out historic county kingdoms (Wessex, Sussex, Kent, Gwent, etc.), then England annexed the counties that comprise the region we now call Wales as lands of the English Crown.
The UK was initially formed by the Union of two kingdoms, England and Scotland at which time all the parts of Great Britain were under the same Monarch.
Your contention doesn't necessarily disagree with this historical fact.
It's "the United Kingdom of Great Britain and Northern Ireland", not the "United Kingdoms of Great Britain, Northern Ireland, and whatever 3rd one you have in mind"
I was under the seemingly false impression that the Queen was separately Queen of England and Scotland still/again. But it does appear she is Queen "of Great Britain, Ireland and the British Dominions beyond the Seas" making her Queen of 2 kingdoms (Great Britain, Ulster) within the UK?
Dictionary result for nation state
a sovereign state of which most of the citizens or subjects are united also by factors which define a nation, such as language or common descent.
so you can call anything a nation state as long as that thing issues a passport, I guess?
What? No - you just gave the argument against that!
> are united also by factors which define a nation, such as language or common descent
This means the UK isn't a nation state - we have four nations in the UK, with different cultures, descents, and in some areas even languages.
It also means that the US is not a nation state - the people there have very distinct descents. Today Americans literally still say they're 'Italian' rather than 'American'. Many Americans don't speak English. The US literally describes tribal lands as 'dependent nations', so it isn't one nation even by their own federal definition.
Examples of nation states are places like Iceland and Portugal. These are the major military players that the original comment was trying to refer to.
MAGA-style nationalism is quite popular and powerful at the moment...
Aren't we already there, though? Doesn't the US already have semi-autonomous flying kill-droids? The ones that occasionally crash Pakistani weddings unannounced.
For more targeted strikes with conventional explosives, we started using Tomahawk Cruise Missiles in 1991 during the Gulf War.
Both of what you mention attack a predetermined target - a target selected by humans.
This technology is about "smart weapons" going out and searching for targets. For instance, the Chinese helicopter drone I mentioned in another post could easily be fitted with an IR sensor, and could probably hit targets out to ~200 meters with the AK.
One challenge is IFF (Identification Friend or Foe), but if you just want the drone(s) to kill all "enemies" within a certain boundary that'd be easy. AK ammo is even cheap, unlike the $500K missiles we often use to blow up individual jihadis...
And of course, at any time a human could potentially take control or at least monitor activity.
Assuming that all nations follow. Otherwise one rogue player can have a huge advantage, and that's the general problem with trying to ban military technologies.
Why? I read the article, but I'm not buying the "would increase conflict" angle. Some countries are in constant conflict anyway. It's not like the reason they're not fighting more is because they have a lack of men. A lack of equipment is much more likely to be the limiting factor.
Swarm warfare is going to be cheap and interesting.
That's why the US can spend a quarter million on IED-proof Humvees, and the insurgents can just put pointed caps on them to pierce the armor.
It's a lot cheaper to fuck shit up than it is to defend against it.
Tesla and Uber, you mean. Unless there's been an incident with Waymo that I'm not aware of?
A bit unrelated, but I couldn't ignore it. Could you expand on this one? In what sense is Israel a rulebreaker? What are the consequences?
I don't read the news very often, but I do remember there are some actors that are seemingly working towards obtaining nuclear weapons, and their leaders publicly claim they want to harm other countries.
To be sure I'm not totally detached from reality I googled a bit:
They're probably breaking some nuclear non-proliferation treaty put into place by a bunch of nuclear powers. (Whether there's anything wrong with the people with sticks saying "no sticks" is a different matter.)
Israel definitely has nukes already, they're not working towards it.
Depends on the country
There's no such thing as purely defensive weapon. I don't remember who said it, but I recall a quote that went similar to: "if your weapon can shoot enemy planes over your cities, it can just as easily shoot enemy planes over their cities".
Or put your city hall on a trailer truck, if that's what it takes.
"includes essentially the ability to find targets within a certain area (such as those near friendly forces), and to self-destruct if it is unable to find a target within the designated area."
When does something become fully autonomous rather than temporarily autonomous?
This would deter some researchers working on general topics (easy to label / threaten as lawbreakers) while having no real impact. Organized crime has already broken bigger laws and already has enough COTS parts to build nasty things (and is often OK with 80% reliability, thus needs no cutting-edge research). Many or most militaries would ignore such bans and fund development quietly, justifying it by national security.
IMO the only way to make this approach tractable is to choose a narrow list of technologies we do not want developed and see if there are technologies that exist or can be developed that would effectively detect violations. And choosing even a narrow list of technologies to ban would make for lots of debates and unhappy citizens. My 2c.
Why is a nuclear bomb different to a full scale carpet bombing campaign. Why is mustard gas different to a fragment grenade? Why is killing ok, but torture is not? Why is a barely discriminate bombing campaign legal but a subway bombing illegal?
Rules of war always encountered these arguments. But, do we want to eliminate them, just so things are philosophically consistent?
Imagine an AI in such a scenario. Count me as one of those people who thinks we are headed into a kind of hell.
Maybe a better analogy than a cruise missile would be a land mine. A device that kills, possibly much later than when it's deployed, and often an unintended target.
Automated sentry guns on the Korean DMZ have existed since around 2003, developed by a company that was associated with Samsung (not sure if it still is).
Therefore this isn't, quote, "at best, a few years away". Unless that quote was from around the year 2000.
No situation is totally new, though sometimes the equilibrium shifts so much that the top 3 driving factors in an environment are qualitatively different.
Also, I'm reminded of Battlestar Galactica and how deeply distrustful they are of computers and tech because of everything that has happened to them.
Science fiction fans relate more to human beings than to silicon chips.
On a more serious note, where do unmanned turrets and gun platforms fall? They're not making the kill/no kill decision (they're always in kill mode) and don't have a human directly at the trigger.
A radio control drone bomb is equivalent to a gun with a really long range. An autonomous drone bomb is equivalent to a gun with a motion detector. It's not about the power of the weapon, it's about the relinquishing of human control and accountability. This is why landmines are so evil, and why they are also trying to ban lethal autonomous robots. It's not about capability of killing at a distance, it's about who pulls the trigger.
Additionally, in a quarter of the USA for a quarter of the year driving conditions are such (permanent snow cover) that there are no visual indicators of where to drive or park. Cars would need all roads pre-mapped with ground penetrating radar at the least (a giant data set). And even with that it would be difficult. I know this because it's difficult for me as a human and I often have to rely on my contextual knowledge of how human society works and how people behave to know where to drive on a surface, avoid getting in crashes, and figure out how and where to park. None of this is possible for current or near term computers.
 - https://en.wikipedia.org/wiki/Trolley_problem
I mean, the sort of tech they are warning against can be put together by lone operatives using commonly accessible electronics and conventional arms.
Why not call for banning mines or cluster munitions? Those remain functional for decades after deployment and kill/maim indiscriminately.
I'm not trying to appeal to whataboutism here, I'm trying to point out the futility.
Perhaps I'm too jaded, but I can't see any scenario where military robotics can be prevented.
I think the problem is that major countries won't see the massive deaths as a problem for them but as an asset, and the race is on to develop the best.
To stop this we essentially need the US, China and Russia to all agree to ban these, and the world will mostly follow... but in today's political environment I can't see that happening.
There's a reason nuclear nations (definition to include non-nuclear nations that are sufficiently good friends with nuclear nations) don't get in fights with each other.
I don't think "opposition" in the sense of an organized political movement was as much of a feature of life back then. You did whatever you could to win, and there was no "international community" to tut at you for it.
Such bans will always end up revolving around a definition game of sorts.
The only thing that changed is: now everybody can make its own lethal robot.
I don't see why guns/explosives on robots should be treated differently than explosives/guns without robots.
With modern robots you can deploy a killbot into some area and they will autonomously decide who to kill and who not to kill with utmost precision beyond what is possible with rockets (which are a general attack on a sizable area).
That wasn't true even before rockets. Mines are one example of a weapon where the decision to kill isn't made by a human, but by the device (even if the decision tree was as simple as "if weight>threshold: kill").
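To make that point concrete, here is a purely illustrative toy model (made-up threshold, not any real mine's logic) of how thin the "decision tree" in such a device is:

```python
# Toy model of a pressure mine's entire "kill decision":
# a single threshold comparison, with no human in the loop.
PRESSURE_THRESHOLD_KG = 7.0  # hypothetical trigger weight


def mine_triggers(weight_kg: float) -> bool:
    """Return True if this toy device would detonate under the given load."""
    return weight_kg > PRESSURE_THRESHOLD_KG


# A child, a soldier, and a farm animal are indistinguishable to the
# device beyond this one number.
print(mine_triggers(5.0))   # light load: holds
print(mine_triggers(80.0))  # adult-sized load: detonates
```

The sketch is trivial on purpose: the whole argument is that "autonomous lethal decision" can be this simple, and has been for over a century.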
There are even anti-tank rockets you fire when you don't even see if there is a target, just suspect it. Then, when it flies over a hill and sees a target - it locks on and destroys it.
There are also bombs which spread small explosives over a large area, each of these small explosives has passive control surfaces and try to target the closest armored vehicle under them.
So that you only decide to deploy the bomb in some area, and the bombs decide which target to choose.
You'd need to classify their ability to pull your arms off as a 'gun'.
At the very least, they count as "arms," just autonomous arms, which makes them necessary for a free state.
Let's not twist the Founder's intent here, or underestimate their foresight. They thought very carefully about these issues, and understood that a government's monopoly on violence going unchecked by a disarmed populace inevitably leads to tyranny.
Therefore, we must deploy an autonomous militia of killer robots programmed to enforce an Originalist interpretation of the Constitution with ruthless efficiency as soon as possible. The cause of Liberty demands no less.
>If people agree the Constitution should be enforced in this way, what are the necessary legislative, executive or judicial procedures to enable it?
There are none. The Second Amendment already allows it, and provides the means to enable it - we build an army of killer robots and use them against the government should it try to abridge our right to deploy an army of killer robots against them.
If Ben Franklin were alive, I think he would fully support it.
I actually thought you were serious.
So maybe it'd depend how smart the killer robot is and how capable it is of holding a conversation as well as attacking things.
Bans are probabilistic, not binary. Anyone violating a ban must weigh the advantage from doing so against the consequences and probability of getting caught. A high degree of revulsion and a consistent track record of enforcement greatly increase consequences and probabilities, and make a ban more effective. Just shrugging your shoulders and saying it's not always possible is a sure way to increase the probability of unwanted action.
If it were to be updated, nearly 110 years later, many works would be left out (like, nearly all of the middle French plays) and MUCH would be included. Without a doubt, one of the books to be included would be The Making of the Atomic Bomb by Richard Rhodes, winner of the 1987 Pulitzer. This book goes from start to finish on how the A-bomb was made, the major and minor players, and their thoughts about the bomb all along the way. Rhodes takes us into the heads of the physicists and clears out all the calculus and gets to the human parts of the endeavor.
All the players in the race for the nuke knew the bomb was unavoidable for mankind. Leo Szilard was the first person to really conceive of the bomb while waiting for the stoplight to change where Southampton Row passes Russell Square, across from the British Museum in Bloomsbury.
There were thoughts, groups, and people that entertained the thought that they could form a group of 'priest-scientists' that would keep the uranium safe from politicians and warlords. Keep the power flowing but stop the chain reaction from going critical and taking out billions of people. But as the race progressed, all the real players for the nuke knew that such a utopia was impossible. That this particular genie was out of the bottle the second that Leo, or any other physicist, had made it across Southampton Row.
These robots are a microcosm of the same issues that the atomic physicists faced nearly 100 years ago. They know the damage that the ideas will have upon us all, they know that the semi-bucolic world we now live in will fall away to violence, and they know that they can't stop it. But, bless them, they are trying to sound the alarms and maybe, just maybe, change the minds of some of the politicians and warlords.
Looking back at the first Gilded age, with all their knowledge, they had no templates for the power of The Bomb and finally facing the mortality of all humans. Fortunately, we do have the templates, at least in terms of the use of lethal robots. We know how this plays out, we know how to make an atomic bomb, and we programmers need to learn from the physicists of the '30s and '40s.
IMHO, hoping that others won't go there is not a plan, because ultimately somebody will. We need a credible defensive capability against this stuff. Given that the adversary is going to be autonomous AIs with lightning-fast responses, having humans in the loop when defending effectively means being defenseless.
Most current wars are asymmetric guerrilla wars where there is a powerful but reluctant to engage party and some highly motivated individuals fighting an unwinnable fight. E.g. the conflicts in the middle east are basically premised on men with primitive weaponry moving around the country trying to stay hidden from air surveillance, satellites, or simply hiding among civilians. Countries like the US are reluctant to go in and fight on the ground because things get ugly in terms of casualties and 'collateral' damage. You can bet drones will be popular on both sides in such conflicts.
Drone based warfare would make that a lot more one sided than it already is and would probably make this type of warfare a lot less attractive and potentially put an end to a lot of long running conflicts. This does not have to turn into the cliche dystopian mess. For that look no further than the current state of affairs in e.g. Congo, Yemen, Afghanistan, or Syria.
War is not about chivalry but about winning at any cost. When one side is fighting with their hands behind their backs thus preventing them from winning and the other side can't win, you have a stalemate. It used to be the case that the winner would execute/enslave surviving enemies. Brutal but effective. When the war was over, there was little chance of a comeback by the other side. These days we do a lot of damage to each other but it is rarely decisive. WWII was one of the last conflicts where the other side literally had no choice but to surrender unconditionally. I live in Berlin. The effects of that war are very visible still. It's also a very peaceful city these days. That war was really over when it was over.
Wars between nation states are subject to all sorts of international rules. The problem is that most wars these days do not necessarily involve nation states and nation states instead engage each other via proxy wars. E.g. the US, Iran, and Russia are not formally at war but actively engaging each other in a plausibly deniable way nevertheless. These wars are dirty, brutal, and cause a lot of misery precisely because they rarely are fought to a conclusion and fought using any means possible.
Their purpose was to set up conflict through an apparent paradox (robots can't harm humans but then robots harm humans) by showing the Laws failing spectacularly and exposing humanity's hubris in the face of unintended consequences.
Here's a big, long HN thread about it.
That seems to be an appeal to authority. This group is making an ethical argument, not a scientific one.
Which is probably why "without supervision or meaningful human control" is their criterion.
Is a security guard with 20 camera controlled turrets using facial recognition to choose targets "supervised" ?
The title doesn't say "science calls for."
I know I'm interested in hearing what a broad group of scientists think about non-scientific questions.
If the guard is necessary to pull the trigger, then it's supervised. If the ADS has a list of approved faces, or is told not to shoot people with an IR tag, then you get into a fun grey zone
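As a purely hypothetical sketch of that grey zone (made-up policy, invented enrollment IDs): the human judgment is baked into the configuration ahead of time, but no human approves any individual engagement.

```python
from typing import Optional

# Hypothetical "human-on-the-loop" turret policy: a guard configures
# the allowlist once, then the device decides each shot by itself.
APPROVED_FACES = {"guard_alice", "guard_bob"}  # assumed enrollment IDs


def engage(face_id: Optional[str], has_ir_tag: bool) -> bool:
    """Return True if this toy policy would fire on the detected person."""
    if has_ir_tag:
        return False  # friendly IR tag: hold fire
    if face_id in APPROVED_FACES:
        return False  # recognized friendly: hold fire
    return True       # everyone else is engaged autonomously


print(engage("guard_alice", False))  # recognized face: holds fire
print(engage(None, False))           # unrecognized, untagged: fires
```

Whether configuring `APPROVED_FACES` counts as "meaningful human control" is exactly the definitional question the thread is circling.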
First, special forces are having a hard time recruiting, and in reality it's not just them.
Secondly, why not purely for defensive purposes? Want to eliminate the effectiveness of nuclear weapons? If we can have self-landing rockets, why not a missile that breaks into multiple rockets in space and destroys all nukes while they are inbound?
Thirdly, why should billionaires and those who can afford them be the only ones with their own private security? Imagine having your own inside the home: 24-hour protection, only calling the police to effectively clean up the situation. Could be a new startup right there.
Finally, we should be encouraging darpa and the military to be spending billions on robots. That technology will filter down eventually to the consumer. That's better for all of us.
Has it ever occurred to you that banning killer robots not just for the poor and the public sector might be on the table?