Jeff Goldblum: Don't you see the danger inherent in what you're doing here? Killbots will be the most awesome force the planet's ever seen, but you wield it like a kid that's found his dad's gun.
Richard Attenborough: I don't think you're giving us our due credit. Our scientists have done things which nobody's ever done before… Our robots can do parkour and even dance!
Goldblum: Yeah, yeah, but your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should.
Attenborough: What about search and rescue? They're not just for the military, you know…
Goldblum: (shaking his head) No…
Laura Dern: Well, the question is, how can you know every possible use for a new technology? And therefore, how could you ever assume that you can control it? I mean, you have plants in this building that are poisonous, you picked them because they look good, but these are aggressive near-living machines that have no idea what's right from wrong, and they'll defend themselves, violently if necessary.
Attenborough: Sam Neill, if there's one person here who could appreciate what I'm trying to do...
Sam Neill: The world has just changed so radically, and we're all running to catch up. I don't want to jump to any conclusions, but look... Killbots and man have just been suddenly thrown into the mix together. How can we possibly have the slightest idea what to expect?
Attenborough: (laughing) I don't believe it. I don't believe it! You're meant to come down here and defend me against these characters, and the only one I've got on my side is the blood-sucking lawyer!
I just don't see a way around a future with autonomous killing robots. As China ramps up its military rhetoric and technology spending, it's hard to imagine them drawing a line in the sand against using a technology that could give them military superiority. Once they have it, we're going to need it too, because at that point we're already threatened by its existence anyway, so there's really no point in us purposefully keeping ourselves militarily limited relative to China.
Does military superiority on the ground matter for conflict between countries like China and the US? The nuclear option is going to overshadow actual combat, and any combat is likely to take place at sea or with aircraft, if it's not all done as a cyberattack.
In my view the problem for either China or the US with using nuclear weapons is still mutually assured destruction - you fire a nuke, and we're going to launch all of ours. If you start using nukes, you'd better take over the world, because if not, everyone's going to unite against you. Unless your plan is total world domination, using nukes is going to do a country more harm than good.
The same isn't true with autonomous killer robots, because you can use them in relative secrecy at a small scale. If you fire a nuke, you've immediately escalated to total war. If you take out a strategic target with an autonomous robot, you haven't really pushed the envelope of war that far beyond where it's already at, so I can't justifiably nuke you.
I'd also say it's not just the ground we're talking about here - if we have autonomous killer dog robots, we're going to have autonomous killer drones, no question.
In China they're probably saying right now: "As the US continues its extraordinary military and technology spending, it's hard to imagine them drawing a line in the sand against using a technology that could give them military superiority…"
I find it very funny that you frame your opinion around those nefarious easterners dragging the reluctant do-gooders at the DoD and CIA into murder robots, when 60 years ago they were in an agrarian land revolt and the US was literally building rockets with atomic warheads on them.
And as if that's not enough, you post this opinion in the comments on an article about the US's murder robots...
Did I call them reluctant do-gooders? I don't see calling them reluctant do-gooders anywhere in my post. I gave unambiguous evidence of China's military escalation in the last couple of years and then used logic to come to a conclusion of what the likely outcome would be.
You mean to say "China's attempts to compete militarily," surely, because no country outside of the US leads the market in terms of more creative ways of killing people.
China has become increasingly aggressive in both its rhetoric and actions towards Taiwan, and this comes only shortly after they clearly violated their 50-year agreement with Hong Kong. They are testing increasingly advanced weapons. That's just reality - nothing to do with Cold War thinking.
Maybe China decides not to build autonomous killer robots, but we absolutely can't be sure of that. If they do end up with that technology and we don't have it, it puts us at a huge military disadvantage against an increasingly aggressive superpower.
Making it illegal domestically makes no sense whatsoever if China develops the technology. The risk with autonomous killer robots is in their existence - if something goes wrong, they start deciding to kill the wrong people. If China has them and we don't, that doesn't reduce that risk at all.
But if China has them and we don't, then we're at risk both of something going wrong with the robots and of China using them against us. At least if we develop our own, we can counter the latter threat.
Contemporary US defense policy is almost entirely based right now on concerns about falling behind China technologically in the next 20 years. So no, we're definitely not over that. We were, but we swung back hard in the other direction: present leadership is convinced that going all-in on asymmetric warfare, counterinsurgency, and foreign internal defense training and assistance was exactly the wrong thing, that terrorism is small beans in the grand scheme, and that we should have stayed focused on long-term strategic dominance over near-peer adversaries.
And it is illegal for the active military to do anything domestically. Different story for the National Guard, of course, and the police, so I guess we might want to neuter them technologically if there's a real concern the police are going to use murder dog robots to hunt suspects.
On many occasions I ate breakfast with National Guard members dressed up as police for their training exercises. Their police uniforms all showed high rank and long tenure. They will be in charge of the police when things go sideways.
We haven’t even stopped building new kinds of nuclear weapons - or reviving old ones.
China recently launched a Fractional Orbital Bombardment System (FOBS) with a hypersonic glide vehicle, which is a new spin on an old Cold War idea.
Edit: which is to say, believing the average person has moved on from Cold War thinking is probably wishful thinking that would be immediately dispelled during a near-peer conflict. We are in every bit as much danger now as we were then.
Edit 2: I can recommend two podcasts for those interested in a bit of low stakes learning.
“At the Brink”: a limited series from the William J. Perry Foundation.
“Arms Control Wonk Podcast” which covers contemporary nuclear weapons, arms control, etc. in an ongoing manner.
Someone mounted a handgun on a drone a few years back as a DIY project. Honestly, this isn't a big deal - it's a remote-controlled something with a firearm. About as dangerous as a Battlebot (those have had .22-firing guns put on them - not very effective against the armor on those robots).
The big deal here would be the deployment of autonomous target acquisition. Contrary to popular belief, nobody deploys video-game-style automated turrets, though interestingly the use of loitering munitions has gone largely unremarked upon, even though by definition these are autonomous robots which attack human-occupied targets without direction.
The big problem is that there's a strong incentive to make UAVs autonomous, since doing so would mitigate the risk of the control signal being jammed. The downside is that we'd then be putting control over weapons into the hands of a (likely) poorly understood amalgamation of algorithms. I'd see this happening with aerial UAVs before the robo-dog, however.
I am very happy that 9/11 happened when drones and robots weren’t ready for military use. If they had been available I bet the administration back then would have deployed them to any country they may have perceived as a threat.
Wars are in a sense self-regulating. When the human losses get too great, most politicians will reconsider. That happened in Vietnam, and also in Afghanistan for the Russians and later the US.
With robots this may change. If you lose only robots, it's very tempting to keep wars going, with horrendous results for the civilian population in the occupied country while the costs for the occupier are only financial.
Playing devil's advocate here: isn't this just another step in the progression of warfare? Do you think people questioned the sheer destruction that could be dealt using gunpowder when it was first used in anger? Same again with tanks and guided missiles. I'm sure questions were asked, but if your enemy is going to be using the technology, you'd better be using it as well.
Biological warfare scares me far more, and I don't think we've seen anywhere near the potential for that in any way yet.
The threat isn't the addition of guns, it's the removal of conscience, reason, and consequence. Bioweapons, nukes, whatever - they require a person to address their own conscience before utilizing them. Killdogs just address their programming.
I think there will still be a human who deploys a killer robot - until AGI, that is... Also, I don't think someone deploying land mines is thinking about the consequences beyond the immediate conflict, even though the mines will kill and maim for years afterward.
We are moving into the Afghanistan 'we don't know who's an insurgent so drone bomb anyone holding a radio' stage of warfare and humans are not ready for the ease and lack of accountability that comes with it.
We're reading this from a cozy place, thinking "wow, cool/scary," and 5 minutes later we'll forget about it.
Meanwhile, 5 years from now, some civilian in a conflict zone will be hunted by these robots while the rest of us worry about which movie to watch next.
Flying cars, self-driving cars, robot dogs, gun-carrying dogs. We are truly living what used to be sci-fi. I'm not sure I like where it's headed, though.
Other way around. The Black Mirror episode was inspired by the early experiments with quadrupedal robots, especially those first terrifying Boston Dynamics videos - like the one where the dude viciously kicks over BigDog and it relentlessly gets back up, and everyone's thinking, "oh man, that guy is so dead as soon as these things get a brain."
Goldblum: Power, uh, finds a way… to oppress.