Hacker News

My concern about autonomous robot warfare is not that it will shoot the wrong people or go out of control; that's easily answered by "but this one won't", and there's no counter-argument which will not look like pure assertion.

My concern is that they will work just fine: this would bring the domestic costs of warfare to near-zero (the cost of the robot weapons). For those of us who are sceptical of military adventurism, just consider how much more willing our politicians would be to invade some random, relatively harmless country if there were no "boots on the ground" and no possibility of body bags coming home.




> My concern is that they will work just fine: this would bring the domestic costs of warfare to near-zero

As an adjunct viewpoint to this: what happens when such weapons are available to regular people?

In theory, that's possible today; witness the number of instructables and other similar projects detailing how to create a nerf or paintball automated tracking gun system. It wouldn't take much to upgrade that to mount it on a decent mobile platform with a real weapon.

Right now, the objective or use-case for such a device isn't clear, so we haven't seen criminals build or use them, beyond perhaps drug trafficking via consumer drones and similar.

I'm sure, though, that it is going to be a problem in the future...


I'm pretty sure that some criminals will use such devices in the future.

I'm also pretty sure that criminal use is going to be rare.

There doesn't seem to be a very large overlap between willingness to commit violent crime and the technical competency to turn off-the-shelf consumer products into weapons. Why do terrorists stab people with knives when they could make explosives from household materials instead? They're clearly violent and unafraid of death. My interpretation: they lack the necessary knowledge to make explosives, aren't capable of self-learning either, and settle for a vastly inferior but readily available knife. Even organized crime seems to use explosives rather rarely, and even then diverted/stolen commercial ones more often than DIY.


> There doesn't seem to be a very large overlap between willingness to commit violent crime and the technical competency to turn off-the-shelf consumer products into weapons.

Good insight, which I will tack my own thought onto: the kind of person who would be successful at this would probably have the right mindset to work in an engineering job, and people who do that are both earning enough that they have no need to turn to crime, and too busy to be political or religious extremists (something about idle hands and the devil's work...)

Still. Stabbing and other unsophisticated attacks seem to be relatively recent and primarily a feature of Islamic terrorism. If I am correct, then...

> My interpretation: they lack the necessary knowledge to make explosives, aren't capable of self-learning either, and settle for a vastly inferior but readily available knife.

...I have an alternative, perhaps entirely wrong interpretation: Islamic terrorism typically requires that its most skilled and most dedicated operatives kill themselves in the course of their first and only attack. Which is to say, they're running out of good operatives.

Meh, perhaps your original interpretation is better :)


I think you're exactly right.

A key check on military abuse in a democracy is that military actions kill some fraction of voters. We have already seen that as that fraction goes down through better soldier protection technology, our willingness to go into battle increases. If we drive that down to near-zero, the implications are frightening.

A further consequence of this is that the fewer human soldiers you have in the field, the fewer embedded journalists you have. And, without those, it is even easier to conduct military actions without any oversight from voters.


The long, relative peace on the Korean peninsula has persisted despite a massive manpower advantage by the North. The presence of hard defensive positions augmented by landmines and autonomous sentry guns allows South Korea to deter Northern aggression without having to keep up with North Korea's punishing and impoverishing level of military spending.

Like any weapon, robotic weapons are only a problem if you use them for evil.


>The presence of hard defensive positions augmented by landmines and autonomous sentry guns allows South Korea to deter Northern aggression without having to keep up with North Korea's punishing and impoverishing level of military spending.

And the presence of plenty of artillery pointed at Seoul has prevented anyone who doesn't want to see Seoul leveled as a result of their actions from taking aggressive action against North Korea. Knowing that if you do anything bad you will get whacked hard enough that it's not worth it is a powerful deterrent.

>Like any weapon, robotic weapons are only a problem if you use them for evil.

Because of the inherent defensive bias of the technology (stationary is easier than mobile, defending a known area is easier than entering a new area and figuring out what to shoot) robotic weapons are a bigger problem for those seeking to do or enable evil.


> Because of the inherent defensive bias of the technology (stationary is easier than mobile, defending a known area is easier than entering a new area and figuring out what to shoot) robotic weapons are a bigger problem for those seeking to do or enable evil.

This is a crucial point.

Another point is that most potential adversaries don't value the lives of their fighters as much as we value the lives of our military personnel. If half a dozen American troops die, it's national news. If a hundred die, it's a political scandal. People wonder why the US spends so much more on defense than other countries. That's partially because we're willing to spend lots and lots of dollars to save just a few lives.

And, for that reason, I think the US in particular will eventually be willing to replace even its offensive military capacity with robots. And if it works, that means the US or any other technologically advanced state can, at great expense, continue a counterinsurgency campaign indefinitely.

But, as you point out, any defender who can scrape a robot together will have an advantage in countering that ability. Hell, they might even be able to use open-source software to run them. And that's the good news. Orwell wrote, "...tanks, battleships and bombing planes are inherently tyrannical weapons, while rifles, muskets, long-bows and hand-grenades are inherently democratic weapons. A complex weapon makes the strong stronger, while a simple weapon--so long as there is no answer to it--gives claws to the weak."



