
Israel deploys automated military robots - sjreese
http://mainichi.jp/english/articles/20160824/p2a/00m/0na/020000c
======
josh_fyi
Not fully automated. These may be autonomously driving, but not autonomously
shooting. The article hints that that is technically possible, but also states
that it is not implemented.

~~~
scandox
Can't we just get a kind of programmer's Hippocratic oath going here? Let's
just all agree not to facilitate autonomous shooting. I know both sides of the
argument, and I know it won't prevent it being done. I just mean that if
enough people agreed not to do it, then we'd at least generate enough
disapproval for it not to be the subject of any PR puff.

~~~
FBT
Isn't it better to have robots fighting and dying than humans?

War sucks, and I do wish that we could have world peace. But if war is going
to happen, I'd rather it be robots that are the ones in harm's way.

~~~
zorked
That's not what's going to happen. One side will have robots, the other will
have humans.

This is already happening.

~~~
FBT
That's better than both sides being humans.

Seriously. Incremental improvements are worthwhile.

In general, militaries never want to compromise their effectiveness. So if given
the chance to implement a life-saving intervention that harms their
effectiveness, they will be reluctant at best to implement it. But _this_ is a
life-saving intervention that possibly even increases their effectiveness!

That's what truly makes it awesome. A way to save lives that militaries are
not only willing to implement, but rather are actually enthusiastic about
implementing.

~~~
origami777
That's a best case scenario. What it could do is make military intervention an
even easier decision. The loss of human life is a deterrent to engaging in
war. If one side isn't concerned about that, wouldn't they be more inclined to
go to war?

~~~
slv77
Conversely, removing the threat to human life may reduce the probability of
escalation and make a response more measured.

For example, two men break through a security fence. A robotic patrol would
seem much less likely to respond with deadly force in that situation than a
human patrol would.

If the men were refugees or wounded soldiers looking for medical care, the
more measured response of the robotic patrol might prevent a death and
actually de-escalate the conflict.

Also, assume that the two men were hostile and fired on the patrol. The
destruction of a mech is much less likely to provoke a political and civilian
response than the death of a human, and may lead to a more measured response.

My bigger concern is that a fully mechanized force may decrease the apparent
cost of oppressing a civilian population.

------
andy_ppp
This classic scene from Robocop seems apt...

[https://www.youtube.com/watch?v=_mqDjcGgE5I](https://www.youtube.com/watch?v=_mqDjcGgE5I)

------
MrLeftHand
It looks like we don't have to wait for SkyNET to produce an army of robots
and autonomous vehicles with guns.

We make them ourselves, against ourselves.

Can't wait to see someone hacking these machines and turning them against
their former owners.

------
curiousgal
I would be more impressed if their opponent were more technically
sophisticated.

You have people pretending to build a tank on one hand and people with
automated robots on the other? Seems a bit overkill and pointless IMHO.

~~~
scandox
"Overkill" is the defining feature of Western militaries. We've got to the point
where we consider a ratio of 1 (of ours) : 1000 (of theirs) unacceptable. It's
very hard to present an argument that says we should accept more deaths on our
own side in order to be "fairer" to the other side. However, the net effect is
that we continue to minimize our risk in ways that de-humanize conflict and
increase collateral damage.

~~~
douche
I'm not sure we are really increasing collateral damage, relative to where we
started 100 years ago. The days of firing 1-3 million heavy artillery shells
in a day[1] are past, as are the days of dropping tens of thousands of tons of
bombs indiscriminately on a city[2]. Blowing up a single house with a drone-
launched missile, or taking out a bridge with a single laser-guided bomb is
far more surgical. One can imagine in the not-too-distant future bug-sized
drones that could pick a target out of a crowd and inject a trace amount of
some very deadly toxin or sedative.

[1] [http://www.longlongtrail.co.uk/battles/battles-of-the-
wester...](http://www.longlongtrail.co.uk/battles/battles-of-the-western-
front-in-france-and-flanders/the-battles-of-the-somme-1916/british-artillery-
bombardment-before-the-infantry-attack-on-the-somme/)

[2]
[https://en.wikipedia.org/wiki/Bombing_of_Hamburg_in_World_Wa...](https://en.wikipedia.org/wiki/Bombing_of_Hamburg_in_World_War_II)

~~~
MrLeftHand
It's true we are causing less collateral damage than in the golden days, but
the problem is that the more advanced one side becomes, the less of a chance
the other side has.

It's not just about fighting terrorists. We don't know who will be in command
next. Putting this tech in the wrong hands could do massive damage. There is
always a chance of finding ourselves on the wrong end of the drone.

Also, the drone targets a single house, bombs it to oblivion, and then someone
says they are 80% sure they bombed the right house with the right people in
it. Not to mention that a lot of the time the people killed are civilians who
were actually supportive, but because they are dead now, we just make the rest
angry and drive them to side with the enemy.

We are a long way from making sure we are killing the right target.

~~~
googletazer
" We don't know who will be in command next." Thats the real issue. Drone
armies is every dictator's wet dream, an unstoppable mass-produced force.

------
98Windows
The scary thing is that once these things get a bit better, it'll become cost-
effective to build thousands of them and just zerg rush your enemies.

------
nivertech
A brainwashed suicide bomber is not much different from an autonomous bio-
robot.

Indoctrination [1] doesn't let you question or criticize your programming,
unlike a modified version of Asimov's Laws [2].

[1]
[https://en.wikipedia.org/wiki/Indoctrination](https://en.wikipedia.org/wiki/Indoctrination)

[2]
[https://en.wikipedia.org/wiki/Three_Laws_of_Robotics](https://en.wikipedia.org/wiki/Three_Laws_of_Robotics)

------
tim333
A potential advantage of military robots is that you could make them non-
lethal or less lethal. In Aleppo-type situations, robots might be able to
taser and arrest people instead of trying to kill them.

------
erlich
I wonder what the cost differential between The Trump Wall, and autonomous
vehicle patrols would be.

Electric vehicles with solar charging would be very cost-effective as a border
security mechanism.

Where are all the border security startups? Lots of government cash on the
table I would think.

~~~
MrLeftHand
Well, for a start, a wall won't shoot you in the face just because you want to
cross a border.

A wall doesn't need electricity.

The maintenance cost is lower.

Can't be hacked.

Oh, and you can paint nice big pictures on it.

------
crdoconnor
10 Initiate provocations in Gaza

20 Run live tests on technology

30 Use the proof of real-life usage in a low-boil conflict to sell the
technology at international arms fairs.

40 Use realistic test conditions to refine & develop new technology for 2
years.

50 GOTO 10

This efficient QA/development cycle is a serious competitive advantage, since
it's hard for other countries to replicate the same controlled conflict
conditions. Hence these great sales figures:
[https://rwer.wordpress.com/2016/01/12/the-15-largest-arms-
ex...](https://rwer.wordpress.com/2016/01/12/the-15-largest-arms-exporters-
per-capita/)

And it's been ~2 years since the last conflict, so...:
[http://europe.newsweek.com/us-calls-americans-leave-gaza-
soo...](http://europe.newsweek.com/us-calls-americans-leave-gaza-soon-
possible-492974?rm=eu)

