
Harpy, a “Fire and Forget” autonomous weapon drone - nwrk
http://www.iai.co.il/2013/36694-16153-en/Business_Areas_Land.aspx
======
AndrewKemendo
The Harpy is a very old anti-radiation weapon (think HARM) and wouldn't be anything close to what's considered autonomous. It's basically what a lay person would call a "homing" missile that tracks and follows radiation in the form of high-power RF, aka radar.

Every nation has these and most are air launched. Not much different from an IR missile like our AIM-9 in that it's a "fire and forget" system. Compare that with actively guided missiles like the AMRAAM.

~~~
MrMember
Small note, but the AMRAAM is also a fire-and-forget missile, as the radar guidance system is contained in the missile itself and requires no additional input from the jet that fired it. This is contrasted with a missile like the AIM-7, which relies on the radar of the parent aircraft to stay on target.

~~~
AndrewKemendo
Nope. The primary BVR use case uses the APG-63/5/70 to give guidance to the AMRAAM until it can pick up an active seeker lock itself. In theory, if you are close enough you can "fire and forget", but it would be a waste and would put you way closer to a threat than necessary - which would defeat the purpose of the AMRAAM.

------
cgb223
For such a high-tech military drone, the promotional video is really, really bad.

It's like watching a driver's-ed video from the 80s that kills terrorists.

~~~
mmPzf
The video surprised me too, so I took a closer look at this weapons system. The promotional material looks like it came straight out of the eighties because it did. According to Wikipedia[0], this is late-eighties tech. Maybe it'd make sense to add that somewhere to the submission?

[0]
[https://en.wikipedia.org/wiki/Loitering_Munition](https://en.wikipedia.org/wiki/Loitering_Munition)

------
kayoone
This product and its tech are decades old; nothing to see here.

------
pmoriarty
Is the source code for this web page obfuscated HTML? Or what is that?

~~~
k_sh
The structure and class names? Probably not obfuscated - the structure looks kinda verbose, but not unreasonable if you imagine them using an off-the-shelf HTML/CSS library for the site.

~~~
pmoriarty
No. This is what I get when I wget
[http://www.iai.co.il/2013/36694-16153-en/Business_Areas_Land...](http://www.iai.co.il/2013/36694-16153-en/Business_Areas_Land.aspx)

[https://paste.pound-python.org/show/L3Yhf3vJ8rMuiO4OWLOm/](https://paste.pound-python.org/show/L3Yhf3vJ8rMuiO4OWLOm/)

That looks pretty obfuscated to me.

I'm guessing what you're seeing is the html that's generated from that
somehow. Or maybe it's serving up different html to different clients/IPs?

Maybe someone who's more familiar with web development could explain.

~~~
jszymborski
The page loads fine with javascript disabled, and from what I understand,
view-source: doesn't interpret javascript.

My best guess is that it's picking up the wget user-agent and serving up some bot countermeasures?

~~~
pmoriarty
It doesn't always do that. I first loaded the page in emacs-w3m, which doesn't
interpret javascript, and got the output I pasted above. Then I went back a
bit later and loaded the page again, and got normal html (while still using
emacs-w3m, a browser that can't handle javascript).

Then I tried wget and got the same weird obfuscated stuff I pasted above, and
got it again when I wgot it again.

------
cup
Let's say this fails and a bunch of kids are blown up - who will be held responsible?

More importantly, how will we ensure an individual is held accountable rather than the corporation taking the blame?

~~~
slededit
The government deploying this in wartime is responsible for its actions. "Corporations" not directly associated with a government actor would be considered mercenaries, subject to their own set of international laws.

~~~
zo7
But who will the deploying government's army hold responsible? If a soldier recklessly kills several civilians, they will likely be held responsible for their actions and discharged. If an autonomous robot does the same, then what will happen?

~~~
slededit
Not that it will make you feel any better, but often the offending soldier is
not held responsible. Ultimately might will make right and the victors will
decide on the punishment.

The "system" is already an excuse. Example the Tarnak Farm Incident where a US
pilot killed 4 Canadian soldiers: "... as much as the F-16 pilots bear final
responsibility for the fratricide incident, there existed other systemic
shortcomings in air coordination and control procedures, as well as mission
planning practices by the tactical flying units, that may have prevented the
accident had they been corrected."[5]

Despite the fact he "flagrantly disregarded a direct order", "exercised a
total lack of basic flight discipline", and "blatantly ignored the applicable
rules of engagement". He was merely fined $5,700.

------
jeisc
Will the next major war see only drones fighting?

~~~
forapurpose
The U.S. military's current approach, the "Third Offset" (still in development), is based on the theory that AI is good at defined tasks (e.g., identifying enemy planes) but not at handling the chaos of war (e.g., everything involved in flying a plane on a mission); that is, it's not good at general intelligence. AI can park your car, but the other cars aren't intelligent adversaries who are trying to stop you from parking, destroy your car, and kill you.

The general design is to combine AIs and humans in what they call "centaurs",
utilizing the strengths of each in a team. At least, that is what is said
publicly.

But autonomy is already deployed. For example, on ships, the last line of defense against incoming missiles is basically a set of large, autonomous machine guns; a human could never react quickly enough against supersonic incoming missiles. Computer system attack and defense ("cyber") is expected to be run by AI, simply because humans can't keep up with an attacking AI. Autonomy is also necessary because you can't depend on having secure, effective communication with your drones; that would be a huge vulnerability.

But autonomy is much trickier to define than it first appears: If you shoot a
'dumb' artillery shell at a target, you lose control of the shell the moment
after you pull the trigger. If a group of children subsequently enters the
target area, that 'autonomous' shell is going to kill them and you can't stop
it. (An AI weapon might be safer in that kind of circumstance; it can change
course.) Is it different to shoot a dumb shell to kill everything in the
target area than to send an AI-operated drone to the same target area with the
same instructions?

~~~
KineticLensman
> If you shoot a 'dumb' artillery shell at a target, you lose control of the
> shell the moment after you pull the trigger. If a group of children
> subsequently enters the target area...

Nitpick: The children would have to move very quickly to get there during the time of flight of the shell (a few seconds). The interval during which you can intervene is actually the time from the forward observer (remember, arty gunners don't see their own targets) making the call for fire to the time at which the shell is fired. If the notional target is assessed as a high priority (target priorities are pre-assigned), and you trust the forward observer to have checked for collateral damage, this could be a very short period.

If loitering munitions are used instead of conventional arty, and they can
accept targeting updates in-flight, then the interval could be as long as you
like (within weapon endurance constraints).

[Edit - clarity]

~~~
woodman
> Nitpick: The children would have to move very quickly to get there during
> the time of flight of the shell (a few seconds).

Double nitpick: 155mm artillery rarely has a time of flight of a "few seconds" - they usually call "splash" 5 seconds before the anticipated impact in order to give the forward observer a heads-up. A TOF of a minute isn't unusual - in that time a lot can happen.

~~~
forapurpose
Don't they have a range of something like 20 miles, which would take much more
than a few seconds? And that is measured in a straight line from the gun to
the target; the shell, of course, takes the long way.
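
A back-of-the-envelope vacuum-ballistics sketch (drag ignored, a 45-degree minimum-energy launch and the 20-mile figure above assumed, so this is only rough) supports a flight time well over a minute at that range:

```python
import math

g = 9.81          # m/s^2
range_m = 32_000  # ~20 miles, the figure mentioned above

# For a 45-degree launch in vacuum, range R = v^2 / g, so v = sqrt(g * R)
v = math.sqrt(g * range_m)

# Time of flight: t = 2 * v * sin(45 deg) / g
tof = 2 * v * math.sin(math.radians(45)) / g

print(f"muzzle velocity ~ {v:.0f} m/s, time of flight ~ {tof:.0f} s")
# roughly 560 m/s and 80 s
```

Real shells fly ballistic arcs through air, so actual TOF differs, but "more than a minute" is the right order of magnitude.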

------
jeisc
what does the NG stand for?

~~~
billytetrud
Think Star Trek...

------
whatnotests
How long until private drone manufacturers deploy many drones into areas of
conflict, and grant control of said drones to the highest bidder?

~~~
WJW
That's actually an interesting business idea. Pretty risky though: if you sold to both sides, you'd always also be selling to the eventual losing side, and the winning side might not be happy about that. Also, depending on the anti-air capabilities on the battlefield and the price of a drone, the profits might not make up for the losses.

~~~
KineticLensman
> depending on the anti air capabilities on the battlefield

Yes. The use of drones by western forces has coincided with operations in
which the west has had total air supremacy. The larger weaponised drones
(Predator, etc) have _never_ been flown in a hostile air environment, against
a peer force that has serious anti-air. I would be reluctant to fly my drones
in such an environment. Smaller throw-away drones are a different issue.

~~~
nthcolumn
Smaller drones could deliver small payloads (anti-personnel/incendiary) on, e.g., London, or just be used for ransom - a dozen expendable drones could occupy the airspace over Heathrow until some anoncoin was paid. The sky's the limit, literally! Exciting times!

