

Automated to Death? - pieceofpeace
http://spectrum.ieee.org/computing/software/automated-to-death

======
Dav3xor
I work on avionics. It's a great job, except you have no idea when or how
things can go drastically wrong. We had a bug in our weather data parser that
didn't surface in 2 years of constant testing. The weather receiver is only
supposed to give data for the continental United States, but we occasionally
would get a Temporary Flight Restriction for Iraq, which worked fine.

One day, we got a report from a customer that his panels rebooted twice, in
the clouds. All signs were pointing to the weather receiver. I feverishly
looked through the code and eventually found the problem.

We were getting a TFR for an air show... in Guam. I knew that we handled the
eastern hemisphere properly; the Iraqi TFRs worked fine. The data looked
something like "....151.235E10.1235N...". We eventually determined that the
floating point value parser was treating the E as scientific notation and
blowing up. It worked fine on my PC (glibc), but not on the device (newlib).

~~~
jodrellblank
So what happened? In "2 years of constant testing", you... didn't test it on
the device? Didn't try data from outside the area it was supposed to work in?
Is there a weather data standard that you didn't meet, or is there one but it's
not always followed, or is there none and you have to do as well as you can?

~~~
Dav3xor
A little bit of all those things -- it worked just fine, on the device, for 2
years, including the occasional eastern hemisphere TFR. It only happened with
a TFR close to the international date line, which made the ASCII-to-float
conversion run off the end of the string. I admit I didn't test that
eventuality -- but I was using a C library function that's been in constant
use for over 30 years.

There's no real weather data standard, XM Weather is a collection of formats
(nexrad radar comes over as a bitmap, weather reports are METAR encoded).

The Temporary Flight Restrictions are in XML -- well, not really XML. It's a
really nasty format designed by Jeppesen where you basically have to write
your own XML parser, because they made it impossible to parse with a standard
XML parser.

It just goes to show that even when you test the hell out of something, you
can still get blindsided by something you never thought of -- but once it
happens, it seems really obvious.

Remember the problem the F-22 had going across the international date line --
that code was a lot less tested than mine in that regard -- but they just
didn't think about it.

[http://www.defenseindustrydaily.com/f22-squadron-shot-down-by-the-international-date-line-03087/](http://www.defenseindustrydaily.com/f22-squadron-shot-down-by-the-international-date-line-03087/)

------
pmichaud
We're going to see more and more of these articles popping up as cars become
more automated.

To me it doesn't matter what the rate of failure is on automated systems, as
long as the rate is below human failure rate. In that case, we win. It also
doesn't matter whether humans can recover given a failure, again given that
the failure rate is already below the average human failure rate.

The problem is that humans are really hung up on the illusion of control. We
know that in a car that drives itself we're not in control, and the
prospect of that machine killing us without our intervention scares us. What
we're not so sharp on is that by taking control we are [hypothetically]
increasing the chances of our injury or death, and in a field of all manual or
all auto cars, manual control doesn't even favor skilled drivers since the
real danger for above average skilled/careful drivers is the /other/ drivers
on the road.

~~~
khafra
It's the Dunning-Kruger effect: Sure, if this machine's failure rate is below
the human average, it'll be safer for average humans. But I'm better off in
control, because I'm a far better than average driver--aren't you, and almost
everyone you know?

~~~
mmt
I don't think this is more than just superficially true for driving.

As another commenter pointed out, there are (at least) two causes of "better,"
one being more skilled/practiced and the other being more careful.

Presented with such a dichotomy, I doubt the average person would grossly
over-estimate his own skill, though he might over-estimate his own
carefulness. Anyone find any studies on this?

I've also, in casual conversations, used the proxy of "loving" driving. Very
few people love to drive unconditionally. For those who don't, eliminating
any mechanical aspect of operating the vehicle is potentially appealing,
since the whole experience is a kind of necessary evil.

Another proxy I've used is transmission preference in "stop and go" traffic,
though this is just a specific case of the mechanical side of driving a car.

------
RyanMcGreal
>Fifty-two minutes after leaving St. George’s, Bermuda, on its way to Boston,
the Royal Majesty’s GPS antenna cable became detached from the GPS antenna.
This placed the GPS in dead-reckoning mode, which does not take into
consideration wind or sea changes. The degraded GPS continued to feed the
ship’s autopilot. No one noticed the change in GPS status, even though the GPS
position was supposed to be checked hourly against the Loran-C radio
navigation system, which is accurate by roughly one-half to 1 nautical mile at
sea, and a quarter-mile as a ship approaches shore.

Is there some reason why the GPS device couldn't raise an alarm when it
dropped into dead reckoning mode?

------
RiderOfGiraffes
This is a fascinating article. We are in constant dialog with our customers
about how much the systems we provide should do automatically. They always
want it to be entirely automatic, and we keep telling them that they need to
keep their staff engaged, alert, and prepared for emergencies.

This is a fantastic article to help them understand, and I'll be researching
the incidents mentioned. I already knew two of them; the others will be
invaluable.

Thank you - I wish I could up-vote you many times.

------
xcombinator
I disagree with the article. There is no paradox in automation, just a desire
for sensationalism on the authors' part.

Every day there are thousands of accidents in the world (mainly involving
cars) caused by human error, like distractions, and if no automation existed
there would be dozens of airplane accidents every day, given the enormous
number of flights.

It's not human intervention that's needed, it's just something as simple as a
proper error reporting system. If an accelerometer fails and nothing happens,
just report ACCELEROMETER FAILED and the problem is solved. If a GPS fails,
just report GPS FAILED, and the problem is solved. Just swap in the redundant
accelerometer or GPS, and replace the broken one when in port.

This is what human beings do: when the information from one eye/ear/inertial
channel disagrees with the others, we get dizzy.

If someone designed the system badly, that's not the fault of automation, it's
the fault of the engineers. Debugging and testing is important.

~~~
RiderOfGiraffes
They're not saying that automation doesn't reduce accidents. They are saying
that the accidents that occur now are sometimes caused by the human operators
becoming deskilled because of the automation.

When you say:

    
    
        " ... it's just something as simple
          as a proper error reporting system."
    

you make it clear that you don't work in automated real-time systems. I do,
and it's more complex than you seem to think.

I know of a case similar to the grounding of the Royal Majesty. In that case,
again, the GPS antenna became disconnected. There was a proper error reporting
procedure that was defined and documented, and yet still no one fixed the
antenna, and still everyone trusted the AIS/GPS system even though it was
reporting as broken. In the same way that Windows users are "trained" to click
"OK" on every pop-up dialog, so they were "trained" by their day-in, day-out
experience to trust the system even though they'd been told it was broken.

You also say:

    
    
        Debugging and testing is important.
    

I don't think there would be many who would disagree with you, but are you
suggesting that the systems described had had no debugging or testing? Of
course not. They had been debugged and tested. Exhaustively. And knowing
something of the field, most likely rather more than you imagine.

Bugs remain, and when they surface, humans need to be able to take over
swiftly and accurately. Increasing automation makes it less likely that they
do so. Even if you've had extensive and intensive training, if you don't use
it for six months then you are unlikely to remember it all in an emergency.

That's the message of the article, and I endorse it, because I personally have
seen it happening.

~~~
xcombinator
Hello, I never said it was simple; I said it has to be simple. But the human
being is very complex, and error reporting is simple. Simple doesn't mean
"easy". IMHO engineers try to build complex systems they don't understand,
instead of simpler ones.

I have worked on automated real-time systems, and I was not bad at it. And
yes, I have seen real industries running on duct-tape fixes, with cables
hanging near moving parts -- a disaster waiting to happen -- but I would
never sign off on that, because if I let it go and something happened, I
would go to jail. I know what it is to stop a production line to do something
the "right way" with the owners' eyes on you while they lose thousands of
euros for every minute it isn't running.

That is the proper and original meaning of Murphy's Law: if you let something
happen, it will.

Even with proper training, people do stupid things just because they can, when
you let them. There was an airplane pilot in my family from the early days,
when 50% of his colleagues died in crashes. A friend of his died attempting a
"macho" demonstration for his girlfriend in his little plane. If you try to
do something stupid with a commercial airliner, it won't let you unless you
deactivate the automation (and explain why you did so).

------
billswift
>Checking the accuracy of the GPS system and autopilot perhaps seemed like a
waste of time to the watch officers, like checking a watch against the
Coordinated Universal Time clock every hour.

If the watchstanders didn't check it at least when they came on duty, and they
obviously didn't, they are criminally incompetent.

