
The Hazards of Going on Autopilot - krallja
http://www.newyorker.com/science/maria-konnikova/hazards-automation
======
ghjm
I would like to object to the following statement from the article:

> While lifeguards are taught all the possible signs of a person who is
> drowning, pilots don’t receive elaborate training on all the things that can
> go wrong, precisely because the many things that can go wrong so rarely do.

In fact, pilots are given extensive training in the failure modes of their
airplanes. When Casner got his private pilot license, he should have been told
to memorize the emergency procedures for his airplane. De novo private pilots
are not given the same level of training as airline pilots, but failure modes
and "interesting-looking instrument panels" are crucial components of
instrument training.

More generally, I don't disagree with his conclusions regarding the problems
with cockpit automation, but they are hardly novel - the industry has been
aware of this for years. However, the service life of an airliner is measured
in decades, and the regulatory environment, for excellent reasons, doesn't
allow us to change the avionics on existing airliners without re-certifying
the new systems and re-training all the pilots.

Last but not least, I'd like to point out that even if the airplane is on
autopilot, there is still plenty for the pilots to do. For example, they are
supposed to brief the instrument approach procedure and the missed approach
procedure every single time they fly it. A lot of pilots relax this discipline
(if you've flown into the same airport hundreds of times, it's hard to argue
that re-reading the procedure _yet again_ has much value). But relaxing it to
the point of just idly chatting all the way down the glideslope? That's not a
problem with the cockpit systems, and it's hard to see how better human
factors would have magically turned these incredibly irresponsible people into
good pilots.

~~~
the_af
Isn't the argument from the article that the pilots are just idly chatting all
the way down _because of_ excessive automation?

If, like the article claims, human error is the leading cause of accidents
(and I assume it is implied that distraction is a major factor in human error,
though the article doesn't explicitly claim this), isn't mindless automation a
major problem that should be addressed? To quote the NASA researcher: "Companies
were introducing increasingly specialized automated functions to address
particular errors without looking at their over-all effects [...] As it stood,
increased automation hadn't reduced human errors on the whole; it had simply
changed their form."

I don't know if it's accurate to call these pilots "irresponsible people".
Probably it is. But how does this help in reducing accidents?

~~~
gwern
> Isn't the argument from the article that the pilots are just idly chatting
> all the way down because of excessive automation?

Yeah, but there seems to be an implicit argument appended: "and this causes
more accidents than the automation prevents". I'm not sure about this one.
Aren't jet planes safer to fly in than ever? I was under the impression that the
death rates kept going down. So the backfire effect from this automation can't
be _too_ bad or else net safety wouldn't increase.

I think you could turn it into a weaker, more defensible argument: 'and the
mind-wandering sets a bound on how safe air travel can ever get with human
pilots, because automation itself introduces human error'. But when you make it
explicit like that, it starts to look like an argument for taking humans out
of the loop entirely...

~~~
ekianjo
In the near/mid-term it would probably be a good course of action to have
entirely automated airplanes, and "pilots" (or technicians or whatever you
would call them) specialized in handling automation failures. That way, there
would be no question about what their role is in the plane.

~~~
liotier
Pilots are all about handling failures - that is pretty much what their whole
career is focused on. The normal operation of a modern aircraft is pretty
boring - what makes a pilot worth his salary is all the continuous training
and drilling that embeds emergency procedures in his mind at the reflex
level... It is a lifelong process, regularly updated through new hardware and
modified methods in response to incidents.

------
sokoloff
Colgan 3407 crashed because of a profound and astounding lack of basic
airmanship and knowledge of the effects of icing on aircraft performance.

Yes, the autopilot helped mask the knowledge problem and presented them with a
situation where that demonstrated lack of knowledge (that it takes more power
to fly an airplane with ice on the wings) contributed to a stick shaker event
(a warning of impending aerodynamic stall due to excessive angle of attack)
whereupon the captain (and it's hard to use that term given the airmanship
circus that ensued) decided to hold the stick back, exacerbating the stall
condition rather than doing what every 5-hour pilot is taught to do upon a
stall warning: level the wings, lower the nose, and add power.

The autopilot is a minuscule contributor to that accident, IMO. At most, the
presence of autopilots allowed unqualified airmen to not wash out of airline
jobs, but to blame that crash on the autopilot is to misunderstand how non-
complex the situation was that swallowed the crew and all aboard.

~~~
dba7dba
> Colgan 3407 crashed because of a profound and astounding lack of basic
> airmanship

I agree. I think the New Yorker just picked the wrong incident to try to
advance its theory. The particular pilot on the Colgan 3407 flight had FAILED
a check ride at another airline. I understand this check ride comes after you
have already flown hundreds/thousands of flights, and he somehow failed it.

~~~
ubernostrum
It's not just the Colgan incident; investigation into the Air France 447 crash
concluded the crew did not recognize a stall, and may have been relying on an
incorrect belief that the aircraft would not accept inputs that would produce
a stall (which, for Airbus aircraft, is true except in alternate-law
situations which occur when -- as with AF447 -- the flight systems notice a
fault in data).

~~~
liotier
AF447 was also a case of incorrect stall response, but for different reasons:
the pilot was incorrectly second-guessing what the plane was trying to tell him.

------
damian2000
I don't know if it was mentioned in the article, but after the stick shaker
there was also an automatic "stick pusher" event, whereby the aircraft tried
to automatically recover from the impending stall when the pilots didn't. The
pilot overrode it by pulling back on the stick, thereby putting the aircraft
into a stall.

------
Strilanc
An off-the-wall idea: have the pilot duplicate work being done by the
computer - not to control the plane, but just to keep them alert.

i.e. imagine a driverless car that only works when you tell it to do
approximately what it was going to do anyway. Otherwise it beeps at you and
pulls over.

~~~
damian2000
I think pilots are already meant to do that to a certain degree - i.e. monitor
their instruments and navigation, and understand what the autopilot is doing
at all times. I think it's called situational awareness.

------
ejdyksen
See also, Children of the Magenta, a fantastic talk from American Airlines in
the late 90s:

[http://vimeo.com/64502012](http://vimeo.com/64502012)

(Magenta refers to the color of the Flight Director/Autopilot's projected
path on the displays)

------
phrogdriver
The suggestion that complacency due to automated aircraft systems is somehow
not acknowledged or well-known is simply false. It is a well-documented
occurrence that we train for, both in terms of recognizing it in ourselves and
others, and in combating it. Most of the "we didn't get to fly with all of
this newfangled autopilot" talk is as misguided as the "we should add another
computer to solve this complacency problem too" response.

------
laggyluke
Makes me wonder: how often do incidents like Colgan happen that don't result
in a crash?

Are they even public? Investigated?

------
chanon
The main problem (quote from the article):

"What we’re doing is using human beings as safety nets or backups to
computers, and that’s completely backward"

Made me stop and think a bit when I read this, very true.

~~~
snowwrestler
I don't think it's backward; humans are much more flexible and capable than
computers, so it is right that they back up the computers, not vice versa.

~~~
LnxPrgr3
Computers overriding human instructions have also caused crashes.

[http://en.wikipedia.org/wiki/Scandinavian_Airlines_Flight_75...](http://en.wikipedia.org/wiki/Scandinavian_Airlines_Flight_751)
\-- engines ingested ice and started surging. The pilots commanded reduced
power, but the Automatic Thrust Restoration system, which no one had told them
about, countered that move, leading to a double engine failure and a crash
landing.

[http://en.wikipedia.org/wiki/China_Airlines_Flight_140](http://en.wikipedia.org/wiki/China_Airlines_Flight_140)
\-- pilot accidentally flipped the takeoff/go-around switch, leading the
autopilot to counter the crew's attempt to recover, causing a stall and crash.

