
'I’ve become isolated': the aftermath of near-doomed QF72 - rosege
https://www.smh.com.au/national/i-ve-become-very-isolated-the-aftermath-of-near-doomed-qf72-20190514-p51n7q.html
======
mopsi
A similar thing happened with the captain of SAS MD-81 in 1993[1]. Clear ice
had formed on the upper side of the wings overnight, but it was not detected
during de-icing. Shortly after takeoff, pieces of ice broke off and hit the
engines, deforming fan blades, which disturbed the airflow and caused the
right engine to surge 25 seconds into the flight.

The pilots properly responded with thrust reduction to decrease engine stress,
but unknown to them, the MD-81 had been fitted with Automatic Thrust
Restoration[2], which kept increasing thrust to maintain the normal climb
setting. The increased thrust led to surging of the left engine too, until
both engines failed at 78 seconds.

The aircraft crashed into a field and broke into three parts (everyone
survived).[3]

Post-crash analysis determined that reduced thrust would have kept both
engines operational, and maintained thrust would've led to failure of only
the right engine, which would have allowed them to keep flying and perform a
safe landing. ATR increased thrust and caused both engines to fail.

The captain of the flight retired from flying, saying that he had lost trust
in airplanes.

[1]
[https://en.wikipedia.org/wiki/Scandinavian_Airlines_Flight_7...](https://en.wikipedia.org/wiki/Scandinavian_Airlines_Flight_751)

[2]
[https://patents.justia.com/patent/4662171](https://patents.justia.com/patent/4662171)

[3] [https://i.imgur.com/k4q0WMm.png](https://i.imgur.com/k4q0WMm.png) and
[https://i.imgur.com/KzPNhsU.jpg](https://i.imgur.com/KzPNhsU.jpg)

~~~
IfOnlyYouKnew
For every vindicated hero, there are probably dozens of pilots who would no
longer be alive without smarter systems.

Airline fatalities decreased 10-fold in the last 30 years, while total miles
travelled increased by the same factor. Flying today is 100x safer, and I
don't quite understand people's fetish with human control.

~~~
whyenot
Where do those figures come from? They appear to be completely inaccurate.
Looking here [1], while there has certainly been a reduction in fatalities
over the past 30 years, it's not even close to 10x. Maybe 2x or 4x. As far as
miles travelled, there has been at best a 2x increase over the past 30 years,
at least for the US[2].

Airline travel certainly has become safer, at least on a per-mile basis, but
100x safer? That's fantasy.

1\.
[https://en.wikipedia.org/wiki/Aviation_accidents_and_inciden...](https://en.wikipedia.org/wiki/Aviation_accidents_and_incidents#Statistics)
2\. [https://www.bts.gov/content/us-passenger-miles](https://www.bts.gov/content/us-passenger-miles)

~~~
IfOnlyYouKnew
Sorry, I keep calculating from 2000 because it's easier, but the error bars
on that are starting to grow. Make it 50 years:

[https://en.wikipedia.org/wiki/Aviation_safety#/media/File:19...](https://en.wikipedia.org/wiki/Aviation_safety#/media/File:1970-2018_fatalities_per_revenue_passenger_kilometre_in_air_transport_\(cropped\).png)

In any case, the larger point should be undeniably clear. Air travel is
getting safer and safer.

~~~
jplayer01
You've only established that flying is getting safer. None of this proves or
even implies that most of that improvement in safety is because of added
features like ATR, especially ones that subtly subvert expectations and remove
agency/control from pilots.

~~~
nick_kline
We need standards for ways to recognize when the computer controls have chosen
to override the pilot, why they are doing it, and a way for the human to
override. This is hard, not easy, because there are so many things going on:
there needs to be one place to look and report back.

Tesla has some of these things. It beeps at you when it detects you are about
to hit the car in front of you. If you are in traffic and it doesn't see that
your lane will be going around, say, a stopped car in a different lane, you
can easily override it. Similarly, the driver can override the autopilot, and
there is a kind of tactile feedback when you use the steering wheel to
override the self-driving. Still, it's far, far from perfect, the current
evidence being the front-end crashes into immobile barriers. But they have
these notions of feedback, control, and override built in. I expect every
self-driving vehicle handles these things differently. Eventually we'll have
standards across car lines. I bet planes are all different, with the amazingly
huge number of controls the pilot has to understand.

------
mjevans
The humans are required to be there for a reason, and that reason is so
important TWO are required to be inside the control cabin at ALL TIMES. If
they aren't given the final say in what course of action should be happening
at a given moment it's been designed wrong.

"For a pilot, loss of control is the ultimate threat. It's our job to control
the aircraft, and if computers and their software, by design, can remove that
functionality from the pilot, then nothing good is going to come out of that."

~~~
chrischattin
Pilot here. 100% agree. An autopilot isn't autonomous (as most of the public
thinks). It's more like flying the plane by pushing buttons vs manipulating
the flight controls. You dial up a heading or altitude and press execute. You
still have to watch it closely, and it does mess up quite often, e.g. it won't
capture a localizer, the altitude wasn't armed, etc. Procedure in the cockpit
is to call out altitudes like "1000 above" when leveling off.

I've flown an F-16, where the fly-by-wire system won't let you over-G the jet.
So, I think some level of automation is good in that regard. But it's very
subtle and you feel in control at all times. I would never fly an aircraft
that would actively take over control... because the automation messes up all
the time.

~~~
nabla9
HN has had two classes of aviation technology articles recently:

\- Horrible examples of automation going wrong: hard to design and easy to
get fatally wrong.

\- Autonomous app-based air taxis are coming, together with automated urban
air traffic control (Uber and others). The market is estimated to be worth
€1.5 trillion by a Morgan Stanley analysis.

What's your take?

~~~
yholio
Air taxis will become possible only when a quiet method of vertical take off
and landing is developed, if ever. No current VTOL lightweight aircraft would
be able to gain even experimental status in an urban environment, let alone
thousands of them running at all hours of the day. The noise levels propellers
make are absolutely horrendous.

~~~
westmeal
I've always wondered if blasting inverted phase sounds recorded from the
propellers (like noise cancelling headphones) would help a bit with the noise
problem.

~~~
jefftk
You can't cancel noise in general, only at specific chosen points. Noise
cancelling headphones can work because your ears are in a consistent place
relative to the headphones.
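
The point-wise nature of cancellation is easy to show numerically. Here's a
minimal sketch with invented geometry and frequency (two ideal point sources,
spreading loss ignored): an anti-noise source tuned to silence one chosen
point leaves a nearby point almost untouched.

```python
import math

C = 343.0      # speed of sound, m/s
FREQ = 1000.0  # Hz; roughly a propeller blade-pass harmonic (assumed value)
K = 2 * math.pi * FREQ / C  # wavenumber

def tone(dist, t, phase=0.0):
    # unit-amplitude tone observed after travelling `dist` metres
    return math.sin(2 * math.pi * FREQ * t - K * dist + phase)

noise_src = (0.0, 0.0)  # the "propeller"
anti_src = (0.5, 0.0)   # anti-noise speaker half a metre away

target = (10.0, 0.0)    # the one point we choose to silence
r1 = math.dist(noise_src, target)
r2 = math.dist(anti_src, target)
# phase that makes the anti-noise arrive exactly inverted at `target`
anti_phase = math.pi + K * (r2 - r1)

def rms_level(point, n=1000):
    # RMS of the combined field at `point` over one period of the tone
    total = 0.0
    for i in range(n):
        t = i / (n * FREQ)
        p = (tone(math.dist(noise_src, point), t)
             + tone(math.dist(anti_src, point), t, anti_phase))
        total += p * p
    return math.sqrt(total / n)

print(rms_level(target))       # essentially zero: silenced at the chosen point
print(rms_level((10.0, 5.0)))  # nowhere near zero: the phases no longer line up
```

The mismatch grows with frequency: at low frequencies the wavelength is long
and cancellation holds over a wide region, which is why headphone-style
cancellation works best on low-frequency drone.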

------
ralph84
> Qantas Flight 72 is descending in a nose-low altitude for just over 15
> seconds

Yes, aviation-ignorant editor, he really did mean to write attitude before you
changed it to altitude.

~~~
eecc
Uhm, you’re arguing with an automaton, a spelling checker.

Apt and ironic, given the circumstances

~~~
mprev
More likely a human sub-editor on a paper like the Sydney Morning Herald,
surely.

~~~
rosege
Yes I would think so too

------
ncmncm
This is heartbreaking. Only a markedly superior flight officer would be
affected the way he was, and it made him unable to continue. The flying public
loses one of its best protectors.

~~~
toss1
Exactly -- "The flying public loses one of its best protectors."

And the loss is the direct result of badly designed and badly implemented
software automation.

Yet, software continues to eat everything, and large swaths of the industry
praise a "move fast and break things" attitude about it. Even life-critical
industries consider it OK to make airframe changes rendering a passenger
aircraft unstable in parts of the performance envelope and patch it over with
software supposedly compensating for those new flight characteristics,
dependent on a single faulty sensor and no input sanity checking. After 346
people die in two incidents, they reconsider, only after forced by regulators.

This <it'll be OK, just patch it> attitude needs to be eradicated.

~~~
makomk
The thing is, the software automation _wasn't_ badly designed and badly
implemented. This isn't like Boeing's non-redundant MCAS system - a huge
amount of engineering work was put into making every part of the system
redundant and robust against erroneous data from a failing component and
verifying that it worked as intended. It's just that some really weird, rare
failure mode that no-one anticipated or ever found the cause of somehow
created a data pattern which foiled all those checks.

~~~
toss1
"making every part of the system redundant"?

Really? From every account I've read, it depended entirely upon a single
flight attitude sensor, and did no sanity checking whatsoever against other
data, sensors, or inputs. There was an _extra-cost-option_ for a second AOA
sensor and an obscure cockpit light that would tell when the two AOA sensors
disagreed, but those were not installed in either of the crashed airplanes.

It seems at the very least, such a critical system should have three primary
sensors with full algorithms to check for disagreement, plus checking against
the artificial horizon display inputs, airspeed, throttle, etc. to determine
if they were _actually_ in the part of the envelope where the MCAS would be
useful.

So, unless there are a number of checks & features that no account I've read
has mentioned, it's really bad design -- software.

Moreover, they could have decided to NOT design the airframe so that it would
have deadly characteristics requiring a software patch to hide.

Again, really bad design - airframe.

Or, they could have decided that this is a potentially critical and deadly
failure mode which properly required extra training and a new Type Rating for
pilots to fly the new aircraft. But instead they decided to try to bury it in
the same type rating and an hour of iPad training, so that their airline
customers would see a lower overall cost for the new model.

Again, really bad design -- process.

Perhaps you can point me to some documentation I'm missing here, but this is
what I've consistently gathered from the substantial number of articles I've
read.

~~~
makomk
Boeing's disastrous MCAS system relied entirely on a single AOA sensor and
did no real sanity checking, yes. The Airbus plane this article is about had
three AOA sensors fed through three redundant ADIRU modules, with the data
being checked and cross-checked by each of the three primary and two secondary
flight control computers, each of which also had an internal monitoring
channel running independently-written software on separate hardware checking
its internal calculations.
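
As a toy illustration of the voting idea (a sketch only; the actual FCPC
cross-check logic is far more involved and is described in the ATSB report),
a mid-value select lets one wild sensor be outvoted while still flagging
residual disagreement. Thresholds and values here are invented:

```python
def vote_aoa(a1, a2, a3, max_spread=2.0):
    """Mid-value select across three angle-of-attack readings (degrees).

    Toy redundancy voter, not the real Airbus algorithm: take the median
    so a single wild sensor is outvoted, and raise a fault flag when even
    the closest pair of readings still disagree by more than `max_spread`.
    """
    readings = sorted((a1, a2, a3))
    median = readings[1]
    # even after discarding the outlier, the remaining spread matters
    spread = min(readings[2] - readings[1], readings[1] - readings[0])
    fault = spread > max_spread
    return median, fault

# one failing sensor spiking to 50 degrees is simply outvoted
print(vote_aoa(2.1, 50.0, 2.3))  # -> (2.3, False)
```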

The reason Qantas Flight 72 only nose-dived twice, if I'm understanding the
incident report correctly, is that each nose-dive caused the internal
monitoring to trip and the responsible part of the flight control computer to
be faulted out for the rest of the flight. After the second nose dive, all
three of the primary flight computers had faulted, disabling the affected
flight control features.

------
rdiddly
Interesting how what is essentially a _success_ in handling a crisis can
still be traumatic. I seem to recall Chesley Sullenberger being on record
stating that he and the rest of the crew of that "miracle on the Hudson"
flight suffered from PTSD symptoms for weeks.

~~~
PaulHoule
It's not that hard to get.

I hit a dog with my car and had flashbacks and intrusive thoughts about it for
about three weeks.

------
usr1106
A write-up of the technical side
[http://avherald.com/h?article=40de5374](http://avherald.com/h?article=40de5374)

------
xfitm3
This really resonates with me:

"One thing is certain: the computers blocked my control inputs. For a pilot,
loss of control is the ultimate threat. It's our job to control the aircraft,
and if computers and their software, by design, can remove that functionality
from the pilot, then nothing good is going to come out of that."

------
sgt101
The lesson is that automation can fail suddenly, even in domains like
aerospace which are unusual in that they have an infrastructure of
professionalism and safety. If an automatic system fails and then leaves the
system in a position that humans can't recover from then the failure will be
complete.

This is not real supervision; automatic systems must be designed to relinquish
control before the situation becomes dangerous.

------
PaulHoule
Note that New Zealand regulators were inspired by QF72 to provoke the
following accident:

[https://en.wikipedia.org/wiki/XL_Airways_Germany_Flight_888T](https://en.wikipedia.org/wiki/XL_Airways_Germany_Flight_888T)

which reads like something right out of Perrow's "Normal Accidents" book in
that it was an incredible confluence of hardware and human failures.

~~~
mcguire
"Provoke"? They seem unrelated.

~~~
PaulHoule
The wikipedia article doesn't explain it well compared to the accident
reports.

The regulators were trying to test the envelope protection system under
stressful conditions and it turned out that the AoA vanes were non-functioning
which contributed to the crash.

Thus it is relevant to the later 737 MAX crashes.

It is a "normal accident" because of the human factors: e.g. no flight plan
filed because it was a test flight, regulators ordering the pilots to defy the
air traffic controllers, regulators ordering pilots to go forward with a test
they didn't want to do... And on top of it all a maintenance error.

------
nexuist
Aren't there standards on what is considered a safe flight maneuver? Small
adjustments here and there are fine, but sudden pitch downs like those
encountered on QF72 and with MCAS should be cross checked and analyzed by the
computer systems. This is the whole point of triple redundancy, not just to
maintain an accurate reading but also to detect when the reading accuracy has
failed so you can respond accordingly.
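
One form of cross-check gestured at here can be sketched as a rate limit on
what the automation may command, regardless of what the sensors claim. All
numbers below are invented for illustration, not from any real flight-control
specification:

```python
MAX_PITCH_RATE = 3.0  # deg/s, an assumed limit on commanded attitude change

def sane_pitch_command(current_pitch, commanded_pitch, dt,
                       max_rate=MAX_PITCH_RATE):
    """Clamp an autoflight pitch command to a plausible rate of change.

    Toy illustration of command sanity-checking: however wrong the
    upstream sensor data, the resulting command is limited to what a
    gentle manoeuvre could justify within one control cycle of `dt` s.
    """
    max_step = max_rate * dt
    delta = commanded_pitch - current_pitch
    if abs(delta) > max_step:
        delta = max_step if delta > 0 else -max_step
    return current_pitch + delta

# a faulty sensor demanding an 8-degree nose-down within one 0.1 s cycle
print(sane_pitch_command(2.0, -6.0, 0.1))  # -> ~1.7: limited to a 0.3-degree step
```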

Of course for every QF72 there are probably a bunch of injuries and deaths
attributed to pilot error, which is why the computers are there, but at some
point there has to be a middle ground. If you defer control to the computers
in all situations, what's the point of the pilots?

If these computers and their software are smart enough to make life critical
decisions, why can't they make the most important decision of all: should I
stop?

~~~
pmyteh
Yes, there are standards. The full Australian accident report finds that the 3
times in 29M flying hours that this issue occurred were within the standard
(for a non-catastrophic issue, as they found this was). The systems did indeed
cross-check, but there was an algorithmic edge case where the incorrect data
was inappropriately used anyway, as happened here.
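
To make the edge case concrete, here is a loose sketch of how a
hold-last-good-value spike filter can be defeated by a well-timed second
spike. This is a simplified invention for illustration, not the real FCPC
algorithm (the ATSB report describes a 1.2-second memorisation period with a
broadly similar weakness); the thresholds and timings are made up:

```python
HOLD_TIME = 1.2  # seconds to reuse the last "good" value after a spike

class SpikeFilter:
    """Toy hold-last-good-value spike filter (illustrative only).

    On a suspected spike, reuse the previous sample for HOLD_TIME, then
    trust the input again. The flaw: a second spike timed to land just as
    the hold window expires is accepted as the new "good" value.
    """
    def __init__(self, jump_threshold=5.0):
        self.jump_threshold = jump_threshold
        self.last_good = None
        self.hold_until = -1.0
        self.resync = False

    def step(self, t, value):
        if self.last_good is None:
            self.last_good = value
            return value
        if t < self.hold_until:
            return self.last_good  # still masking the first spike
        if self.resync:
            # hold window over: accept the next sample unconditionally.
            # This is the edge case -- a well-timed second spike becomes
            # the new reference and flows through to the consumer.
            self.last_good = value
            self.resync = False
            return value
        if abs(value - self.last_good) > self.jump_threshold:
            self.hold_until = t + HOLD_TIME  # suspected spike: hold
            self.resync = True
            return self.last_good
        self.last_good = value
        return value

f = SpikeFilter()
print(f.step(0.0, 2.0))   # -> 2.0   normal data
print(f.step(0.5, 50.0))  # -> 2.0   spike detected, last good value held
print(f.step(1.0, 2.0))   # -> 2.0   still inside the hold window
print(f.step(1.8, 50.9))  # -> 50.9  second spike after the hold: accepted
```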

It also did give control back to the pilots: after the second nose-down, the
control law reverted as designed to 'alternate' (a degraded state) which
removed the part of the envelope protection causing the nose downs. And the
pilots also correctly switched to the backup ADIRU, which disconnected the
faulty data from (most of) the aircraft's systems.

For me, the interesting part of the article is the severe consequences for the
pilot's mental health. A serious, but non-catastrophic problem occurs, and is
skillfully dealt with by the crew (whose training is designed for exactly
this). There are a number of injuries, but none are life-threatening. But the
pilot's sense of responsibility, the feeling that it could all have been much
worse, and a loss of faith in the aircraft, results in long term disability
despite the 'successful' result.

~~~
dbt00
Yeah, I found that pretty moving. It's worth pointing out he continued to fly
for 8 more years between the incident in 2008 and his retirement in 2016, but
it seems like he was really struggling most of that time.

------
rosege
It's not only Boeing that's had issues with automation:

"I've learnt from these events, but none have generated the body response or
trauma that this one has on October 7, 2008. This scenario involving
computers, denial of control and potential mass casualties is at a different
level. It seems we've survived a science-fiction scenario, a No Man's Land of
automation failure on an unprecedented scale."

~~~
CydeWeys
At least none of these incidents killed anyone. Boeing's automation (MCAS
specifically) has a lot of blood on its hands.

~~~
rosege
Yes good point! But it seems by luck/good piloting that it didn't kill anyone.

~~~
makomk
Less luck, more good design. The accident investigation report reckons that
this was close to the worst-case scenario: the first pitch-down was almost at
the maximum possible, the other fault protection systems made it unlikely that
more than two such uncommanded pitch-downs could happen in a flight, and it
wasn't possible for this to happen anywhere near close enough to the ground
that the pilots wouldn't be able to recover even if they panicked.

------
craftinator
Anyone know why autopilot cutoff switches aren't a thing? I know these
aircraft aren't designed for pure manual operation, but I don't understand
why there is not a way to downgrade the level of computer control and turn
off all but the simplest autopilot functions. Anyone here in the know on this?

~~~
gbacon
They are: even autopilots in general-aviation airplanes commonly have a
disable button on the control yoke, a button on the panel, and a circuit
breaker to cut George's power.

~~~
craftinator
Ah, I was more alluding to the automatic system that caused the crashes (MCAS,
I believe), but after a bit of research it seems this is not an autopilot
function, more of a flight-assist function. Still, it kinda blows my mind that
a system that can forcibly counteract the pilot's control could come without
an off switch. I'd feel uncomfortable if my car didn't have an off switch for
traction control.

------
snowwindwaves
An automation failure is scary because nobody understands what is going on or
what can be done about it. Mechanical failures also occur, and that risk is
accepted. What are the odds of a bird strike? What are the odds of a bird
strike being dangerous or catastrophic? Although Boeing isn't helping, the
odds or number of occurrences of mechanical failures with bad outcomes are
probably way higher than for electronics or automation.

Is it a similar situation to coal and nuclear power? Better the devil you
know.

~~~
simion314
Related to Boeing: old stats may be irrelevant, since a big change in the
process happened. The MAX plane updates were rushed, corners were cut, and
pilots were not informed of the changes. On top of that, after the first
crash Boeing showed that they would rather risk further crashes than ground
the planes until the fix was ready. And even then, after the first crash,
pilots did not know that the disagree lights wouldn't turn on on some planes.
It looks to me (and I hope the investigation will find the proof) that Boeing
hoped they could push out an update fast enough before the next crash, and
that with some luck the first crash's cause would not be conclusive and they
could blame the pilots or the airline.

------
craftinator
This story brought to you by Boeing; please try out our new 737 Max at your
next opportunity!

------
guicho271828
related article
[https://en.wikipedia.org/wiki/Qantas_Flight_72#FCPC_design_l...](https://en.wikipedia.org/wiki/Qantas_Flight_72#FCPC_design_limitation)

------
airthrowway2
I believe the pilots should intimately understand the systems and the
automation, and have the ability to understand what the plane does and why,
especially when it is wrong. In this case, knowing the FBW envelope
protections of Airbus, it was clear that the system was responding to a high
angle of attack situation to avoid stalling. Of course it was wrong, and
after disconnecting the autopilot the pilot should have immediately turned
off 2 of the 3 ADIRUs to force the plane into alternate law, where those
protections are off.

------
tomohawk
As we go through life, we place faith in things based on past experience or on
what other people say. In this case, the pilot lost faith in the aircraft
automation and could not ever really trust it again. He couldn't live in the
illusion that hurtling along at 600mph with computers in control is inherently
safe at all times. His new reality is that computers bear watching at all
times. That sort of hyper vigilance has got to be extremely taxing.

It's important to not place faith in things.

------
checker659
What are the odds Boeing paid for this article to be published? This incident
happened in 2008.

[https://en.wikipedia.org/wiki/Qantas_Flight_72](https://en.wikipedia.org/wiki/Qantas_Flight_72)

~~~
spuz
Seeing as the article doesn't specifically say anything negative about Airbus
and it's an extract from a book released this month, I would find that
incredibly unlikely.

~~~
eecc
Indeed, it reads as though, as the final bits of data streamed out of the
recordings and Airbus realised what a lucky break they'd had when no one died
from this f*ck-up, they rushed to fix the software and issue new procedures
to turn off the dodgy computers.

In the more recent cases, with 100% death counts, I read unsubtle accusations
that third-world pilots don't know how to fly planes.

