
How I Almost Destroyed a £50M War Plane and the Normalisation of Deviance (2016) - dddddaviddddd
https://fastjetperformance.com/podcasts/how-i-almost-destroyed-a-50-million-war-plane-when-display-flying-goes-wrong-and-the-normalisation-of-deviance/
======
jacquesm
It's really nice to have a term to pin to a vague concept that had been
floating around in my head for at least a decade. I've come across instances
of this repeatedly but never knew quite how to describe it properly to others.
It's a treacherous slope where the first step usually goes completely unnoticed,
and by the time people do notice, it is way too late. "We've always done it like
this" is the oft-heard excuse, but on closer examination that usually isn't
true. Typically, at some point in the past there was a much tighter regime, and
the present-day situation would definitely not have passed muster. But by
inching towards it in very small steps, critical thresholds can be exceeded
without notice.

Thanks for posting this. It will help me professionally to formulate my
opinions on such situations more clearly and will serve as a reference point
for future discussion.

~~~
PaulHoule
I think for every 100 people who talk about Vaughan's _The Challenger Launch
Decision_ there are maybe 2 or 3 who have actually read it.

Like many books in sociology it comes at the problem from multiple
perspectives (in different chapters) such that you can't really hold the
author to any one clear conclusion.

Big picture, the space shuttle was "unsafe at any speed" from the very
beginning. It was estimated that it had a 3% or so chance of blowing up on any
flight which turned out to be just about right in hindsight.

One point of the book was that the Space Shuttle had many things wrong with
it: they had a committee whose job it was to review all the safety risks and
decide which ones they could accept. "Normalization of deviance" was not a
bottom-up matter of bad practices, but a formal part of the organization.

If things had gone a little differently with the O-rings then perhaps the
"Challenger Disaster" wouldn't have happened, but then it would have been
something else. I remember in the early days the thermal tiles were falling
off, but after they had done a few flights they convinced themselves it was
OK. It was the Columbia disaster that proved that they weren't OK and couldn't
ever be OK.

~~~
mcguire
" _...they had a committee who 's job it was to review all the safety risks
and decide which ones they could accept._"

Having a committee dedicated to assessing risks seems like a good idea. Having
a group consciously deciding between risks you can live with, risks you have
to live with, risks you can mitigate, and risks that stop everything is a
fundamentally good idea.

The problem is taking risks without thinking about them, or taking risks
without knowing exactly why they haven't killed you before. And even more
unfortunately, this is all on an experimental project That Cannot Ever Be
Allowed To Fail, Ever™.

~~~
Zenst
Do committees who do such reviews equally ask themselves: if this does fail,
can we live with it, justify our initial risk assessment, and stand by it?

So often risky decisions are made, and many pass without incident, yet problems
still happen. Such risks tend to be reviewed only in hindsight, under the
weight of seeing the risk become reality. When they don't fail, nobody gives
them a second thought, and over time such risks become normalised.

~~~
icegreentea2
That's a fair point, but the fundamental fact is that at some point,
someone/something/some process needs to accept a level of risk, or perform
some type of cost/benefit analysis. Sometimes the analysis is straightforward
or easy to criticize, for example when the costs are high and the benefits
relatively low. It's super easy to criticize the shuttle disasters, because
the cost of cancelling the launches is so obviously lower than the lives of
the astronauts (at least in popular discourse).

The issue is that all mechanisms for doing so are susceptible to some variant
of the problem you highlighted. As highlighted in the OP's submission,
individuals can make the exact same mistakes.

I feel that a lot of this comes from the fact that we are constantly
encouraged to over-quantify risk. That is, we take things where we really
don't have a good grasp of the real numbers, and then make something up. If
we're good, the numbers will even be in the right ballpark. If we happen to
follow the course of action and everything works "just fine", we go back to
our mental risk model, which we always knew was kinda made up, say "well, I
guess I guessed wrong", and assign a lower risk.

~~~
Zenst
Yes, it does come down to risk factors, and the parallels with the
reinsurance/insurance market and its actuaries are not lost on me. As we know,
you can't put a price on a human life - but they do in the end.

------
ramshanker
This was a really good reminder of engineering wisdom.

Over the years I find myself and my team (civil-structural engineers) doing
riskier and riskier designs - things we wouldn't have considered OK during our
early years. We do get reviews and approvals from seniors, but maybe their
acceptance is also the "normalisation of deviance" described in the article.
Sharing it with my team.

~~~
Zenst
Would it be that the norm was to over-engineer (thinking mostly of Victorian
times, with structures still standing and in use today), compared with the
present, when we have finer insight into the maths, which allows us to engineer
just enough?

Or is it market/management pressure pushing the limits that is the driver?
Those pushing the limits gain a competitive edge, which draws others into
competing in the same way.

------
gchamonlive
When the checklist a pilot has to go through before taking off was being
described, it resembled, though quite crudely and without life-or-death
implications, the lengthy checklist the developers at our company had to
follow back when we deployed everything manually. We had synchronization steps
to pull modifications, a couple of compilation steps, checks that
configuration files had been added to accommodate extra services, cache
mechanisms that had to be cleared, then database migrations to check for,
system restarts, the few little tests we had, and a manual check that nothing
had broken. Code conflicts were common - 40 to 50 of varying difficulty every
release. It was insane and never quite worked; we dreaded release days and
never wanted to go through them, as we didn't trust the system (with good
reason).

Automation and good practices brought back confidence, so I was wondering two
things.

First, most obviously: how do pilots come to trust a system with their lives
when it has these layers of manual safety checks? And second, can't it all be
automated? I understand aircraft must have independent modules and every level
of integration implies extra risk, but at least for safety-critical tests,
automation would go a long way in ensuring crew safety, wouldn't it? Is it too
hard? Are the systems not flexible enough to implement automatic safety
checklists, or is it just the culture in the field to check it all manually?
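The release checklist described above maps naturally onto the automation question. As a minimal sketch in Python (step names and commands are invented for illustration, not the original pipeline), the idea is simply to run each step in order and abort on the first failure, the way a pilot aborts on a failed checklist item:

```python
import subprocess

# Illustrative release steps as (description, shell command) pairs.
# The commands are hypothetical stand-ins for the manual process above.
RELEASE_STEPS = [
    ("pull latest modifications", "git pull --ff-only"),
    ("compile", "make build"),
    ("verify config files for new services", "make check-config"),
    ("clear caches", "make clear-cache"),
    ("apply database migrations", "make migrate"),
    ("restart services", "make restart"),
    ("run smoke tests", "make smoke-test"),
]

def run_release(steps):
    """Run each checklist step in order; abort at the first failure."""
    for description, command in steps:
        print(f"[ ] {description}: {command}")
        if subprocess.run(command, shell=True).returncode != 0:
            print(f"ABORT: step failed: {description}")
            return False
        print(f"[x] {description}")
    return True

# Usage: run_release(RELEASE_STEPS) returns True only if every step passed.
```

The payoff isn't the loop itself but that the checklist lives in one reviewable place, so skipping a step becomes impossible rather than merely discouraged.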

~~~
miketery
> First, most obviously: how do pilots come to trust a system with their lives
> when it has these layers of manual safety checks? And second, can't it all be
> automated?

The trust comes from understanding the system and the associated risks that
contribute to conditions where the system is out of its normal operating range.
Pilots learn that doing simple checks on the parts of the system that are
likeliest to contribute to abnormal operations is life-saving in the long run.
E.g., it may seem trivial that the control locks weren't taken off, but it
happens, and it's why we check the controls prior to takeoff. It's why we put
the engines to full throttle to make sure oil pressure is as expected.

Anytime there is an accident, many pilots will read the reports and add an
additional check, or re-emphasize an existing check that they need to keep
using.

As far as automation goes: it could fix it. But it increases system complexity,
so there will be a valley where, for a while, the automated system isn't much
better than manual checks - in fact it might be worse due to over-reliance.
Also, logic in software or hardware can lie to you in ways that the mechanical
realities of moving the control surfaces, checking the oil level, or checking
tire pressure can't (at least not with a cheap solution). This is why, for
critical components, the FAA requires multiple systems doing the same thing
plus consensus algorithms.
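That last point can be illustrated with a toy consensus voter over redundant sensor channels - a hypothetical sketch, not any actual certified avionics algorithm: take the median of the redundant readings as the consensus and flag any channel that disagrees with it beyond a tolerance.

```python
from statistics import median

def vote(readings, tolerance):
    """Median-based consensus over redundant sensor channels.

    Returns (consensus_value, indices_of_disagreeing_channels).
    A toy illustration of redundancy plus voting, not certified avionics.
    """
    consensus = median(readings)
    faulty = [i for i, r in enumerate(readings)
              if abs(r - consensus) > tolerance]
    return consensus, faulty

# Three redundant airspeed channels; channel 2 has failed low.
speed, bad = vote([251.0, 250.0, 120.0], tolerance=5.0)
# speed == 250.0, bad == [2]: fly on the consensus, flag the outlier.
```

With three channels, a single lying sensor is outvoted; that is why redundancy tends to come in odd numbers.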

Recently I bought a lemon of a motorcycle. The ECU started giving me error
codes twenty-five minutes into my first ride. The engine started cutting out,
and I couldn't restart it. In a Cessna, on the other hand, I could lose all
electrical power (assuming the magnetos are still operating) and still operate
the plane just fine, albeit without radio comms. The Cessna is from 1972, while
the motorcycle is from 2017. Granted, I could've checked the wires or inspected
more carefully, but nonetheless, the 2017 motorcycle is the more complex
system.

~~~
phkahler
The motorcycle engine is LESS reliable than a well maintained Cessna. But I
suspect the motorcycle was not well maintained, perhaps not well designed, and
you should not have been driving it.

~~~
jacquesm
Short of a teardown I don't see how the buyer of a second hand motorcycle
would have a chance at spotting such a fault without driving the bike. And
even a teardown might not show the fault, in fact could make it worse.

~~~
lmm
That's equally true for an aeroplane engine - the difference is that no pilot
would skip performing maintenance at the required intervals and there are
maintenance logs etc. in place to enforce this. What makes the Cessna engine
more reliable than the motorbike engine in practice isn't a simpler design but
a better maintenance and operation culture.

~~~
jacquesm
Sure. But the accepted norms for buying second hand aircraft versus second
hand motorcycles are quite different as well, which is why it makes no sense
to say that the buyer should not have driven the motorcycle. The accepted norm
for cars and motorcycles is that you pay the seller and you drive them home,
not that you inspect the detailed maintenance log or do an on-site teardown.
Likely the seller was aware that this bike had problems and simply did not
tell the buyer - something that is close to fraud - and on top of that it
placed the buyer in a potentially dangerous situation, especially on an
unfamiliar bike.

When I buy cars I do a pretty thorough inspection and an extended test drive.
Even so, I once ended up with a car whose clutch pedal would not return once
every few hundred presses or so - something that, by chance, did not happen
even once in a pretty long test drive.

------
Xylakant
About 2 weeks ago a German Luftwaffe Global 5000 business jet missed the runway
and nearly crashed during an emergency landing. The currently suspected cause:
faulty maintenance and incomplete preflight checks.
[http://www.airliners.de/in-schoenefeld-jet-flugbereitschaft/...](http://www.airliners.de/in-schoenefeld-jet-flugbereitschaft/49799)

------
gav
An interesting paper about systems on the edge of failure is "'Going solid': a
model of system dynamics and consequences for patient safety" by Cook and
Rasmussen:
[https://qualitysafety.bmj.com/content/14/2/130](https://qualitysafety.bmj.com/content/14/2/130)

There's a good overview in this talk: "Architectural Patterns of Resilient
Distributed Systems" by Ines Sombra
[https://youtu.be/ohvPnJYUW1E?list=PL1Fqq0rxeNehyA4eWbuCwizkq...](https://youtu.be/ohvPnJYUW1E?list=PL1Fqq0rxeNehyA4eWbuCwizkqLeL9-6aE&t=589)

------
fuzzy2
Nice read. I often notice something similar during software development. It's
simple stuff like "I have to do a full rebuild or it won't work" or "I have to
build twice", and so on. I cannot understand how others can simply ignore
things that could indicate a fundamental problem.

It's fucking terrifying.

~~~
dboreham
I notice this too. However, I've come to see that the build-twice folks are
cranking out code and getting praise from management while I'm finding some
bug in yarn and not getting any actual work done.

------
tomohawk
> Before you try and change the world, just have a look at the foundation from
> which you are starting.

So true. People who think they're doing the right thing can cause so much
damage.

------
voltagex_
Podcast seems to be 404 on libsyn. I was worried it'd gone Spotify exclusive
but the correct feed URL is
[https://feed.pippa.io/public/shows/5bb78b332630bb48399f39fe](https://feed.pippa.io/public/shows/5bb78b332630bb48399f39fe)
and the episode URL is
[https://feed.pippa.io/public/streams/5bb78b332630bb48399f39f...](https://feed.pippa.io/public/streams/5bb78b332630bb48399f39fe/episodes/5bb78b4fac8922f843285be0.mp3)
(as an alternative to the linked article).

------
coldcode
I see this sort of complacent deviance all the time at my job. Thankfully we
can't kill people with it. But insular organizations seem to thrive on doing
things "their way", thinking it right, when in fact to anyone else it would be
total lunacy. If people on HN saw what we did, I think they'd have a good
laugh.

------
RangerScience
This reminds me a lot of this idea:
[https://www.lesswrong.com/posts/Kbm6QnJv9dgWsPHQP/schelling-fences-on-slippery-slopes](https://www.lesswrong.com/posts/Kbm6QnJv9dgWsPHQP/schelling-fences-on-slippery-slopes)

"Normalization of deviance" would be the mechanism by which a slope becomes
slippery; a schelling fence (in this case... a checklist?) is then a possible
fix, even if it's just to trigger a re-evaluation of "normal"

------
blunte
I'm amazed that, outside a wartime situation, flying a plane with a known
equipment failure (especially of that nature) was allowed. And if it wasn't
allowed, I'm surprised it didn't end in the pilot being grounded.

~~~
viraptor
It wasn't just the pilot; it was also the few people who heard the idea and
said it was OK. You can't fix systemic issues by punishing a selected few, and
aviation seems to be pretty good at recognising that.

------
DoofusOfDeath
IMO the article is excellently written and very compelling. I suggest people
read it for enjoyment, despite the dry title.

~~~
MatthewWilkes
What do you do on a daily basis that makes "How I Almost Destroyed a £50
million War Plane" sound dry?

~~~
DoofusOfDeath
Good point. I was focusing on the "... Normalization of Deviance" part when I
wrote that.

------
intertextuality
A lot of incidents seem to arise from normalization of deviance + self-
regulation (especially in the moment). And hubris.

The author's situation reminds me of the Tenerife airport disaster - absolutely
macabre and sobering. [0]

[0]:
[https://en.wikipedia.org/wiki/Tenerife_airport_disaster](https://en.wikipedia.org/wiki/Tenerife_airport_disaster)

------
awinter-py
First heard this term in the danluu 'wtf wtf' blogpost -- so useful. Also
'fast jet performance' is such a cool podcast topic.

~~~
kristianp
This one: [https://danluu.com/wat/](https://danluu.com/wat/)

Great articles there.

------
dsfyu404ed
Everyone here seems to be caught up in waxing poetic about how the
normalization of deviance is a bad thing and talking about preventing it, but
there's another side to that coin. If everybody is breaking the process 9 times
out of 10, it's not "normalization of deviance" - it's that you've got a
half-baked process that sucks, and the people on the ground know it and work
around it. Obviously you could fill a library with all the material written on
how to write organizational policy, but it's not as black and white as
"violating policy = bad".

------
vermontdevil
_Think of the global financial crisis of 2008 when the economy collapsed
because the banks hadn’t been properly regulated because they convinced the
authorities that they could do it themselves._

Also Boeing and the MCAS fiasco

------
toss1
Excellent read and a key concept.

I'd had a general idea about this, but it was crystallized by a vehicle
dynamics engineer and racer who mentored me in both engineering and race
driving. He said:

"You need to know the difference between skill and getting away with
something."

I.e., just because you survive an event does not mean that it is reliable. If
it is based on skill, you can replicate it and reliably survive. If it is
based on luck, well... take the bit of grace you were given by the Gods and
learn your lesson not to do it again. Mostly, THINK.

------
DoctorOetker
"How I Almost Destroyed a £50M War Plane and the Normalisation of Deviance"

versus

"How I Almost Destroyed a £50M War Plane [while nearly killing myself and my
colleague] and the Normalisation of Deviance"

~~~
lmm
Yeah I thought that was interesting. Perhaps an artifact of how soldiers are
trained to regard their own lives versus their equipment?

------
eridius
"Normalization of Deviance" sounds like the name of a ship from the Culture
series.

------
calabin
Tried to read your article, but the way the email prompt loaded made it
impossible to dismiss so I entered a false address.

I'm running the latest version of Chrome on Ubuntu 16.04 - hope that helps you
diagnose this issue.

Here's a screenshot: [https://imgur.com/kwXdIsa](https://imgur.com/kwXdIsa)

------
vmh1928
The concept of Normalization of Deviance was mentioned in this thread back in
March, about some comments a software engineer made on Twitter regarding the
737 Max crashes.

[https://news.ycombinator.com/item?id=19414775](https://news.ycombinator.com/item?id=19414775)

------
chriselles
Complacency creep.

Talking to a few Royal Australian Air Force folks last year, apparently they
had a similar-ish culture of “she’ll be right” arrogance/confidence, and then
they had a cluster of accidents that caused a foundation-level rethink and
rebuild of safety/process culture in every flight and every high-risk
activity.

It sounds like accident numbers have collapsed since then.

Caveat: aviation fanboy writing here, not a qualified pilot/tech/engineer.

------
dang
A few past related discussions:
[https://hn.algolia.com/?query=Normalization%20of%20Deviance%...](https://hn.algolia.com/?query=Normalization%20of%20Deviance%20points%3E20&sort=byDate&dateRange=all&type=story&storyText=false&prefix=false&page=0)

------
burfog
What I find odd is that he chose to get his 0g flight by flying a parabola, in
the style of the vomit comet, instead of just flying in a roll - for example,
rolling by 200 degrees and then raising the nose by about 20 degrees (one
probably can't just roll 180 degrees without stalling).

------
pacaro
In my experience this also applies to dating: it's really easy to overlook
things in the early stages of a relationship (or what might become a
relationship) that turn out to be quite significant later.

------
nan0
This read reminded me of one of the famous TOP GUN movie quotes:

"A good pilot is compelled to always evaluate what's happened, so he can apply
what he's learned. Up there, we gotta push it."

~~~
bond
Top Gun...

------
spsrich
Anybody reminded of Captain Flashheart while reading this?

------
btrettel
When I think of "normalization of deviance" I focus more on mundane things
like speeding.

I'm a cyclist. People often ask me why cyclists break the law so frequently.
The studies I've looked at don't suggest cyclists break the law any more
frequently than drivers do, but they do break different laws. (To my knowledge
I follow every applicable traffic law.) When I ask the driver I'm speaking
with whether they speed, I frequently get a justification of speeding as if
it's perfectly safe because "everyone does it".

In terms of problems people should focus on, cyclists tend to break laws in
ways that mostly harm only themselves (e.g., running a red light), while
drivers break laws in ways that tend to harm others in addition to themselves
(e.g., speeding, failure to yield, etc.). The fact that drivers vastly
outnumber cyclists also makes the priorities even more clear.

~~~
GordonS
> cyclists tend to break laws in ways that mostly harm only themselves (e.g.,
> running a red light)

Running a red light could easily harm others - a driver could swerve to avoid
you, crashing in the process; if a driver hits you because you ran a light,
that could also cause a chain of accidents; and if a driver killed you because
you ran a red light, that could seriously harm their wellbeing.

~~~
btrettel
> if a driver killed you because you ran a red light, that could seriously
> harm their wellbeing

While there are people who would be seriously bothered if they inadvertently
contributed to a cyclist's death, I think the number of such people is smaller
than the number of people who claim they would be bothered.

One only needs to look at their revealed preferences to show this. Given the
choice between waiting a minute or two (estimated 95th percentile; the median
is likely around 15 seconds) to safely pass a cyclist, many drivers instead
choose to pass a cyclist very closely and dangerously. If they actually would
be bothered if they killed a cyclist, why do they do this? Apparently my life
is not worth a minute of someone's time.

Of course, the bad drivers could claim that they didn't know passing a cyclist
closely was dangerous, but in my experience few drivers claim that. I've
spoken to many drivers when stopped at stop lights, trying to understand their
perspective. Often they imply that I _deserve_ to be passed dangerously
because I broke some rules of the road. So they can't claim ignorance.

(The "rules" I broke are not actually rules. For example, I've had people tell
me that I was going far below the speed limit when I was going 17-18 mph in a
15 mph zone. I intentionally picked that road because of the low speed limit,
yet some people want to drive 40+ mph there and apparently that makes me a
jerk who deserves to be run over.)

~~~
nitrogen
You cannot use driver behavior in the absence of an accident to infer the
emotional impact of killing someone...

~~~
btrettel
Why? These drivers often show no remorse for their actions, as I stated. I
don't doubt I'd be a victim of a hit-and-run if any of those drivers actually
did cause a collision.

I've seen arguments that people are "different" when driving, so perhaps
they're a jerk when behind the wheel but a normal person otherwise. I haven't
seen any clear evidence for this but it would indicate that the emotional
impact would be different.

~~~
RHSeeger
> I don't doubt I'd be a victim of a hit-and-run if any of those drivers
> actually did cause a collision.

The only way I can read this is that you believe, of the people that cause a
collision with a bicycle (accidentally or otherwise), all (or almost all) of
them would just leave you to die on the side of the road.

That is a staggeringly low opinion of humanity.

~~~
btrettel
The low opinion is not of humanity, rather, certain drivers who are
particularly dangerous.

I didn't have all drivers who'd be in a collision with a cyclist in mind, just
a large fraction of those who deliberately pass cyclists closely. My wording
was too strong.

Hit-and-run crashes account for roughly 12% of all crashes in the US, so they
may be more common than you believe:
[https://aaafoundation.org/hit-and-run-crashes-prevalence-con...](https://aaafoundation.org/hit-and-run-crashes-prevalence-contributing-factors-and-countermeasures/)

------
empath75
Climate change is probably another example of this - except it's the whole
planet instead of a plane.

