
What can doctors learn from pilots and cyclists? - Silhouette
http://www.bbc.co.uk/news/health-35929557
======
paol
> [Aviation] professionals are given every reason to cooperate, because they
> know that investigations are not about finding scapegoats, but learning
> lessons. Indeed, professionals are given a legal safe space so they speak up
> honestly(...)

This is probably the single most important thing that aviation got right and I
don't think people understand that well enough, in general.

The attitude in the air industry, when something goes wrong, is 1) understand
what happened; 2) figure out what needs to be changed so it doesn't happen
again. There is no step 3. There _especially_ isn't a step of "punish those
responsible" or some such nonsense.

Only the final goal of improving outcomes matters, and indulging the emotional
human demand for punishment does not necessarily help that goal; it can even
seriously hinder it. If only we could apply the same rationality to, say, the
penal system.

~~~
ksdale
I agree completely that we tend to focus too much on punishment rather than
preventing future harm. That said, one thing that makes the airline industry
unique is the binary nature of problems. If something goes wrong and the plane
doesn't crash, then there's no harm to passengers so it's easier to forgo
punishment. If something goes wrong and the plane crashes, there's no one left
alive to punish. Add to that the fact that there is no plausible nefarious
motive behind a pilot's mistake. A pilot can crash a plane on purpose, but if
they do something stupid and the plane doesn't crash, they're likely to be
believed when they say it was just a stupid mistake. Compare that to a doctor,
who doesn't suffer when a patient suffers (at least not physically) and in
fact may even save money from certain oversights. It's much easier to assume
bad intent where mistakes are profitable rather than fatal.

~~~
wnevets
Pilots aren't the only ones who can make a deadly mistake in aviation. If a
traffic controller or maintenance personnel makes a mistake, are they
punished?

~~~
Retric
Sometimes, it depends on who notices the mistake and why.

Consider: a safety-first culture can easily penalize someone for showing up to
work drunk _IF_ someone notices before a problem shows up. But if nobody
notices until after the fact, then they should be free to admit it without
repercussions. Because there were two issues: they were drunk _AND_ nobody
noticed. You can't fix the second issue if you don't know it's a problem.

------
ZeroGravitas
A very good, in-depth look at this is here:

[http://www.newstatesman.com/2014/05/how-mistakes-can-save-li...](http://www.newstatesman.com/2014/05/how-mistakes-can-save-lives)

A story of how a pilot's wife died in surgery and how he campaigns to have
doctors follow the example of airline crash investigation.

Particularly interesting/infuriating are the bits about how often someone in
the room knew that someone was going to die, or get the wrong leg amputated,
but status games prevented them from intervening.

------
mryan
> Only through proper investigation was it discovered that a key factor was
> clinicians failing to put sterile dressings over the catheter site; the
> medical equivalent of not using antibacterial hand gel.

> The introduction of a five-point checklist - a marginal change - saved 1,500
> lives over 18 months in the state of Michigan alone.

If you find this kind of thing interesting I can recommend The Checklist
Manifesto. The book describes many other situations where the use of
checklists has had similar benefits (particularly in health and aviation).

~~~
mikestew
> The book describes many other situations where the use of checklists has had
> similar benefits (particularly in health and aviation).

However, and I feel that this is a point often missed about this book: a large
part of the improvements in process had little to do with checklists. "Wha-
wha-wha-whaaaat? 'Checklist' is in the title!" Checklists don't do you any
good if you're dealing with doctors who are way too full of themselves and
don't feel the need to listen to the little people like nurses.

No, the key element of that book was that nurses were empowered to knock
doctors off their high horses if the checklists weren't followed. The book
isn't about checklists. The book is about:

1\. Develop a framework of practices for your tasks at hand. Checklists are
often used for this. For software, think release management. Tests pass, UA
signed off, etc.

2\. #1 won't make a damned bit of difference until, from top to bottom, anyone
can call out anyone else if shortcuts are taken. You're the director of the
hospital? I don't care, you still need to wash your hands before touching that
patient. The VP of marketing wants us to skip those manual tests to save a
day? Screw you, it's on the list and we're doing it.

The book isn't about checklists, it's about empowerment to make improvements.
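That "no overrides, no matter your rank" gate can be sketched in a few lines. This is purely illustrative: the `ReleaseChecklist` class and the item names are made up for the sketch, not taken from the book or any real tool; the point is simply that the gate has no parameter for seniority.

```python
# Hypothetical sketch of a release checklist gate: every item must be
# checked off, and there is deliberately no way to waive one by rank.

class ChecklistError(Exception):
    pass

class ReleaseChecklist:
    def __init__(self, items):
        # Map each checklist item to "not yet done".
        self.items = {item: False for item in items}

    def check_off(self, item):
        if item not in self.items:
            raise ChecklistError(f"unknown item: {item}")
        self.items[item] = True

    def ready_to_release(self):
        # The gate only opens when *every* item is done.
        return all(self.items.values())

checklist = ReleaseChecklist([
    "tests pass",
    "UA signed off",
    "manual tests run",
])
checklist.check_off("tests pass")
checklist.check_off("UA signed off")
assert not checklist.ready_to_release()  # manual tests still on the list
checklist.check_off("manual tests run")
assert checklist.ready_to_release()
```

The design choice that matters is what's absent: `ready_to_release()` takes no "who is asking" argument, so the director of the hospital and the VP of marketing get the same answer as everyone else.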

~~~
dagurp
I haven't read the book, but what you're describing is what's called Crew
Resource Management [1] in aviation. It's especially valuable in cultures
where second-guessing your superior is career suicide. CRM stresses that the
captain and first officers are a team, and prescribes how they should
communicate.

[1]
[https://en.wikipedia.org/wiki/Crew_resource_management](https://en.wikipedia.org/wiki/Crew_resource_management)

~~~
mikestew
It's been a few years since I've read the book, but even if the book didn't
specifically mention this in relation to aviation, I recall that aviation was
at least mentioned as an example.

------
nols
Here is an article about medical checklists that is really eye opening:

[http://www.newyorker.com/magazine/2007/12/10/the-checklist](http://www.newyorker.com/magazine/2007/12/10/the-checklist)

~~~
mariodiana
The most eye opening part about that piece was a letter to the editor, from
some bigwig doctor-administrator at a major hospital, that appeared an issue
or two later. He thought that the whole idea of checklists was ridiculous,
because all that is needed is for everyone to just do what they're supposed to
do.

Seriously.

I'll try to find the letter. (I'm a subscriber and can access back issues
online.) To me, his attitude illustrates the biggest problem of the medical
profession. How could someone, who we have to suppose is an intelligent human
being, miss the entire point of the article? In my opinion, only arrogance
explains it.

~~~
mariodiana
Okay, the letter to the editor is from the January 21 issue, a few weeks
later. Here's the part I take issue with:

"While a retrospective checklist has great value […] a prospective system of
protocols, memorized by all the staff and made part of ongoing care, is more
likely to influence patient outcomes."

You see that part about "memorized by all the staff"? One of the major points
of the checklist piece was that simply going by memorized protocols, whether
you're going to pilot a plane or perform surgery, just does not cut it. The
human animal is prone to failing to follow complicated, extensive protocols
from memory.

The doctor who wrote the letter is a professor emeritus at a medical school. A
screen shot of the letter is here:

[http://i.imgur.com/0DqhvY0.png](http://i.imgur.com/0DqhvY0.png)

~~~
lutorm
The phenomenon you're talking about is called an "interrupted checklist" in
aviation. Humans are actually quite good at following memorized lists _unless_
they get interrupted mid-task. When that happens, the risk of a mistake goes
up significantly, because your mind has already "checked off" the task you
were in the process of completing when you got interrupted.

See for example:
[http://flighttraining.aopa.org/magazine/2002/October/200210_...](http://flighttraining.aopa.org/magazine/2002/October/200210_Instructor_Report_Interrupted_Checklist_.html)

The psychology behind this is also the reason for the "sterile cockpit"
procedures during critical phases of flight.
[https://en.wikipedia.org/wiki/Sterile_Cockpit_Rule](https://en.wikipedia.org/wiki/Sterile_Cockpit_Rule)

~~~
petra
They have implemented a form of "sterile cockpit" in some places when nurses
prepare medications, by having them wear special vests that signal nobody
should interrupt them, and (I think) by doing it in a special quiet room.

------
Kaedon
In a similar vein, Andrew Godwin gave an excellent talk at PyCon last year
called "What can programmers learn from pilots?"
[https://www.youtube.com/watch?v=we4G_X91e5w](https://www.youtube.com/watch?v=we4G_X91e5w)

------
Silhouette
I suspect a lot of software developers would benefit from a similar approach
when it comes to improving quality and, in particular, improving security
practices. There seems to be a striking dichotomy in our industry today
between those who welcome responsible disclosure or even actively encourage it
with things like bounty programs, and those who consider anyone knowing a
potential vulnerability in their systems a danger and will treat even a
genuine attempt to help through responsible disclosure as some sort of threat
to be handled by lawyers or police. I'm fairly sure about which group
generally makes more secure software. However, it would be a welcome cultural
shift if not only support for responsible disclosure but also publicly
acknowledging and discussing security issues were more common in our industry,
so others could learn from the same mistakes.

------
bencoder
Somewhat off topic, but as someone with an interest in aviation I enjoy
reading the aviation safety reports, like the AAIB reports [0], Airprox
reports [1] or the CHIRP newsletters [2]. (All UK-based.)

I try to find a similar 'tone of voice' when conducting or participating in
RCAs for incidents during my day job.

[0] [https://www.gov.uk/aaib-reports](https://www.gov.uk/aaib-reports)

[1] [https://www.airproxboard.org.uk/Reports-and-analysis/Monthly...](https://www.airproxboard.org.uk/Reports-and-analysis/Monthly-summaries/Monthly-Airprox-reviews/)

[2] [https://www.chirp.co.uk/](https://www.chirp.co.uk/)

~~~
sokoloff
> I try to find a similar 'tone of voice' when conducting or participating in
> RCAs for incidents during my day job.

Me as well, though as a pilot I find that the NTSB is a little too free, in
about 10% of cases, in assigning blame to the flight crew as the primary
cause. Yes, most crashes are crew-caused, but I've read many reports where the
crew gets the blame as primary rather than secondary.

In my day job, I've also brought in the concepts of FAR 91.3 and have
explicitly given that level of authority to the team responsible for running
our production systems.

FAR 91.3 reads:

Responsibility and authority of the pilot in command.

(a) The pilot in command of an aircraft is directly responsible for, and is
the final authority as to, the operation of that aircraft.

(b) In an in-flight emergency requiring immediate action, the pilot in command
may deviate from any rule of this part to the extent required to meet that
emergency.

(c) Each pilot in command who deviates from a rule under paragraph (b) of this
section shall, upon the request of the Administrator, send a written report of
that deviation to the Administrator.

The intent is to make clear to the team that they have the final say; that,
while I may make inquiries later as to why they made a certain deviation from
our norms, SOPs, or policies, they do have the right to deviate; and that all
I'm able to ask for is a report. Couple this with demonstrated blameless
treatment of post-mortems and you get a pretty good outcome out of good
people. We do want to know exactly WHO did WHAT, WHEN, HOW, and WHY, but we
use that to introspect and make the future better, not to punish.

------
j4ah4n
There's a book "The Checklist Manifesto" which describes this practice - a
great, fast read. [http://amzn.to/1SsqgLu](http://amzn.to/1SsqgLu)

------
nkrisc
From the article:

> Today, aviation is arguably the safest form of transportation. Last year the
> accident rate had dropped to a low of only four fatal crashes from a total
> of 37.6 million flights.

I'm curious, does anyone know how/from where that was derived? That sounds
pretty incredible.
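Taking the quoted figures at face value, the implied per-flight rate is a one-liner to sanity-check (the numbers below are simply the article's, not independently verified):

```python
# Back-of-the-envelope check of the article's quoted figures.
fatal_crashes = 4
flights = 37.6e6

rate_per_flight = fatal_crashes / flights    # fatal crashes per flight
flights_per_fatal = flights / fatal_crashes  # flights between fatal crashes

print(f"{rate_per_flight:.2e} fatal crashes per flight")     # 1.06e-07
print(f"one fatal crash per {flights_per_fatal:,.0f} flights")  # one per 9,400,000
```

So the claim works out to roughly one fatal crash per 9.4 million flights, which is indeed remarkable if the counts are right.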

~~~
mfoy_
Presumably the FAA would have records of all flights as you have to submit
flight plans to ATC. Then they just see how many crashes there were that
resulted in loss of life.

I mean, I don't know where I would personally pull those numbers from, but it
seems trivial for an agency to report on.

~~~
sokoloff
Those aren't the "aviation" numbers, but rather more likely the "scheduled
airlines" numbers.

The fatal accident count for all of aviation was well over 4 last year, and
there are no flight plans for even half of the general aviation flights.

There were 5 aviation fatals last month in the US:
[http://www.ntsb.gov/_layouts/ntsb.aviation/month.aspx](http://www.ntsb.gov/_layouts/ntsb.aviation/month.aspx)

For scheduled service, I only see one fatal accident last year in the US, and
that was a Cessna 207 (single engine, piston aircraft) that crashed in Alaska,
hardly relevant to scheduled airline service in transport category twin-jets.

[http://www.ntsb.gov/_layouts/ntsb.aviation/brief.aspx?ev_id=...](http://www.ntsb.gov/_layouts/ntsb.aviation/brief.aspx?ev_id=20150718X04523&key=1)
I suspect that most people would not consider this airline service, though it
was operating as scheduled service.

Non-US in 2015 had 4 fatal accidents on scheduled airline service, which is
presumably the numerator the article is citing:

GermanWings 9525 (suicidal pilot):
[https://en.wikipedia.org/wiki/Germanwings_Flight_9525](https://en.wikipedia.org/wiki/Germanwings_Flight_9525)

TransAsia 235 (pilot error subsequent to mechanical failure):
[https://en.wikipedia.org/wiki/TransAsia_Airways_Flight_235](https://en.wikipedia.org/wiki/TransAsia_Airways_Flight_235)

Trigana 257 (poor weather, unclear pilot contribution):
[https://en.wikipedia.org/wiki/Trigana_Air_Service_Flight_257](https://en.wikipedia.org/wiki/Trigana_Air_Service_Flight_257)

Metrojet 9268 (bomb):
[https://en.wikipedia.org/wiki/Metrojet_Flight_9268](https://en.wikipedia.org/wiki/Metrojet_Flight_9268)

------
mchahn
A checklist is a useful, if ancient, form of technological assistance. Maybe
some AI technique will make it easier in the future: a sort of assistant to
guarantee conformance.

------
pcl
This is directly applicable to production outage processes in our industry, as
well.

------
SixSigma
That red is the safest light.

