Hacker News

Based entirely on the headline: doesn’t every crash prevent most future crashes that would have had the same cause?



Yep, I always loved this line from Antifragile:

> “But recall that this chapter is about layering, units, hierarchies, fractal structure, and the difference between the interest of a unit and those of its subunits. So it is often the mistakes of others that benefit the rest of us—and, sadly, not them. We saw that stressors are information, in the right context. For the antifragile, harm from errors should be less than the benefits. We are talking about some, not all, errors, of course; those that do not destroy a system help prevent larger calamities. The engineer and historian of engineering Henry Petroski presents a very elegant point. Had the Titanic not had that famous accident, as fatal as it was, we would have kept building larger and larger ocean liners and the next disaster would have been even more tragic. So the people who perished were sacrificed for the greater good; they unarguably saved more lives than were lost. The story of the Titanic illustrates the difference between gains for the system and harm to some of its individual parts. The same can be said of the debacle of Fukushima: one can safely say that it made us aware of the problem with nuclear reactors (and small probabilities) and prevented larger catastrophes. (Note that the errors of naive stress testing and reliance on risk models were quite obvious at the time; as with the economic crisis, nobody wanted to listen.)”



Another interesting argument along the same lines is that if we hadn't bombed Hiroshima and Nagasaki, there would have been no reason for Truman to stop MacArthur from using bombs a hundred times worse in the Korean conflict.

It's a sobering thought regardless of one's opinion on the atomic bombings in Japan. The lesson was going to be learned one way or the other, and arguably humanity got off easy.


I have never heard that one -- but it makes sense.


Only if they’re actually investigated and the lessons learned are translated into real changes. Fortunately, commercial aviation is really good about that.


In this case, however, the problems were already recognized. What this crash did was to provide the impetus to stop arguing and finally do something effective about them.

In "Fate is the Hunter", Gann lamented that the airlines had to be forced by regulation to adopt as straightforward a safety measure as the rotating beacon (flashing red light), a change that pilots of the day had to lobby for.


Ernest Gann's Fate is the Hunter is a must-read for aviation buffs that flies under the radar in discussions of the great books on flying.

I worked in the business and I'd never heard of it until a hardware engineer (with WDC) told me I had to read it. He was right.

It's a nail-biting, seat-of-the-pants diary of the author's career during the early days of commercial aviation. So good.

Highly Recommended.

(me: aircraft dispatcher in my previous life)


The article opens by explaining that the cause of this crash wasn't ever determined -- rather, what matters is the changes in safety regulations that resulted.


This reminds me of the book "Black Box Thinking" that basically says it's more important to learn from mistakes so we don't repeat them than to assign blame. The commercial aviation industry does this better than any other industry, even medicine.


Yes. Blame is poisonous. Humans hate blame: they'll deflect it onto the innocent, and they'll lie and cheat to avoid it. The correct focus is prevention of future harm. Not "who is a bad person?" but "what will we do differently next time to avoid this outcome?".

In that frame, the driver suddenly stops insisting they weren't too drunk to drive and says that next time they'll take a cab. The policeman stops saying the suspect "wasn't complying" and agrees that they need training in how to de-escalate situations.

Medicine does have tools for this: the M&M (morbidity and mortality) conference, where medics discuss why somebody died or had a bad outcome and how to do better next time, is under-used.

Humans are fallible; blaming a specific human makes us feel better but does not prevent the same thing from happening again.


I’m completely with you on this and have been implementing this in practice. Sometimes, however, I still meet resistance. Do you have any source/literature that underpins this method?


A nice book explaining some of the techniques used in aviation, and how to transfer them to fields such as medicine, is Atul Gawande's The Checklist Manifesto.


... Black Box Thinking, by Matthew Syed.


Blame isn't the only problem. Humans also hate change. Eliminating blame reduces the resistance to implementing better policies but does not completely resolve it.


Aviation accident investigations always assign blame when it can be determined. In this particular case [1]:

> The National Transportation Safety Board determines that the probable cause of this accident was the captain’s inappropriate response to the activation of the stick shaker, which led to an aerodynamic stall from which the airplane did not recover. Contributing to the accident were (1) the flight crew’s failure to monitor airspeed in relation to the rising position of the low-speed cue, (2) the flight crew’s failure to adhere to sterile cockpit procedures, (3) the captain’s failure to effectively manage the flight, and (4) Colgan Air’s inadequate procedures for airspeed selection and management during approaches in icing conditions.

[1] https://www.ntsb.gov/investigations/AccidentReports/Reports/...


The primary purpose of the NTSB is to prevent future accidents. Any "blame" they assign is in furtherance of that purpose, not punishment. So critical is this to the NTSB's mission that any conclusion the agency draws as to the cause of an accident cannot be submitted as evidence of civil liability[1] in court and every NTSB report includes a footnote to that effect. You'll find it on page 4 of the document you've linked.

[1] The report would be hearsay in a criminal case and admissible only if the government demonstrates an exception to the rule against hearsay evidence.


Whoa, slow down there.

The NTSB tries to determine "probable cause". That's a far cry from assigning blame. In particular, on every NTSB report there's a big box specifically saying (my emphasis):

> The NTSB does not assign fault or blame for an accident or incident; rather, as specified by NTSB regulation, “accident/incident investigations are fact-finding proceedings with no formal issues and no adverse parties ... and are not conducted for the purpose of determining the rights or liabilities of any person.” 49 C.F.R. § 831.4. Assignment of fault or legal liability is not relevant to the NTSB’s statutory mission to improve transportation safety by investigating accidents and incidents and issuing safety recommendations.

Furthermore, to "ensure that Safety Board investigations focus only on improving transportation safety, the Board's analysis of factual information and its determination of probable cause cannot be entered as evidence in a court of law."


from the article: "By any measure, the safety record since the crash is unprecedented. Out of more than 90 million flight departures on U.S. airlines carrying billions of passengers since then, there has been just the single death: A woman died on April 17 on a Southwest Airlines Co. flight near Philadelphia when an engine failed, sending debris into a window next to where she was sitting."


Like our chief pilot used to say: "All regulations are written by dead people".


"Written in blood" is the way I heard it, but yes. Agreed.


He would often add: "So do not rush to be the reason for the next one ...".




Applications are open for YC Summer 2019
