
The path to a disaster has been compared to a tunnel [0]. You can escape from the tunnel at many points, but you may not realize it.

Trying to find the 'real cause' is a fool's errand, because there are many places and ways to avoid the outcome.

I do take your meaning: reducing speed and following well-established rules would almost certainly have saved them.

0. PDF: http://www.leonardo-in-flight.nl/PDF/FieldGuide%20to%20Human...

Amazon: https://www.amazon.com/Field-Guide-Understanding-Human-Error...

I've never heard the tunnel analogy. Seems odd, since tunnels generally only have one entrance and one exit...

In a lot of fields where the stakes are high and mistakes are costly (aviation and emergency services are the two I'm most familiar with) the analogy used is a chain. Break any link in the chain and you prevent the event.


It's the perception tunnel. There are many branches, but only one route is followed. In hindsight, you can see all the branches and call out everything that went wrong.

Normally that is called "tunnel vision".


(figuratively) The tendency to focus one's attention on one specific idea or viewpoint, to the exclusion of everything else; a one-track mind.

> I've never heard the tunnel analogy. Seems odd, since tunnels generally only have one entrance and one exit...

Road and train tunnels have many emergency exits...

Generally they have a parallel tunnel (either for traffic in the other direction, or specifically for emergency egress), and connections between them. They still take you to the same general place.

I think the idea is that if you are stuck in a dark train tunnel and there's a train coming towards you, there may be doorways, recesses, etc in the walls that are invisible without the proper tools (flashlight, etc).

Urban tunnels have many regular entries and exits, as well as emergency exits, not to speak of emergency crossovers.

I prefer the swiss cheese model.


"In the Swiss Cheese model, an organisation's defenses against failure are modeled as a series of barriers, represented as slices of cheese. The holes in the slices represent weaknesses in individual parts of the system and are continually varying in size and position across the slices. The system produces failures when a hole in each slice momentarily aligns, permitting (in Reason's words) "a trajectory of accident opportunity", so that a hazard passes through holes in all of the slices, leading to a failure."

A series of minor failures that combine for a serious crisis seems very relatable.

Yup! My cousin & cousin-in-law work to maintain human health at some nuclear power plants in Canada. They use the swiss-cheese model, and whenever they notice that one layer missed something, they do a re-evaluation of that safety layer.

Aha! I sense an opportunity for an enterprising statistical physicist to model this with https://en.wikipedia.org/wiki/Percolation_theory :)
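The quip above can be made concrete with a toy simulation (purely illustrative; the function name and parameters are made up for this sketch, not taken from any real model): treat each slice as an independent defensive layer with some probability of having a hole at the hazard's trajectory, and a failure occurs only when holes align across every layer. The failure rate then falls off geometrically with the number of layers.

```python
import random

def failure_probability(p_hole=0.1, n_slices=4, trials=100_000, seed=0):
    """Monte Carlo estimate of the chance that a hazard passes
    through an aligned hole in every defensive layer (slice).

    Each slice independently has a hole at the hazard's path
    with probability p_hole, so the analytic answer for
    independent layers is simply p_hole ** n_slices.
    """
    rng = random.Random(seed)
    failures = sum(
        # A trial fails only if every slice has a hole lined up.
        all(rng.random() < p_hole for _ in range(n_slices))
        for _ in range(trials)
    )
    return failures / trials

# With 4 independent layers and a 10% hole rate per layer,
# the expected failure rate is 0.1 ** 4 = 1e-4.
print(failure_probability())
```

Real systems are worse than this sketch suggests, because holes in different layers are often correlated (shared culture, shared budget pressure), which is exactly where percolation-style models with correlated disorder would earn their keep.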

As we say when teaching new riders to ride a motorcycle, a crash is often an intersection of factors. Removing even one of those factors would likely have prevented it.

I've noticed a lot of learners (eg in YouTube videos) crash because they grab at the handlebars, grabbing the throttle and inadvertently accelerating.

Is there a reason the throttle can't be reversed so that grabbing would reduce throttle and slow the bike?

(FWIW, I'm a fully licensed motorcyclist, don't currently ride.)

Are you recommending the 2nd edition specifically? There is a 3rd edition available: https://www.amazon.com/Field-Guide-Understanding-Human-Error...

No, I just didn't notice (speaking of confusing UI, Amazon seems to do everything possible to move UI elements, small and large, on a basis so arbitrary as to seem completely random).
