
What We Can Learn From Aviation, Civil Engineering, Other Safety-critical Fields - pyb
http://danluu.com/wat/
======
jwmerrill
A lot of the problematic behavior described here can be seen as people
optimizing for the most common/most likely scenario while ignoring the
potentially very large costs of unlikely but possible scenarios.

Nassim Taleb has written extensively about this subject (and the converse,
where people fail to account for potentially very large rewards of unlikely
but possible scenarios):

[http://www.fooledbyrandomness.com/](http://www.fooledbyrandomness.com/)
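
To make the asymmetry concrete, here is a minimal sketch with entirely
hypothetical numbers, showing how a rare, severe outcome can dominate the
expected value of a shortcut that looks like a clear win in the common case:

    # Hypothetical numbers: a shortcut saves time in the 99% case but
    # carries a small chance of a severe failure.
    time_saved_days = 2        # benefit in the common case
    p_failure = 0.01           # probability of the rare bad outcome
    failure_cost_days = 300    # cost (rework, outage) if it happens

    # Expected net benefit of the shortcut, in days of work.
    expected_net = time_saved_days - p_failure * failure_cost_days
    print(f"Expected net benefit: {expected_net:+.2f} days")  # -1.00 days

    # Optimizing only for the common case sees "+2 days"; pricing in
    # the tail turns the shortcut into an expected loss.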

~~~
sopooneo
Relatedly, what about the idea that people gravitate toward their own
personal risk tolerance, which may not be aligned with the company's? For
instance, I might figure that a shortcut to get my component completed
carries a 1% risk of destroying the project and getting me fired. That's
acceptable to me, so I take it. But if multiple people in similar positions,
each building a critical component, make the same decision, the project
suddenly becomes far more likely to fail than any one of us intended.
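
As a rough illustration (a minimal sketch with made-up numbers, assuming
each person's 1% risk is independent of the others), those individually
acceptable risks compound quickly:

    p_individual = 0.01  # each person's tolerated risk of sinking the project

    for n_people in (1, 5, 10, 30, 70):
        # Project survives only if every shortcut avoids its failure mode.
        p_project_fails = 1 - (1 - p_individual) ** n_people
        print(f"{n_people:3d} people -> {p_project_fails:.1%} chance of failure")

    # 1 -> 1.0%, 10 -> 9.6%, 30 -> 26.0%, 70 -> 50.5%: no individual
    # decision looks reckless, but the aggregate does.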

------
brudgers
Related:
[https://news.ycombinator.com/item?id=10790961](https://news.ycombinator.com/item?id=10790961)

