
If you implemented changes so the mistake is caught before it has disastrous consequences, you're already doing better. Well enough to let the 2nd one slide. Even the 3rd. After that, action seems reasonable. It's no longer a mistake; it's a pattern of faulty behavior.



That is a big IF. At some point it comes down to the error type, and whether it is a reasonable/honest mistake.

The situation is very different if the fuel cans are hard to distinguish vs. if the tech is lazy and falsifying their checklist.

Underlying any safety culture is one of integrity. No safety culture can tolerate a culture of apathy and indifference.


I expect there's precisely 1 safety culture that can tolerate a culture of apathy and indifference -- one in which no work is ever completed (without infinite headcount).

You apply risk mitigation and work verification to resolve safety issues.

Then you recursively repeat that to account for ineffective performance of the previous level of verification.

Ergo, end productivity per employee is directly proportional to integrity, as it allows you to relax that inefficient infinite (re-)verification.


Exactly! All this talk about man vs system misses the point that man is the system designer, operator, and component.

This is why Boeing can't just solve their situation with more process checks. From the reporting, they are already drowning in redundant quality systems and complexity. What failed was the human element.

Someone was gaming the system, saying the doors weren't "technically" removed because there was a shoelace (or whatever) holding them in place; quality assurance was asleep at the wheel; and management was rewarding those behaviors.

Plenty of blame to go around.



