What would really be interesting is why he "failed to make his case" according to executives.
Due to the short timescale they had to build their presentation, they reused material from existing presentations. Unfortunately, the same material had previously been used to demonstrate why their O-rings were safe. The NASA people basically said, 'Last time you showed me this graph it meant things were safe; this time it means things are dangerous. What gives?'
The decision makers were looking for excuses to move forward, and that gave them an excuse to ignore the warnings.
It just goes to show you that even when everyone is paying attention, things still go wrong. Some people heard him and made the call that it was still safe. They were wrong. That, unfortunately, is the state of the art today (or it was back in the '80s). The alternative is to stay on the ground.
But yes, shit does happen, and nobody climbing aboard the orbiter is under any illusion that it is a safe thing to do.
Nobody is forced into an orbiter - people BEG for the opportunity. We gave them that opportunity, working in good faith to the best of our ability. Sometimes it doesn't work out. Sometimes things break. Sometimes people screw up. We all know the risks.
You can sit on the porch with a near 100% safety record or you can give it a try. Your choice.
What I cannot understand is this: an unknown, unforeseen contingency is a completely different thing from an engineer pointing out "this WILL fail, it will blow up, and I have proof." There really should not be any excuse for ignoring a warning like that. Yes, you cannot make it 100% safe, but you should at least aim to make it as safe as humanly possible given your current level of technology and knowledge. So, in my book, overriding an engineer saying "this WILL fail and it'll blow up" is negligent manslaughter. If I get into my car in the morning and don't care that the brakes aren't working, even though my mechanic told me my brake lines were cut, what would you call that?
At the end of the day, if even ONE person presents scientific or engineering evidence of a serious safety concern, why would you actively choose to ignore it? Period.
NASA management - whether by organisational process and/or personally identifiable decision making - failed in their responsibilities in spectacular fashion!
I do not believe Challenger would ever have launched if someone had known with 100% certainty that it would blow up. The trouble came in the judgment of that risk. In this case, from what I've read, they got it wrong - very wrong.[1]
You can argue about how certain they have to be, or how negligent people were to ignore estimated failure probabilities of whatever magnitude. But it's not like someone says, "this will blow up 85% of the time, period. Make a call." It's more subtle, complex, and less concrete than that.
1. Note that this is not equivalent to "if it blew up, they got it wrong." Sometimes the small, properly calculated risk blows up on you just because you're unlucky - which is different from a miscalculated risk blowing up on you.
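To make that distinction concrete, here's a minimal sketch (mine, not the commenter's) of how even an honestly estimated small per-flight risk compounds over a program. The specific probabilities are illustrative, loosely echoing the roughly 1-in-100 engineer estimates versus roughly 1-in-100,000 management estimates Feynman reported in his Rogers Commission appendix:

    # Sketch: chance of at least one loss over a program, assuming each
    # flight is an independent trial with the same failure probability.
    # The figures below are illustrative, not official NASA numbers.

    def prob_at_least_one_loss(p_per_flight: float, n_flights: int) -> float:
        """P(at least one loss) = 1 - P(no loss on any flight)."""
        return 1.0 - (1.0 - p_per_flight) ** n_flights

    for label, p in [("~1/100 per flight", 1 / 100),
                     ("~1/100,000 per flight", 1 / 100_000)]:
        print(f"{label}: P(loss in 25 flights) = "
              f"{prob_at_least_one_loss(p, 25):.2%}")

At 1-in-100 per flight, about a 22% chance of a loss somewhere in 25 flights; at 1-in-100,000, about 0.02%. A disaster under the first estimate can be "bad luck on a known risk"; under the second, it's strong evidence the risk was miscalculated.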
O-rings are supposed to seal on compression, not expansion.
As it is now, the O-rings are getting blown out of their tracks but still managing to seal the whole assembly quickly enough.
The above unplanned behavior, which is the only thing preventing a hull loss (and a crew loss, since there's no provision for escape), is sufficiently iffy that sooner or later we're likely to run out of luck.
(I'd also add, about the Columbia loss, that NASA had a "can't do" attitude towards the problem they observed of the foam hitting the wing. Hardly a "crew first" attitude.)
When I visit a NASA center, there are posters up all over saying "If it's not safe, say so." Part of the reason for the 2-year grounding of the Shuttle fleet, post-Challenger, was to put in place a stronger culture of safety at NASA.
A commenter mentioned the false dichotomy of "engineers vs. managers". It's a hard call, as an engineer, to disappoint a manager (or a whole line of managers, all the way up) by halting things over a possible problem. Civil engineers may be more used to this sort of accountability.