While this sentiment is understandable, it's not justified unless you know how many times people said "this will fail" and it didn't. We only have definite data on this one statement. You cannot conclude from that data alone (I'm not saying there isn't other data) that this was negligence. If every engineer who disagreed with something said "this thing is going to blow up", eventually one would be right. But you cannot then infer that that individual was any different from the others, or that people should have known this. It's the "monkeys on typewriters" fallacy.
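
A minimal sketch of that base-rate point, with hypothetical numbers (a thousand uninformed warnings against projects that fail 2% of the time, independently of any warning):

    import random

    random.seed(0)

    # Hypothetical numbers: 1000 engineers each flag a random project as
    # doomed; projects actually fail 2% of the time, independent of the flag.
    n_warnings = 1000
    p_real_failure = 0.02
    vindicated = sum(random.random() < p_real_failure for _ in range(n_warnings))
    print(f"{vindicated} of {n_warnings} uninformative warnings were 'right'")
    # Expect ~20 -- and from this data alone, those 20 are indistinguishable
    # from genuinely informed warnings.

The warnings that happened to precede a failure prove nothing by themselves; you need the denominator.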

This is science and engineering, not statistics. It is not a numbers game, or "monkeys on typewriters", or a question of how many bug reports we can file on the same issue to get that issue fixed!

At the end of the day, if even ONE person demonstrates scientific or engineering knowledge that shows a serious safety concern, why would you actively choose to ignore it? Period.

NASA management - whether by organisational process and/or personally identifiable decision-making - failed in their responsibilities in spectacular fashion!


While I agree (especially with the last sentence), I would point out that the engineering behind these problems is rarely black and white, and hindsight tends to make it look more so than it is.

I do not believe that if someone had known with 100% certainty that Challenger would blow up, it would ever have launched. The trouble came in the judgment of that risk. In this case, from what I've read, they got it wrong - very wrong[1].

You can argue about how certain they have to be, or how negligent people were to ignore estimated failure probabilities of whatever magnitude. But it's not like someone says, "this will blow up 85% of the time, period. Make a call." It's more subtle, complex, and less concrete than that.

1. Note that this is not equivalent to "if it blew up, they got it wrong". Sometimes the small, properly calculated risk blows up on you just because you're unlucky - which is different from a miscalculated risk blowing up on you.
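
To put rough numbers on how much the risk estimate matters: Feynman's appendix to the Rogers Commission report noted per-flight loss estimates ranging from roughly 1 in 100 (working engineers) to 1 in 100,000 (management). A quick sketch of what each implies over a run of flights (the flight counts are illustrative):

    # P(at least one loss in n flights) = 1 - (1 - p)**n
    for p in (1 / 100, 1 / 100_000):   # rough engineer vs. management estimates
        for n in (25, 100):            # illustrative flight counts
            print(f"p={p:g}, n={n}: P(at least one loss) = {1 - (1 - p) ** n:.4f}")

At 1 in 100 per flight, the chance of a loss somewhere in the first 25 flights is already about 22%; at 1 in 100,000 it is negligible. Which estimate you believe completely changes the call.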


No hindsight was required to observe the following:

O-rings are supposed to seal on compression, not expansion.

As it is now, the O-rings are getting blown out of their tracks but still managing to seal the whole assembly quickly enough.

The above unplanned behavior, which is the only thing preventing a hull loss (and a crew loss, since there's no provision for escape), is sufficiently iffy that sooner or later we're likely to run out of luck.

(I'd also add, about the Columbia loss, that NASA had a "can't do" attitude towards the observed problem of foam striking the wing. Hardly a "crew first" attitude.)

