
Your willfully ignorant (and I don't mean that in a crude, insulting way) responses here lead me to think there remain some very serious cultural issues within NASA.



I think you've read something I didn't write. What you call willful ignorance I call a realistic assessment and acceptance of the risks of pioneering space flight.

Nobody is forced into an orbiter - people BEG for the opportunity. We gave them that opportunity, working in good faith to the best of our ability. Sometimes it doesn't work out. Sometimes things break. Sometimes people screw up. We all know the risks.

You can sit on the porch with a near 100% safety record or you can give it a try. Your choice.

-----


Actually I have to say I am with "mistermann" on this... you make it sound like there is no other way. I can totally accept and understand that it cannot be all that safe to strap you to tons of rocket fuel, fire you into the oxygen-less, freezing depths of space, and then hope you somehow make it onto another planet AND then do the same stunt from there back to Earth. I get it, and I can also understand the trade-off between "making it 100% safe" and "otherwise we'd never get lift-off".

What I cannot understand is this: an unknown, unforeseen contingency is a completely different thing than an engineer pointing out "this WILL fail, it will blow up, and I have proof". There really should not be any excuse for ignoring a warning like that... yes, you cannot make it 100% safe, but you should at least aim to make it as safe as humanly possible given your current level of technology and knowledge. So, in my book, overriding an engineer saying "this WILL fail and it'll blow up" is actually negligent manslaughter. If I get into my car in the morning and don't care that the brakes aren't working even though my mechanic told me my brake lines were cut, what would you call that?

-----


While this sentiment is understandable, it's not justified unless you know how many times people said "this will fail" and it didn't. We only have definite data on this one statement, and you cannot conclude from that data alone (I'm not saying there isn't other data) that this was negligent. If every engineer who disagreed with something said "this thing is going to blow up", eventually one would be right. But you cannot then infer that that individual was any different from the others and that people should have known this. It's the "monkeys on typewriters" fallacy.
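
To put rough numbers on it (purely illustrative figures, nothing from the actual shuttle program), here is a minimal Python sketch of the base-rate point:

    # Hypothetical: 50 dire predictions made over the years, each pure
    # noise with a 5% chance of happening to coincide with a real failure.
    p_coincide = 0.05      # assumed chance a noise prediction "comes true"
    n_predictions = 50     # assumed number of dire predictions

    p_at_least_one = 1 - (1 - p_coincide) ** n_predictions
    print(f"P(some prediction looks prophetic) = {p_at_least_one:.2f}")
    # ~0.92: even if every warning were noise, one of them would very
    # likely look prophetic in hindsight.

So a single vindicated warning, taken alone, doesn't show the warner knew something the others didn't.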

-----


This is science and engineering, not statistics. It is not a numbers game, or "monkeys on typewriters", or a matter of how many bug reports we can file on the same issue to get said issue fixed!

At the end of the day, if even ONE person demonstrates scientific or engineering knowledge that shows a serious safety concern, why would you actively choose to ignore it? Period.

NASA management - whether by organisational process and/or personally identifiable decision-making - failed in their responsibilities in spectacular fashion!

-----


While I agree (especially with the last sentence), I would point out that the engineering behind these problems is rarely black and white, and hindsight tends to make it look more clear-cut than it was.

I do not believe Challenger would ever have launched if someone had known with 100% certainty that it would blow up. The trouble came in the judgment of that risk. In this case, from what I've read, they got it wrong - very wrong[1].

You can argue about how certain they have to be, or how negligent people were to ignore estimated failure probabilities of whatever magnitude. But it's not like someone says, "this will blow up 85% of the time, period. Make a call." It's more subtle, more complex, and less concrete than that.

1. Note that this is not equivalent to "if it blew up, they got it wrong." Sometimes a small, properly calculated risk blows up on you just because you're unlucky - which is different from a miscalculated risk blowing up on you.
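
For instance (illustrative numbers only, not actual shuttle estimates):

    # A small risk that was calculated correctly can still materialize.
    p_loss = 1 / 100    # assumed per-flight loss probability, taken as accurate
    flights = 25

    p_any_loss = 1 - (1 - p_loss) ** flights
    print(f"P(at least one loss in {flights} flights) = {p_any_loss:.2f}")
    # ~0.22: a loss under these numbers is bad luck, not a bad estimate.
    # Believing the risk is far smaller than it really is would be the
    # miscalculation case.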

-----


No hindsight was required to observe the following:

O-rings are supposed to seal on compression, not expansion.

As it is now, the O-rings are getting blown out of their tracks but still managing to seal the whole assembly quickly enough.

The above unplanned behavior, which is the only thing preventing a hull loss (and a crew loss, since there's no provision for escape), is sufficiently iffy that sooner or later we're likely to run out of luck.

(I'd also add about the Columbia loss that NASA had a "can't do" attitude towards the problem they observed of the foam hitting the wing. Hardly a "crew first" attitude.)

-----


That would be the "people screw up" part. Do you have a cure for that?

-----




