
I spent 6 years as a structural engineer on space shuttle flights. A few thoughts:

While I don't know the people involved with Challenger - I was in 6th grade at the time - the idea that NASA management had anything but the interests of the crew in mind runs strongly against my own experience. To a fault. In fact, your average NASA employee doted on astronauts like a star-struck little girl. What the crew wanted, the crew got. You could always tell who the astronaut was when you saw a group walking about the centers - he/she was the one whose every semi-whimsical comment extracted voluminous and polite laughter from the others in the group.

I was, however, working when Columbia blew up. In fact, my mission was supposed to fly on it when it got back. Although sad, I feel comfortable saying that most of the people working on these things sort of know it's going to happen from time to time. It wasn't exactly surprising to us or the crew.

Blaming managers and celebrating engineers is overly simplistic. The line is not as well defined as you might think. I had few - if any (I can't think of a single one, actually) - managers (either contractors or NASA employees) who were not experienced engineers.

The safety rules for a shuttle payload, let alone the actual orbiter, are voluminous and arcane. It is the primary reason that very little new technology comes out of the manned space flight program. Everything new is considered too dangerous because it hasn't been flown before.

This stuff is insanely dangerous. It is pretty damn easy to come up with a way some piece of hardware you're working on could kill someone. The complexity is enormous. The number of people involved is in the thousands, and they're spread all over the country. Different centers have different rules.

As a result, you make life-and-death decisions literally every day. It's not such a big deal, because there is a lot of formal process in place to make sure it gets done right. The "standards" are what keep space flight as we know it as safe as it is. Are they or the processes by which they are enforced perfect? Hell no.

The system failed. People failed. But we knew this would happen, and we did it anyway because it's the price of exploring the frontiers. We learned from Challenger. We learned from Columbia. We will learn from the next catastrophic failure. NASA isn't perfect. In fact, you might say the bloated organization and government involvement makes this sort of thing inevitable. But I bet the small privateers exploring manned space flight will run into their own challenges.

Basically, what I'm saying is that we need to keep this in a larger perspective. Obsessing over one failure in what is a centuries-long quest is not helpful. Dissect it, learn from it, and move on.




I took a professional ethics class in college, and the professor was a personal friend of Roger Boisjoly. The whole class was about Challenger, and the incredible failure of judgement around its demise. Roger came into one of our classes and spoke to us late in the semester.

After listening to tapes of the trials, interviews, reading transcripts, and reading articles it was very apparent to me that this was a failure of management. The lead engineer, during the discussions of whether to launch the night before, was arguing that the engineering evidence did not support a launch under the temperature conditions projected for the following morning. He was told by Morton-Thiokol business reps to "take off your engineer hat, and put on your manager hat".

Evidence points to this failure happening because NASA needed a PR boost for funding, and M.T. wanted to continue doing business with them delivering solid rocket boosters.

Because Roger Boisjoly spoke to Congress during the hearings, he was blacklisted from his industry. At no point during the decisions leading up to that disaster did the good engineering practices that could have prevented the destruction come into play.


Thanks for your perspective.

> The system failed. People failed. But we knew this would happen, and we did it anyway because it's the price of exploring the frontiers. ... you might say the bloated organization and government involvement makes this sort of thing inevitable. But I bet the small privateers exploring manned space flight will run into their own challenges.

I bet they will do better, and go farther. They will know a disaster like that will likely ruin their company, so they will make damn sure that the communication process between managers and engineers doesn't break down, and that the process complexity is kept in check.

You're describing (and excusing) a bloated and dysfunctional system that sprang up around the need to manage the complexity of the space shuttle. People tried to fix the organization after Challenger, but the fact that Linda Ham stopped the request for imagery as described by CAIB shows that they failed or it reverted. And as your attitude shows, there is a bit of a fatalist perspective ("bloated organization and government involvement"). In the long term, it has to be fixed, or stuff will keep blowing up.


I'm not excusing it. And my hope is that you are correct on the smaller, leaner private launches. (I suspect, however, that they will kill some people too, and sadly, those people may not be as aware of the risks as astronauts are).

Some day, NASA will likely be seen as strange and primitive. That insanely high risk of death is the current state of the art, however. It will get worked out of the system over a very long period of time - but only if we don't lose the nerve to launch these things because someone might die.

How many people died crossing oceans back in the day because someone screwed up? Shit - they still die on ships, and in cars and in planes. Space ships are going to blow up, crash, and fail. It's just life.


It's interesting that you mention planes. Boeing and Airbus (with the FAA's help, I guess) have figured out a way to build astonishingly safe planes. Yes, their systems are simpler and get a lot more use, but the fact that there has been exactly one hull loss for the 777, with no fatalities, across 20 years and 1,000 planes might also be telling us something about them getting the organization right.


A safe 777 is hard to build, but a safe orbiter is an order of magnitude harder, with respect to the range of velocity, temperature, and pressure that the vehicle endures.

It's not a totally fair comparison to say, "Boeing can do it, why can't NASA and their contractors?"

With current technology, it seems obvious that if we want to (further) develop manned spaceflight, we should launch unmanned orbiters with crash-test dummies and telepresence surrogates, and keep doing so until we've established a safety record.


> In fact, your average NASA employee doted on astronauts like a star-struck little girl. What the crew wanted, the crew got.

This is the question I've always had about the Challenger, but never heard addressed: How much of a factor was the crew's opinion considered to be? I strongly suspect that the astronauts themselves exerted informal but real pressure to prefer flying, since it would be their moment in the sun.


I can't opine on the crew's input to launch decisions because I didn't work in that area. I suspect that they have little say about it.

I did work on astronaut EVA training, however. In that case, the astronauts were king. No matter how silly the request, their wishes were catered to - but only in areas of usability, not safety. It turns out it's really difficult to connect wires in space, for example - small things matter a lot. I can remember at least one case where what they wanted was just plain dumb. But we did it anyway. Usually, though, they were pretty good about that sort of stuff.


You raise two main themes here: that NASA/space travel has accumulated a Byzantine set of rules, and that losing lives is an accepted cost of doing business.

Do you believe that we'd be better served by simpler (but potentially much more dangerous) craft flown by private industry? If not, why is the NASA approach better, seeing as how it is mired in both politics from without and bureaucracy/process from within?


Yes - I believe a loss of life is acceptable and inevitable. The state of the art is currently the bloated NASA system. My hope is that a leaner, private approach will be more effective and safer. I'm not sure I'm willing to make a prediction on whether that will be true or not. For one thing, NASA doesn't have the problem of profit to be concerned about.


I wish I could upvote your comments more than once. It's interesting that this forum that celebrates taking extreme (financial) risks in exchange for possibly great (financial) reward has a hard time with NASA scientists, engineers, and astronauts taking calculated risks. 100% safety is not a productive strategy.


Agreed.

Hell, even 20-50% safety isn't necessarily a productive strategy, if you pay too much for it.

I think the main trick is to reduce the unit cost/training investment in astronauts so that we can send up more, and to make very cheap vehicles to get them there, so that we won't have to worry about losing a big investment when (not if) something goes wrong. Putting hundreds of millions of dollars of trained meat into billions of dollars' worth of aerospace tech is not sustainable.

If it would get me to the moon with a 1 in 3 chance, hell, give me a banana and call me Albert VII.


Extreme financial risk might or might not kill people.

Space shuttles exploding will definitely kill people.

We're human. Fellow humans died in this pursuit of space exploration. Is it inevitable? Maybe. Probably.

Does it still hurt? Hell yes.


"Extreme financial risk might or might not kill people.

Space shuttles exploding will definitely kill people."

Don't make such a big jump from the first to the second. You could just as well have said, "Extreme space exploration risk might or might not kill people." The fact is, in both finance and space exploration, the risk has in rare cases led to the death of those involved (directly or indirectly). But those cases are rare and the rewards are great, so we press on.


Why would we allow a private craft to be more dangerous? They probably will be. But because they will soon be subject to even stricter rules, sneaky things will have to be done to compete in that environment (see the financial system).

When the person making the rules (the government) is no longer subject to the rules, but held responsible when they're broken, do you think they'll get looser or stricter?

And now not only does your company have to obey safety laws or whatever else applies, you've got insurance companies, shareholder lawsuits, and so on.

The Challenger/Columbia astronauts are (correctly) treated as heroes who sacrificed for the greater good. The Space Inc. astronauts who die will be considered victims of corporate negligence.


I have full confidence that SpaceX will have safety at the forefront of their manned spaceflight program. If they have a fatal mishap and lose the crew, the consequence is that they would effectively be out of business. They would go bankrupt.

Losing 7 lives is a tragedy. A CEO who knows that losing those lives will cost the company everything is going to pay more attention to safety than a government bureaucracy will.


After the Columbia disaster, I confess I was a bit shocked the heat shield was never examined in orbit to assess damage that could have occurred during launch. Would it be that hard to do?


I don't honestly know. I didn't work on the orbiter - I did the Hubble Space Telescope servicing mission payloads. I don't know what techniques they've figured out after the accident, but I'm not sure there was a reliable way to go out and look at the time. The shuttle's arm is only so long, and even if you could put an astronaut in position to look, the EVA time required would likely be prohibitive. You'd need some way to inspect the underside of the orbiter from a satellite, earth, or some other vantage point. I'm sure Google can tell us what they'd do today to help mitigate this kind of failure.


IIRC, two ways to inspect the orbiter's underside were developed - an inspection boom, equipped with visual and laser sensors, that could be attached to the end of the arm, and a maneuver procedure before docking with the ISS, in which the orbiter would make a full rotation, allowing it to be photographed from the station. During one of the first post-Columbia flights, a problem found this way - a spacer protruding from its proper position between two tiles - was removed during an EVA. This rotation maneuver was the most shocking to me, as it showed it never occurred to anyone just to rotate the shuttle so it could be inspected with a pair of binoculars. Such an inspection could have been conducted as early as STS-63.

In a sense, every shuttle was an X-plane - it was as much a research vehicle as a commercial transport to LEO. One builds shuttles to learn how to better build shuttles and, in order to do that, learn as much as possible.

BTW, I envy you (in a good way, of course) more than a little. Working for NASA is a really cool thing.


Going to extreme lengths, they could have examined the heat shield by repurposing spy satellites. But there was very little they could have done about it anyway.


My understanding is that the effort wasn't expended, on the understanding that the crew were doomed regardless of what information was gathered.

My view is that even if the crew were doomed, gathering additional information in advance of reentry would have allowed for a better understanding of circumstances, better post-disaster modeling of what went wrong and how the orbiter failed (both in the launch-time foam strike, and in the reentry heat-shield penetration and structural failure).

Whether or not to inform the crew is yet another decision. Astronauts are aware that theirs is a highly risky venture; with a low sample size the specific odds are somewhat uncertain, but they're on the order of 4 in 100 per human space flight. If you're going to go on a space mission, you'd better be prepared to die.

If NASA refrained from assessing strike damage on those grounds, I feel a grave error was committed.


An engineer started arranging spysat imaging through "back channels" but was struck down by management - he/she didn't have the guts to escalate it into an official thing.


Perhaps the culture changed because of the Challenger disaster... nothing like having the shuttle blow up to make you reevaluate your priorities for safety.


Different teams working on modules is a very different animal from one engineer saying "this item is going to blow up" for years and everyone obviously ignoring him... I find it particularly shocking how your answer suggests a strong "well, shit happens" attitude when clearly both the potential AND a strong reason to make things better were right there.

What would really be interesting is why he "failed to make his case" according to executives.


The engineering team had approximately 3 hours' notice prior to a teleconference at which they had to make a presentation. The telecon occurred the evening before launch, with a midnight deadline for the go/no-go decision to be made.

Due to the short timescale to build their presentation, they re-used info from existing presentations. Unfortunately, the same info had previously been used to demonstrate why their o-rings were safe. The NASA people basically said, 'last time you showed me this graph it meant things were safe, this time it means things are dangerous. What gives?'

The decision makers were looking for excuses to move forward, and that gave them an excuse to ignore the warnings.


What you're not seeing is that pretty much everything you work on has some risk of failure, and much of it could be catastrophic. Sorting through all of that is not easy. Yes, in this case, there were systemic and human failures.

It just goes to show you that even when everyone is paying attention, things still go wrong. Some people heard him and made the call that it was still safe. They were wrong. That, unfortunately, is the state of the art today (or it was back in the 80's). The alternative is to stay on the ground.

But yes, shit does happen, and nobody climbing aboard the orbiter is under any illusion that it is a safe thing to do.


If you've read Feynman's appendix to the report on the accident and investigation (http://www.ralentz.com/old/space/feynman-report.html), I don't see how you can possibly believe that the decisions made around the Challenger launch were made with the right process.


I have read it, and I do think there are defects in the processes. But what do you do about it? I'm sure there are many more unknown vulnerabilities in the orbiter that were never found, but you keep trying and fixing.


There is an obvious difference between "unknown vulnerabilities" and "known vulnerabilities."


Your willfully ignorant (and I don't mean that in a crude, insulting way) responses here lead me to think there remain some very serious cultural issues within NASA.


I think you've read something I didn't write. What you call willful ignorance I call a realistic assessment and acceptance of the risks of pioneering space flight.

Nobody is forced into an orbiter - people BEG for the opportunity. We gave them that opportunity, working in good faith to the best of our ability. Sometimes it doesn't work out. Sometimes things break. Sometimes people screw up. We all know the risks.

You can sit on the porch with a near 100% safety record or you can give it a try. Your choice.


Actually, I have to say I am with "mistermann" on this... you make it sound like there is no other way. I can totally accept and understand that it cannot be all that safe to sit you on tons of rocket fuel, fire you into the oxygen-less and freezing depths of space, and then hope you somehow make it onto another planet AND then do the same stunt from there back to earth. I get it, and I can also understand the trade-off between "making it 100% safe" and "otherwise we'd never get lift-off".

What I cannot understand is this: an unknown, unforeseen contingency is a completely different thing from an engineer pointing out "this WILL fail, it will blow up, and I have proof", and there really should not be any excuse for ignoring a warning like that... yes, you cannot make it 100% safe, but you should at least aim to make it as safe as humanly possible given your current level of technology and knowledge... so, in my book, overriding an engineer saying "this WILL fail and it'll blow up" is actually negligent manslaughter. When I get into my car in the morning and don't care that the brakes aren't working even though my mechanic told me my brake lines were cut, what would you call that?


While this sentiment is understandable, it's not justified unless you know how many times people said "this will fail" and it didn't. We only have definite data on this one statement. You cannot conclude from that data alone (I'm not saying there isn't other data) that this was negligent. If every engineer who disagreed with something said "this thing is going to blow up", eventually one would be right. But you cannot then infer that that individual was any different from the others and that people should have known this. It's the "monkeys on typewriters" fallacy.
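
A toy sketch of that base-rate point (every number below is invented purely for illustration, not real data from the shuttle program): if warnings are common and each one has only a small chance of pointing at the flaw that actually bites, then after an accident there will almost always be someone who "called it", so a single correct warning tells you little on its own.

    # Toy base-rate illustration - all figures are made up.
    # If many warnings are raised over a program's life and only a small
    # fraction identify the flaw that actually causes a failure, someone
    # will have "called it" before almost any accident, by chance alone.
    import random

    random.seed(0)
    n_warnings = 200    # hypothetical number of warnings raised over the program
    p_correct = 0.02    # hypothetical chance any one warning is the real flaw
    trials = 10_000

    hits = sum(
        any(random.random() < p_correct for _ in range(n_warnings))
        for _ in range(trials)
    )
    print(hits / trials)  # ~0.98: a "prophet" exists almost every time

That's why you need the denominator - how many warnings were raised and turned out fine - before calling any single override negligent.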


This is science and engineering, not statistics. It is not a numbers game or "monkeys on typewriters" or a question of how many bug reports we can file on the same issue to get said issue fixed!

At the end of the day, if even ONE person demonstrates scientific or engineering knowledge that shows a serious safety concern, then why would you actively choose to ignore it? Period.

NASA management - whether by organisational process and/or personally identifiable decision making - failed in their responsibilities in spectacular fashion!


While I agree (especially with the last sentence), I would point out that the engineering behind these problems is rarely black and white, and hindsight tends to make it look more so than it is.

I do not believe that if someone had known with 100% certainty that Challenger would blow up, it would ever have launched. The trouble came in the judgment of that risk. In this case, from what I've read, they got it wrong - very wrong[1].

You can argue about how certain they have to be, or how negligent people were to ignore estimated failure probabilities of whatever magnitude. But it's not like someone says, "this will blow up 85% of the time, period. Make a call." It's more subtle, complex, and less concrete than that.

1. Note that this is not equivalent to "if it blew up, they got it wrong." Sometimes the small, properly calculated risk blows up on you just because you're unlucky - which is different from a miscalculated risk blowing up on you.


No hindsight was required to observe the following:

O-rings are supposed to seal on compression, not expansion.

As it is now, the O-rings are getting blown out of their tracks but still managing to seal the whole assembly quickly enough.

The above unplanned behavior, which is the only thing preventing a hull loss (and a crew loss since there's no provision for escape) is sufficiently iffy that sooner or later we're likely to run out of luck.

(I'd also add about the Columbia loss that NASA had a "can't do" attitude towards the problem they observed of the foam hitting the wing. Hardly a "crew first" attitude.)


That would be the "people screw up" part. Do you have a cure for that?


You are leaving out data. The same engineers also, during that time, agreed with decisions stating that the problem had been fixed. Apparently, there was an established mechanism for any engineer working on the shuttle to file an official "bug report" that would then have required a thorough investigation. None of the engineers did; all concerns were voiced through informal channels.


Part of the conclusions of the Challenger post-accident report was that engineers were discouraged from filing such "official bug reports." Informal reports made in a briefing did not require investigation, so they were not discouraged in the same way.

When I visit a NASA center, there are posters up all over saying "If it's not safe, say so." Part of the reason for the 2-year grounding of the Shuttle fleet, post-Challenger, was to put in place a stronger culture of safety at NASA.

A commenter mentioned the false dichotomy of "engineers vs. managers". It's a hard call, as an engineer, to disappoint a manager (or a whole line of managers, all the way up) with a call to solve a possible problem. Civil engineers may be more used to this sort of accountability.



