
Roger Boisjoly dies at 73; engineer tried to halt Challenger launch - pwg
http://www.latimes.com/news/obituaries/la-me-roger-boisjoly-20120207,0,2248999.story
======
raphman
From the Rogers report [1]:

Engineers at Thiokol also were increasingly concerned about the problem. On
July 22, 1985, Roger Boisjoly of the structures section wrote a memorandum
predicting NASA might give the motor contract to a competitor or there might
be a flight failure if Thiokol did not come up with a timely solution.

Nine days later (July 31) Boisjoly wrote another memorandum titled "O-ring
Erosion/Potential Failure Criticality" to R. K. Lund, Thiokol's Vice President
of Engineering:

 _"The mistakenly accepted position on the joint problem was to fly without
fear of failure and to run a series of design evaluations which would
ultimately lead to a solution or at least a significant reduction of the
erosion problem. This position is now changed as a result of the [51-B] nozzle
joint erosion which eroded a secondary O-ring with the primary O-ring never
sealing. If the same scenario should occur in a field joint (and it could),
then it is a jump ball as to the success or failure of the joint because the
secondary O-ring cannot respond to the clevis opening rate and may not be
capable of pressurization. The result would be a catastrophe of the highest
order - loss of human life."_

Boisjoly recommended setting up a team to solve the O-ring problem, and
concluded by stating:

 _"It is my honest and very real fear that if we do not take immediate action
to dedicate a team to solve the problem, with the field joint having the
number one priority, then we stand in jeopardy of losing a flight along with
all the launch pad facilities."_

[1] <http://history.nasa.gov/rogersrep/v1ch6.htm>

~~~
lutorm
_Boisjoly recommended setting up a team to solve the O-ring problem_

And there was a team working on it; IIRC, a redesign of the joint was in
progress. But no one suggested, before the evening before the launch, that
launches should be stopped until the redesign was completed.

~~~
ScottBurson
No. But they did suggest that launching in unusually cold weather was a very
bad idea.

------
dmethvin
The fate of the Challenger engineers is depressing because they were
essentially kicked out of their profession for knowing the right and moral
answer. Thankfully, very few of us will be faced with such genuinely life-or-
death technical decisions.

I think there are parallels to right-or-wrong issues that we face in our own
industry on a regular basis, for example:

<http://johnnye.net/articles/ios-apps-want-your-contacts,-not-just-path.html>

Are phrases like "standard industry practice" and "covered by the click-
through license" today's weasel words for rationalizing the implementation of
immoral management demands?

~~~
rdtsc
> I think there are parallels to right-or-wrong issues that we face in our own
> industry on a regular basis, for example:

If our software deals with privacy and personal information then it can even
become life or death in some countries. A dissident in a brutal authoritarian
regime could lose his life if the software has a bug in it, for example.

The other area where moral choices take place is licensing and patents. These
can be on a large scale (as we see with big companies using BusyBox in their
platforms) or a smaller one (copying code from one project to another without
even giving credit).

~~~
bwarp
Fine historical example:

<http://en.wikipedia.org/wiki/IBM_and_the_Holocaust>

Please don't downvote me - I'm not invoking Godwin's Law here ;-)

~~~
rdtsc
Yeah, that is a good point. One can argue that technology is a tool and can
be used for evil. But with this stuff:

"without IBM's machinery, continuing UPKEEP and SERVICE, as well as the supply
of punch cards, whether located ON-SITE or off-site, Hitler's camps could have
never managed the numbers they did" [emphasis mine]

The idea is basically this: these were not pocket calculators that the Nazis
bought and then deployed in concentration camps, with IBM able to claim
complete ignorance of their use. These machines needed constant, professional
service. Having them on-site meant that IBM probably sent technicians there.
One has to wonder what those technicians' response to the 24/7 billowing
crematoria and emaciated bodies was.

However, on the other side (and I have not read the book, just looked at the
topic superficially many years ago), I am not aware of any IBM engineers'
stories about an actual site visit... So I don't know what to think or how
guilty IBM really is...

~~~
Symmetry
I don't know; I doubt that IBM per se was really in a position to know how the
census machines they had given to Germany were actually being used. Remember
that this was before the days of satellite communication, and as part of their
war strategy the British cut Germany's telegraph cables (just like they had in
WWI) and intercepted packets of mail between the US and Germany whenever they
could, and generally did a pretty good job of it[1]. I'm happy to believe that
people who were nominally IBM employees were servicing the machines, but if
IBM headquarters wasn't being told what was happening, and if the employees
faced getting shot if they didn't do what they were told, I'm not sure how
much you can say that IBM knew in any meaningful sense.

[1] This was a point of contention between the UK and US before the fall of
France.

------
cellularmitosis
While watching the new iPad UI course from CMU
(<http://www.cmu.edu/homepage/computing/2012/winter/ipad-course.shtml>), the
lecturer touched on what happened with the Challenger, using it as an example
for a point he was trying to make.

However, I was struck by how wrong he got it. He was making a point about how
just having the data available isn't enough - the modern challenge is to make
sense of big data in a meaningful way through visualization. And to back this
point up, he made it sound like a bunch of engineers didn't look closely at the
data sheet for the rubber used in their O-rings and, oops! The Challenger blew
up. Shrug. Who knew?

I'll tell you who knew. Roger. "The engineers" knew perfectly well what was
going to happen. It was management who refused to listen to them and launched
anyway.

Feynman's account of his involvement in investigating this disaster in _The
Pleasure of Finding Things Out_ was excellent. In one exercise, he had the
engineers and management write down what they honestly thought the failure
rate of the shuttle was. Management quoted 1 in 100,000, while engineering
quoted 1 in 100. The Challenger disaster was a symptom of a complete breakdown
in communication between engineering and management at NASA.

~~~
krschultz
One of my college mechanical engineering professors was a former NASA engineer
involved in determining risk and investigating problems. He gave a lecture and
led a discussion on the Challenger O-ring case (and I have since read a lot of
the reports). As you mentioned, the CMU lecturer wasn't even close to the real
story. But it also isn't so simple as to say 'management didn't listen'.

The engineers knew what was going to happen with a fairly high degree of
certainty - enough that the launch shouldn't have happened. Then they turned
to management and tried to present their findings. Clearly the business guys
didn't want to be the ones to hold up the launch; they had a strong incentive
not to. So the engineers _had_ to convince them why the company should
basically stick its neck out, as one of hundreds and hundreds of vendors, and
stop this launch. The engineers didn't do the best job of communicating the
certainty of the problem, the magnitude of the problem, etc. So it was a
little bit of both.

That lecture was probably 8 years ago for me, but it definitely left a lasting
impression. Today I work on systems that truthfully are more complicated than
the space shuttle and also have a lot more lives at stake. "Intellectual
honesty" is a phrase I live by. There have been times I fucked up some math
and knew we needed to fix something - even though it would cost money and
cause delay - but I stuck to my guns and made sure it happened. The
alternative is too likely to lead to disaster. It keeps me up at night, and
those are the near misses. I can't imagine what that guy went through, knowing
they almost stopped it but didn't.

~~~
ak217
Interesting perspective, thanks for sharing. Can you say what you work on, or
at least give us an idea? Systems more complicated than the space shuttle -
that sounds really interesting and I'm not sure which ones those are.

~~~
krschultz
I'd rather not share on HN because of how often it gets crawled by Google. You
can generally find it through my LinkedIn which is easy enough to find from my
HN handle.

------
mcdillon
_"When he was pressed by NASA the night before the liftoff to sign a written
recommendation approving the launch, he refused, and later argued late into
the night for a launch cancellation. When McDonald later disclosed the secret
debate to accident investigators, he was isolated and his career destroyed."_

This is quite simply astonishing; everything I have ever learned in my
engineering classes said to do what McDonald did and look what happened to
him.

~~~
johngalt
Having integrity would be easy if you never had to fight and the results
always benefitted you. Life is not a fair thing.

Engineers place a lot of emphasis on having the right answer and almost no
emphasis on ensuring their influence. It doesn't matter how right you are if
nobody will follow your direction.

------
asynchronous13
Ronald Reagan was supposed to give a State of the Union address a couple of
days after the launch. He wanted to use the success of the space program as
part of that speech. While there was no direct order, it's clear that NASA
management felt significant political pressure to push the launch forward or
risk reduced funding from Congress. It's clear they screwed up; I just wanted
to add some context about the climate in which they made these decisions.

~~~
damoncali
This comment needs more upvotes. The political pressure on NASA is extreme.
The only people doted over more than the astronauts were the politicians. Look
into the origins of the Triana satellite and its subsequent fate for a
particularly ridiculous example.

~~~
oz
Seconded. It is a common cognitive failure of us engineering-minded types to
underestimate how important politics and other human factors are.

~~~
3lit3H4ck3r
I know this is a movie. But I think we all get the point.

<http://www.imdb.com/title/tt0086197/>

Gordon Cooper: You boys know what makes this bird go up? FUNDING makes this
bird go up.

Gus Grissom: He's right. No bucks, no Buck Rogers.

------
3lit3H4ck3r
Stunning.

"When the space shuttle Columbia burned up on reentry in 2003, killing its
crew of seven, the accident was blamed on the same kinds of management
failures that occurred with the Challenger. By that time, Boisjoly believed
that NASA was beyond reform, some of its officials should be indicted on
manslaughter charges and the agency abolished."

"NASA's mismanagement "is not going to stop until somebody gets sent to hard
rock hotel," Boisjoly said. "I don't care how many commissions you have. These
guys have a way of numbing their brains. They have destroyed $5 billion worth
of hardware and 14 lives because of their nonsense." "

~~~
afterburner
Yes, nothing has changed. Lessons were not learned. And this was known before
the Columbia disaster, in fact not long after the commission on the
Challenger. It bears striking resemblances to the financial crisis.

------
crikli
"Their pleas and technical theories were rejected by senior managers at the
company and NASA, who told them they had failed to prove their case and that
the shuttle would be launched in freezing temperatures the next morning. _It
was among the great engineering miscalculations in history._ "

Horsepucky: the engineers' calculations weren't mis-anything. The historical
record has long since proven that the engineers were repeatedly ignored by
their management and by bureaucrats at NASA.

------
damoncali
I spent 6 years as a structural engineer on space shuttle flights. A few
thoughts:

While I don't know the people involved with Challenger - I was in 6th grade at
the time - it goes well against my own experience that NASA management had
anything but the interests of the crew in mind. To a fault. In fact, your
average NASA employee doted on astronauts like a star-struck little girl. What
the crew wanted, the crew got. You could always tell who the astronaut was
when you saw a group walking about the centers - he/she was the one whose
every semi-whimsical comment extracted voluminous and polite laughter from the
others in the group.

I was, however, working when Columbia blew up. In fact, my mission was
supposed to fly on it when it got back. Although sad, I feel comfortable
saying that most of the people working on these things sort of know it's going
to happen from time to time. It wasn't exactly surprising to us or the crew.

Blaming managers and celebrating engineers is overly simplistic. The line is
not as well defined as you might think. I had few - if any (I can't think of a
single one, actually) - managers (either contractors or NASA employees) who
were not experienced engineers.

The safety rules for a shuttle _payload_, let alone the actual orbiter, are
voluminous and arcane. It is the primary reason that very little new
technology comes out of the manned space flight program. Everything new is
considered too dangerous because it hasn't been flown before.

This stuff is _insanely dangerous_. It is pretty damn easy to come up with a
way some piece of hardware you're working on could kill someone. The
complexity is enormous. The number of people involved is in the thousands, and
they're spread all over the country. Different centers have different rules.

As a result, you make life-and-death decisions literally every day. It's not
such a big deal, because there is a lot of formal process in place to make
sure it gets done right. The "standards" are what keep space flight as we
know it as safe as it is. Are they or the processes by which they are enforced
perfect? Hell no.

The system failed. People failed. But we knew this would happen, and we did it
anyway because it's the price of exploring the frontiers. We learned from
Challenger. We learned from Columbia. We will learn from the next catastrophic
failure. NASA isn't perfect. In fact, you might say the bloated organization
and government involvement makes this sort of thing inevitable. But I bet the
small privateers exploring manned space flight will run into their own
challenges.

Basically, what I'm saying is that we need to keep this in a larger
perspective. Obsessing over one failure in what is a centuries-long quest is
not helpful. Dissect it, learn from it, and move on.

~~~
kahawe
Different teams working on modules is a very different animal from one
engineer saying "this item is going to blow up" for years while everyone
obviously ignores him... I find it particularly shocking that your answer
suggests a strong "well, shit happens" attitude when clearly both the
potential AND a strong reason to make things better were right there.

What would really be interesting is why he "failed to make his case" according
to executives.

~~~
damoncali
What you're not seeing is that pretty much everything you work on has some
risk of failure, and much of it could be catastrophic. Sorting through all of
that is not easy. Yes, in this case, there were systemic and human failures.

It just goes to show you that even when everyone is paying attention, things
still go wrong. Some people heard him and made the call that it was still
safe. They were wrong. That, unfortunately, is the state of the art today (or
it was back in the 80's). The alternative is to stay on the ground.

But yes, shit does happen, and nobody climbing aboard the orbiter is under any
illusion that it is a safe thing to do.

~~~
TWAndrews
If you've read Feynman's appendix to the report on the accident and
investigation (<http://www.ralentz.com/old/space/feynman-report.html>), I
don't see how you can possibly believe that the decisions made around the
Challenger launch were made with the right process.

~~~
damoncali
I have read it, and I do think there are defects in the processes. But what do
you _do_ about it? I'm sure there are many more unknown vulnerabilities in the
orbiter that were never found, but you keep trying and fixing.

~~~
mistermann
Your willfully ignorant (and I don't mean that in a crude, insulting way)
responses here lead me to think there remain some very serious cultural
issues within NASA.

~~~
damoncali
I think you've read something I didn't write. What you call willful ignorance
I call a realistic assessment and acceptance of the risks of pioneering space
flight.

Nobody is forced into an orbiter - people BEG for the opportunity. We gave
them that opportunity, working in good faith to the best of our ability.
Sometimes it doesn't work out. Sometimes things break. Sometimes people screw
up. We all know the risks.

You can sit on the porch with a near 100% safety record or you can give it a
try. Your choice.

~~~
kahawe
Actually, I have to say I am with "mistermann" on this... you make it sound
like there is no other way. I can totally accept and understand that it cannot
be all that safe to sit you on tons of rocket fuel, fire you into the oxygen-
less and freezing depths of space, and then hope you somehow make it onto
another planet AND then do the same stunt from there back to Earth. I get it;
I can also understand the trade-off between "making it 100% safe" and
"otherwise we'd never get lift-off".

What I cannot understand is this: an unknown, unforeseen contingency is a
completely different thing from an engineer pointing out "this WILL fail, it
will blow up, and I have proof", and there really should not be any excuse for
ignoring a warning like that... Yes, you cannot make it 100% safe, but you
should at least aim to make it as safe as humanly possible given your current
level of technology and knowledge. So, in my book, overriding an engineer
saying "this WILL fail and it'll blow up" is actually negligent manslaughter.
When I get into my car in the morning and don't care that the brakes aren't
working, even though my mechanic told me my brake lines were cut, what would
you call that?

~~~
lutorm
While this sentiment is understandable, it's not justified unless you know how
many times people said "this will fail" and it didn't. We only have definite
data on this one statement. You can _not_ from that data conclude (I'm not
saying there isn't other data) that this was negligent. If every engineer who
disagreed with something said "this thing is going to blow up", _eventually_
one would be right. But you can _not_ then infer that _that_ individual was
any different than the others and that people should have known this. It's the
"monkeys on typewriters" fallacy.

~~~
drucken
This is science and engineering, not statistics. It is not a numbers game, or
"monkeys on typewriters", or a matter of how many bug reports we can file on
the same issue to get said issue fixed!

At the end of the day, if even ONE person demonstrates scientific or
engineering knowledge that shows a serious safety concern, then why would you
actively _choose_ to ignore it? Period.

NASA management - whether by organisational process and/or personally
identifiable decision-making - failed in their responsibilities in spectacular
fashion!

~~~
damoncali
While I agree (especially with the last sentence), I would point out that the
engineering behind these problems is rarely black and white, and hindsight
tends to make it look more so than it is.

I do not believe that if someone had known with 100% certainty that Challenger
would blow up, it would ever have launched. The trouble came in the judgment
of that risk. In this case, from what I've read, they got it wrong - very
wrong[1].

You can argue about how certain they have to be, or how negligent people were
to ignore estimated failure probabilities of whatever magnitude. But it's not
like someone says, "this will blow up 85% of the time, period. Make a call."
It's more subtle, complex, and less concrete than that.

[1] Note that this is not equivalent to "if it blew up, they got it wrong."
Sometimes the small, properly calculated risk blows up on you just because
you're unlucky - which is different from a miscalculated risk blowing up on
you.

~~~
hga
No hindsight was required to observe the following:

O-rings are supposed to seal on compression, not expansion.

As it is now, the O-rings are getting blown out of their tracks but still
managing to seal the whole assembly quickly enough.

The above unplanned behavior, which is the only thing preventing a hull loss
(and a crew loss, since there's no provision for escape), is sufficiently iffy
that sooner or later we're likely to run out of luck.

(I'd also add about the Columbia loss that NASA had a "can't do" attitude
towards the problem they observed of the foam hitting the wing. Hardly a "crew
first" attitude.)

------
arto
The story in more detail at NPR:

<http://www.npr.org/blogs/thetwo-way/2012/02/06/146490064/remembering-roger-boisjoly-he-tried-to-stop-shuttle-challenger-launch>

[...] "We all knew what the implication was without actually coming out and
saying it," a tearful Boisjoly told Zwerdling in 1986. "We all knew if the
seals failed the shuttle would blow up."

Armed with the data that described that possibility, Boisjoly and his
colleagues argued persistently and vigorously for hours. At first, Thiokol
managers agreed with them and formally recommended a launch delay. But NASA
officials on a conference call challenged that recommendation.

"I am appalled," said NASA's George Hardy, according to Boisjoly and our other
source in the room. "I am appalled by your recommendation."

Another shuttle program manager, Lawrence Mulloy, didn't hide his disdain. "My
God, Thiokol," he said. "When do you want me to launch--next April?"

These words and this debate were not known publicly until our interviews with
Boisjoly and his colleague. They told us that the NASA pressure caused Thiokol
managers to "put their management hats on," as one source told us. They
overruled Boisjoly and the other engineers and told NASA to go ahead and
launch.

"We thought that if the seals failed the shuttle would never get off the
launch pad," Boisjoly told Zwerdling. So, when Challenger lifted off without
incident, he and the others watching television screens at Thiokol's Utah
plant were relieved.

"And when we were one minute into the launch a friend turned to me and said,
'Oh God. We made it. We made it!'" Boisjoly continued. "Then, a few seconds
later, the shuttle blew up. And we all knew exactly what happened."

------
kevinalexbrown
What strikes me about the episode in terms of "general life lessons" isn't
just "Listen to the engineers" (you should, though); it's that under the
pressure to Get Stuff Done, there's a huge temptation to brush legitimate
concerns under the rug. "These guys tell me this shuttle is unsafe, but space
launch is never completely safe" --> "These guys tell me this user data isn't
secure but no software is completely safe." Now that the newness of the space
program has worn off a bit, it's easy to say "why didn't they just delay the
launch?" but back in the day, it was an issue of national pride, and the
managers, simple-minded as they may have been, were under an extreme amount of
pressure to pull the launch off.

I guess it's just worth remembering that even if you're under pressure to
ship, launch, or publish, if the guys whose job it is to know tell you to
reconsider, you probably should.

~~~
kahawe
I guess this is also a problem of differing goals, the corresponding
compensation, and what you have to pay if you turn out to be wrong... The guys
who have to push things through and make them happen on time and under budget
get paid for just that - not for having prevented a disaster. Just like bank
managers who make fat profits from taking on huge risks and then don't have to
pay for it when it explodes in their faces; nothing happens, no person is
responsible. It just happened.

(Another problem is probably communication between engineers and (project)
managers - you cannot be blatantly un-subtle enough... If the worst case is
complete data loss, then you paint them that picture in no subtle terms; if
the worst case is catastrophic equipment failure, explosions, the launch pad
annihilated, and several lives lost along with the millions spent training
them, then that's what you tell them and document in very direct terms.)

------
SoftwareMaven
You can go through any catastrophe in a complex system and piece together a
chain of events that shows how "obvious" it was that it was going to happen.
What you miss are all the similar chains suggesting that _every_ complex
system is going to end in catastrophe, because when systems don't fail, nobody
looks. It is a kind of anti-survivor bias.

It is also why it is a bad idea to make policy changes strictly off the cause
of a single failure, and that is where things like commissions should help:
you can move the focus to looking at the entire problem set for weaknesses
instead of just leaving it with "make a better o-ring".

------
sheepthief
Here's the memo in which Boisjoly warns of a potential "catastrophe of the
highest order - loss of human life."

<http://www.lettersofnote.com/2009/10/result-would-be-catastrophe.html>

------
mithaler
For more background on the engineer-manager disconnect that led to the
Challenger disaster, it's worth reading about Richard Feynman's famous
appendix to the Rogers report (even Wikipedia's summary is a fascinating
read):

<http://en.wikipedia.org/wiki/Rogers_Commission_Report#Role_of_Richard_Feynman>

------
bconway
What I find most interesting is the contrast this article's discussion paints
with the one we saw only 2 weeks ago, _How Much Is an Astronaut's Life Worth?_
[1]. Some of the highly-rated HN comments included:

 _Space is dangerous. We should stop pretending it can be made "safe". It just
gives politicians something to wag their tongues at when something inevitably
goes wrong._

 _The problem here is that NASA is a political agency, not a scientific one.
Each year, elected politicians sit down and decide how much they're going to
get._

 _This provides thoughtful perspective on policy trade-offs. As Thomas Sowell
has written, "The first lesson of economics is scarcity: There is never enough
of anything to satisfy all those who want it. The first lesson of politics is
to disregard the first lesson of economics."_

[1] <https://news.ycombinator.com/item?id=3518559>

------
snowpolar
I don't know science so please pardon me if my comment makes no sense in this
context.

What if, by some stroke of luck, Challenger had not exploded on that launch?
What would have happened to righteous people such as Roger? Condemned by the
people around them as over-worrying, insane, self-righteous types who thought
they knew everything. If you get what I mean...

It's sad that the Challenger explosion happened, but at the same time it
helped to highlight an important issue which might otherwise have remained
buried.

In a software/web development context, it's of course harder to say that
something is going to blow up, because serious technical debt usually only
sets in after a much longer time, by which point the people responsible may
have left, leaving the next victim to clean it up. It's a similarly sad state
of affairs with final-year projects, where students try to do everything that
could impress the graders from the outside to get the top grade, while
students who make the extra effort to build a clean and maintainable backend
don't get the top grades, because the lecturer only looks at the outside
during the presentation.

------
ctdonath
In “Visual Explanations” <http://www.edwardtufte.com/tufte/books_visex> Edward
Tufte wrote a must-read analysis of why admonitions to postpone the launch
were ignored.

~~~
gammarator
Boisjoly and others have argued that Tufte's criticism is misguided, as he
misunderstands the engineering:
<http://people.rit.edu/wlrgsh/FINRobison.pdf>. Shorter blog summary here:
<http://eagereyes.org/criticism/tufte-and-the-truth-about-the-challenger>

------
mcantelon
What are the names of those above Boisjoly who ignored him and made the call
to launch? Good to know the names of the heroes in this story, but good also
to know the names of the villains.

------
jgrahamc
RIP Responsible Engineer

------
InclinedPlane
For the curious, I ran into an excellent document, an excerpt from an
after-the-fact risk assessment of the shuttle system over time:
<http://traffic.libsyn.com/sciencefriday/NASAShuttleRiskReview-excerpt.pdf>

It's quite fascinating and pretty eye opening.

------
dennisgorelik
"Boisjoly could not watch the launch, so certain was he that the shuttle would
blow up."

It is one thing to believe that the chance of blowing up is ~1% (which is
enough to prevent a launch).

It is another thing to be certain that it WILL blow up (I assume 90%+
probability here).

If he was so certain, why couldn't he convince his management?

~~~
coles
It's worth reading the Wikipedia entry on the Rogers Commission report and
Feynman's findings. Anonymous polling of engineers showed they estimated the
probability of catastrophic disaster on any given shuttle launch at between 1%
and 2%.

~~~
dennisgorelik
So the latimes.com article blew it out of proportion. A 2% chance of disaster
is not nearly the same as "so certain was he that the shuttle would blow up."

~~~
blasdel
You're conflating two different estimates: Boisjoly was certain that the
particular launch would fail, and a survey of engineers gave a 1-2% estimate
for failure across all flights.

Both of those predictions turned out to be extraordinarily accurate.
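
As a rough back-of-envelope check, using the totals from the now-completed
program:

    # Shuttle program totals: 135 flights, two catastrophic losses
    # (Challenger in 1986, Columbia in 2003).
    losses, flights = 2, 135
    print("{:.1%}".format(losses / flights))  # prints 1.5%, inside the 1-2% estimate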

------
shareme
Now, for contrast: what happens when those lower-level field engineers _are_
management, as in the SpaceX case?

Hopefully no loss of life, ever...

~~~
hga
That's unrealistic. I'd settle for no _stupid_, reasonably avoidable loss of
life.

