Death by PowerPoint: The slide that killed seven people (mcdreeamiemusings.com)
387 points by newzisgud 8 days ago | 127 comments





No, the NASA officials are the ones that killed seven people. Is there audio from the presentation? If you are meeting with some group about something that is life or death, you sit your butt down and ask questions until you get the actual situation. They didn't do it for Challenger and they failed to do it for Columbia. You are in charge, the decision is yours and you are responsible.

On a side note, the whole first paragraph is just plain insulting to anyone who is making a life or death decision. If that is the attitude of anyone in the meeting, then they shouldn't have that job.


There’s definitely audio from all those meetings. You can hear the mission manager plow right through the issues with bureaucratic ease. They’ve seen something similar before and even though it was unexplained, because they’ve seen it, they think it’s a known issue.

You want to blame her for it and shake her out of it as you hear her say it, but it wasn’t malicious. I believe the mission manager even had a spouse who was an astronaut, so it’s obviously not like they don’t care. I’ve always found it fascinating how organizational structure and pressure can take really brilliant and motivated people and beat them into making such poor decisions.

It’s really important to remember that these people are not idiots. It’s literally a room of rocket scientists and space shuttle mission managers. It’s so easy for us all to say here that it was so easy to see it happening and that anyone who didn’t ask the right question was a moron, but these structures take on a life of their own. If I had it to start all over again, studying that would make a fascinating and potentially rewarding career.


For the past two years I have worked in the local emergency room as a technician, getting some clinical experience. Let me tell you, in the constant face of life and death, you really just develop a kind of "danger fatigue", and even the most critical moments become somewhat prosaic.

You begin to develop a false sense of security after nobody really dies from a gunshot wound. Then someone septic comes in, and "seems" fine, and they're dead in a few hours.

I'm not sure what this kind of logical fallacy is, but I suspect it's similar in a government environment, where you're constantly at RED ALERT. The risk of danger just seems overstated, even when it isn't.


A very similar phenomenon happens in aviation—we call it complacency. Thousands of successful takeoffs in a row make it hard sometimes to remember that each one is a completely independent event.

The way I fight it is by explicitly reminding myself that just because something worked yesterday, that doesn't mean I can skip a step today or let my guard down at any point.

It really does take deliberate thought though. Funny how the brain works.

(The upside is that every successful takeoff becomes a delightful surprise!)


Same thing for rock climbing, and complacency born from routine is exactly the problem: you tie knots that your life will depend on every day, thousands of times, and then stuff like this happens to some of the most skilled and experienced people:

- get distracted while tying the rope to your harness and leave the knot unfinished, fall 20 meters from the top of the climb (Lynn Hill, by sheer luck only broke her foot and elbow)

- use a slightly unusual rope setup and when preparing to be lowered, tie in on the wrong side of the anchor, fall 14 meters onto rock (Rannveig Aamodt, broke her spine, pelvis and ankles)

- have your partner point out damage on your harness, shrug it off because there's plenty of safety margin, continue climbing for 3 days in a manner that puts repetitive abrasion on exactly that part of the harness, have it snap while rappelling and fall 150 meters to your death (Todd Skinner)


Sometimes there are comparatively simple lifehacks you can establish to prevent complacency from leading to problems.

A little example from my life is forgetting to secure the buckle of my motorcycle helmet. Once I've done everything else and gotten my gloves on, it's a pain to take them off again to buckle it up if I forget, and a few times I just shrugged it off and rode anyway.

Then an instructor suggested he always follows exactly the same procedure each time, to the point that he always even puts the same glove on first. It made me wonder if I should do that, and in thinking it through I realised if I changed my order to always buckle up before putting my glasses back on, I'd never forget. There's no way I'd ride off without the glasses (because I can't see without them!), so if that step can only come _after_ buckling up, then there's now no way I'd ever not buckle it up either.


Also on the roads. So many people drive round a blind corner at high speed because they do the journey every day and the road is always empty. Until one day it isn't.

Not quite what you're talking about, but very much related is normalization of deviance:

What these disasters typically reveal is that the factors accounting for them usually had “long incubation periods, typified by rule violations, discrepant events that accumulated unnoticed, and cultural beliefs about hazards that together prevented interventions that might have staved off harmful outcomes”. Furthermore, it is especially striking how multiple rule violations and lapses can coalesce so as to enable a disaster's occurrence.

[1] https://danluu.com/wat/


I hear what you are saying and agree with the danger fatigue observation, but it is different for the managers at NASA and Boeing who were reviewing the slides.

By default of your situation, you are facing these life and death decisions on a constant basis. The NASA and Boeing managers do not. I can't imagine they are part of life and death scenarios very often, if at all.

Critical thinking failed them that day.


I don't think he's that far off the mark. Take the seals. This wasn't a one-off decision. This was a problem that was known for years and repeatedly ignored at lower and lower temperatures (thus lowering the allowed limits far below the original specifications) until the disaster. The reasoning being "well, you warned us last time, but everything went fine, so let's just keep going, it'll keep being fine".

True, it is easy to be wise after the event. However, the management culture at NASA, which was also responsible for the loss of Challenger, and something Richard Feynman also criticized, did not change.

This led to managers ignoring engineers' warnings about the foam strikes on Columbia, and also rejecting requests for high resolution images.

This is exactly the same culture which ignored engineers' warnings about the O-rings on the SRBs.

Linda Ham, the mission manager who rejected these requests, left the space shuttle program after the Columbia disaster and was moved to other positions at NASA. Not firing or disciplining these managers will cause similar disasters in future. https://en.wikipedia.org/wiki/Linda_Ham#Columbia_disaster_an...


> Not firing or disciplining these managers will cause similar disasters in future.

I think the quote is something like "Why would I fire them, we just lost X lives and dollars training them?"


That doesn't seem to quite square with them being reassigned. In Ham's particular example:

> "Ham's attitude, and her dismissal of dissenting points of view from engineers, was identified as part of a larger cultural problem at NASA.[2] After the report's release, Ham was demoted and transferred out of her management position in the space shuttle program."


That's the best option if you want to be able to rebuild confidence in the chain of command. The people who have been ignored before will never trust her again anyway. In such situations, you either rebuild the chain of command or the whole troop if you want to have a functioning group.

For sure they are not "idiots", but these are people who usually have done well in school/academia, where saying "stop, don't do it!" is never the right answer; these are people who did not get promoted to where they are now by saying "stop, don't do it!".

Recently I've seen a product owner trying to push us into a two-week sprint where we needed to complete 100 story points worth of tasks, while in the previous two sprints we had only completed ~30 story points. Over my protest, the Scrum master sided with the product owner in saying we should go ahead and commit to the goal.

These are not stupid people, they were under pressure to deliver; unfortunately the incentives are not to listen to reasonable people, but to pushers; when the shit hits the fan, they make a big fuss and obtain even more resources to push even harder, and they usually deliver -- with 3X the costs.

Very rarely have I seen reasonable and smooth-sailing managers get to the top, usually it's the die-hard/hard-pushers/busy-appearing types that get to the top.

I think it's a curse of modern society with its unlimited resources; this shit would never fly in the times of Sun Tzu or Caesar, precisely because limited resources would prevent such smart "idiots" -- they'd succumb to the elements, or their subordinates would do them in in their sleep for being so detached from reality.


With the obvious warnings about limited sample size, anecdotal evidence and so on...

I've encountered a few people with this trait, and the general trend I've noticed is an excess of optimism and a hint of complacency.

"I have the utmost faith that my team can do 100 hours of work in 25!"

"We did 75 scheduled hours of work in 15. That means we can easily do 100 in 25." (completely overlooking that the team - by that point - had been working from 7am until 9pm for the entire week, ordered food in and had to take several days PTO to recover)

Curiously these are usually the same managers who demand crunch-time, but "have a prior engagement" when the team asks if they'll be helping too.


I don't think anyone is saying that it was malicious, but it was absolutely negligent. For people with decisional authority, organizational pressure is not an excuse. For people with responsibility, lapsed vigilance is not acceptable. They weren't forced to make a split-second decision, and they weren't fatigued. While there were contributing systemic factors, it was still a mistake to dismiss the information so quickly. They're professionals, not amateurs.

It's easy for us to pass judgement in hindsight, but what is an acceptable failure rate? For Apollo, I believe it was around 7-8%. There's always going to be some risk; what you don't see is all the times those administrators correctly identified a risk as acceptable, or scrubbed a mission due to an inaccurate risk assessment.

A space program that demands better than six-sigma risk for example is unlikely to ever get off the ground.


Although I'm technically judging in hindsight since I wasn't there, I don't think I'm being unfair in my judgement. I have experience in safety and I've seen better and worse safety reviews than this one. Although there were plenty of external factors that pushed people to act the way they did, the people we're talking about were the responsible authority for vetting these issues. It was their job to take ownership of the issues and their inquiries, and they failed to conduct themselves appropriately.

Just because their mistake didn't push the failure rate past the acceptable limit doesn't absolve them of anything.


> I’ve always found it fascinating how organizational structure and pressure can take really brilliant and motivated people and beat them into making such poor decisions.

I get your sentiment, but there are more stories to tell in NASA events. Robert Trivers wrote a book [1] about his research on human behavior, specifically, how self-deception plays a role in the event.

[1] https://www.amazon.com/Folly-Fools-Logic-Deceit-Self-Decepti...


> There’s definitely audio from all those meetings

Are these accessible publicly? Could someone link to them, if available online? It's not easy to get a deep view into organizational decision making. The least we could do is to learn from mistakes.


It's important to remember that it was their job to be mindful of the risks. These weren't folks working at a bake sale or something, these were trained professionals who had been entrusted with significant and weighty responsibilities, and they did not take those responsibilities seriously enough. No, they weren't idiots, and that makes it far worse. Negligence is a common human failing, we should remember what it looks like and how it happens, we shouldn't pretend that it doesn't exist or that it should be excused.

They acted irresponsibly. They should answer for their actions. They had a responsibility and failed miserably.

I have no pity for such lack of professionalism. You are correct in saying that it is not so easy and that these structures take on a life of their own, but this shouldn't be an excuse, and we see disasters happening over and over because people don't act professionally.

Anyone in the room who saw such mistakes and was capable of standing up should have, even if it meant jeopardizing their career by doing so.

* I'm not asking for a hero, and I understand that failure is part of human nature, but we should have [professional] respect where it is due.


> They acted irresponsibly. They should answer for their actions.

It's satisfying to point to a handful of people to place blame, but it isn't terribly scientific. We should be asking why they acted irresponsibly. The folks making these decisions didn't just act in a vacuum; they were a product of NASA, systems engineering at large, a bureaucratic institution, and our own societal norms.

It's likely that others who went through similar training and operated in a similar environment would have made the same decision. But even if the bad call was due to a few bad apples that inserted themselves into the decision making process, we should ask how we allowed them to get there.

Punishing an individual may or may not be warranted in this case, I suspect that the guilt they must live with is punishment enough. However, what's clear is that punishment won't be enough to prevent similar problems from arising again.


> We should be asking why they acted irresponsibly.

> I suspect that the guilt they must live with is punishment enough.

We should ask this for the sake of science. However, accident prevention is not the only thing at stake here. They signed off and should be held responsible. I'm not even talking about punishment here but about repairing the damages done.

This is a question of justice first of all. Punishing for the sake of punishment is not the way forward. It is true that people will think twice [before dismissing these kinds of complaints] if they see that you can't get away with [unintended] murder, but this should be a secondary, positive side-effect only. If you try to punish to send a message, it might be unjust, send the wrong message (like silencing whistleblowers who have doubts, or discouraging people from fixing their own mistakes), or both.


> This is a question of justice first of all.

No it's not. This is a question of optimizing spaceflight for safety.


Clearly the mission controllers bear some responsibility, but I disagree with diverting blame from the creators of the PowerPoint.

The PowerPoint displays extremely clouded thinking. The title is: "Review of Test Data Indicates Conservatism for Tile Penetration." What the heck does that even mean? An unclear title is a very strong indicator of unclear thinking about the point being made. Further, there is a relatively precise estimate of the size of the foam, and it's orders of magnitude larger than the size used for the tests. The slide clearly states that the penetration velocity depends on "volume/mass of the projectile"--i.e., one would expect a much larger projectile to penetrate at a lower speed. So why is the lead message not "we have to redo the tests with bigger foam, because we don't have the data we need to reach a proper conclusion"?

Remember, while this is happening, a million other things are going on. The officials have other risks and trade-offs to deal with. In that scenario, one of your jobs as an engineer is to clearly convey your point, and perhaps more importantly, the limitations of your analysis. It's not the job of the decisionmakers to tease that information out of you. Ideally, of course, a decisionmaker faced with unclear messaging will try to get to the bottom of it. But in a high-pressure scenario, that doesn't always happen. For one thing, how does the official even know which issues need to be run to ground and which do not? Who was the person who best knew how significant this slide was? The official reading it (who probably saw a hundred other slides in the same meeting), or the person who wrote it?


Yeah, I agree. This is one of the worst slides I've ever seen. A few things I'd add to the above.

- The title is extremely unclear: are they supposed to be conservative in their beliefs about the risks (i.e., to not believe that the tile got penetrated), or are they supposed to be conservative in their behavior about the consequences (i.e., reduce risks by not re-entering the damn shuttle)?

- wtf does "overpredicted" mean? That's the second largest font, and it has no visible meaning?

- They don't even attempt to estimate the speed at which the foam was traveling. They just say that it depends on speed and mass, but they make no suggestion of the speed.

- Nor do they attempt to draw boundaries on the possible speeds. Could they have fit a line on the test data and at least given a range of speeds that were clearly dangerous? (A rough sketch of what that might look like is below.)
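
To illustrate (a sketch only: these numbers are made up, not the actual Crater test data, and the real relationship is surely not a straight line):

```python
import numpy as np

# Hypothetical test points: tile penetration depth vs. foam impact speed.
# The real numbers lived in the Crater test database, which I don't have.
speeds = np.array([150.0, 200.0, 250.0, 300.0])  # impact speed, ft/s
depths = np.array([0.2, 0.5, 0.9, 1.4])          # penetration depth, inches

# Fit a line and extrapolate to the speed at which a whole tile is penetrated.
slope, intercept = np.polyfit(speeds, depths, 1)
tile_thickness = 2.0  # inches, assumed purely for illustration
danger_speed = (tile_thickness - intercept) / slope
print(f"full tile penetration predicted above ~{danger_speed:.0f} ft/s")
```

Even something this crude would have turned "it depends on mass and velocity" into a number the managers could compare against the estimated strike.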

I don't think this is merely bad presentation design, however. This feels like someone was afraid to stick their necks out in a bureaucracy, and hence didn't give the managers enough information because there were some error bars around it.


> wtf does "overpredicted" mean? That's the second largest font, and it has no visible meaning?

Honestly, it reads to me as being along the lines of "overestimated" - and in this sentence would mean "reality wasn't as bad as our predictions, so we have a lot more leeway than we thought".


A mathematical model (here, the crater equations) predicts something. If the prediction turned to be larger than reality, you could say that the model overpredicted.

What you should say is that the model is wrong.

That too. "Overpredicted" is a particular way in which this model turned out to be wrong for the problem it was applied to.

> "Review of Test Data Indicates Conservatism for Tile Penetration."

The title has nothing to do with PowerPoint. This is no different from how academics title papers. The use of unnecessarily big words & passive voice (rather than active) increases with level of schooling.

"Don't bury the lede" - Journalists and writers are taught this. It means, save readers some mental clock cycles. Pore through your own news and figure out the most important fact. Then put it in the title and/or on top.

Highly schooled, intelligent and "honest" people don't do this. When journalists oversimplify or make logic leaps in their writing, we call it clickbait. We also prefer the use of the words MAY and COULD (probability isn't intuitive to most people) - which allows people to ignore the information.

Successful salesy types are intentional about their use of words. They'll use active phrases, avoid ambiguous words, use words that denote certainty to force action in choice areas, and vagaries to hide or downplay others.

The officials were surely served papers before or after the meeting and made their decision despite the slideshow.

Summary: Engineers need copyrighting and sales training as well.


> Summary: Engineers need copyrighting and sales training as well.

Waaait. Your comment essentially established two things: that journalists receive training on how to write properly, and that successful salesy types know how to bullshit their audience to get what they want. I think the conclusion doesn't follow.

Observe that journalists universally don't apply their writing training - in fact, modern news reporting is one of the worst kinds of writing out there, with the lede buried under 30 meters of gravel and spread over a hectare of land (we call it clickbait only when the headline is manipulative). So this shows your training matters little if incentives on the job are in total opposition to it.

As for the sales angle, commercial copywriting isn't exactly a paragon of clarity either. Effectiveness in sales isn't measured in how clearly you communicated costs and benefits, but how excitingly you communicated the benefits, and how effectively you've hidden the drawbacks. Engineering communication shouldn't be manipulative like this.

The way I see it, many engineers could use some communication training, but it should be focused on presenting things clearly and truthfully, and on effectively ELI5-ing things to managers. But beyond that, incentives within the organization need to be adjusted, because it's hard to get an engineer to explain things clearly when their job depends on not doing so.


> The way I see it, many engineers could use some communication training, but it should be focused on presenting things clearly and truthfully

There are technical communications courses; in fact, it's a required course at Cornell Engineering. Agreed, it's not at all the same as copyrighting or journalism, but the general level of technical communication is so poor that even courses in those would be an improvement.


Did you mean copywriting?

Just because you don't understand the jargon does not make it incorrect or unclear. Have you ever worked at NASA?

Use of jargon is a huge impediment to communication in organizations like NASA and DoD. There's even a glib saying that you won't get your project funded unless you give it a cutesy/cool acronym. This ends up stifling communication because each group has its own pet vernacular that obfuscates meaning unless you are inside the circle.

They could benefit from clear, concise communication that gets to the point.


Read the full deck and tell me what is hard to understand:

https://www.nasa.gov/pdf/2203main_COL_debris_boeing_030123.p...


Indeed. The conclusion is:

> Contingent on multiple tile loss thermal analysis showing no violation of M/OD criteria, safe return indicated even with significant tile damage

That's going to be difficult for management to push back against.


Thank you for posting the full deck. The conclusion is pretty clearly stated that the engineers thought the shuttle would return safely, even with missing tiles. The focus on this one slide out of context seems totally wrong.

"Contingent on multiple tile loss thermal analysis", i.e. "we're running that model now and until we're done we're not sure, but the tests we did so far suggest that a tile loss should not prevent safe return".

It isn't really that unclear. There are formal words, but - unlike typical managerial presentations - there's also content behind them.


So the real slide was actually quite different from the one presented in the article. The article's version shows various errors not present in the real slide ("vaires", "e.?g.", "Ln", "hanrd"). The real slide also uses different font weights and bullet symbols. Was that intentional, to give a bad impression, or just really sloppy? Or were those slides cleaned up?

I have mixed opinions. I agree that the conclusion is much clearer than the single slide implies. However, the slides do a poor job communicating in many other respects. For example, on the first non-title slide four separate acronyms are used, and only one is defined. (Incidentally, it's one that also has another, different acronym within industry - M/OD => MMOD.) Maybe everyone in the audience was already familiar with these terms, but maybe they weren't. I think in this case clarity should trump brevity. I think the original site has a particular, biased point of view, but also that NASA can often do a poor job communicating.

The part I'm grappling with is how they came to that conclusion despite the "flight condition is significantly outside the test database" acknowledgement as alluded to in the original post. To me, this sounds very much like Challenger in terms of drawing conclusions without hard data to back it up. Easy arm-chair quarterbacking in hindsight, I know, but it seems the through-lines are psychological in nature, not engineering or technical problems.


If NASA is encoding safety-critical messages in "jargon" that can pass for misleadingly ambiguous plain English, that's a huge safety problem itself. What if it's somebody's first day on the job?

Especially given the sorry state of the presentation in general, I really doubt this is the case, though.


Have you ever presented inside a collaboration?

Yes, and my experience doing so only cements the opinion(s) expressed in my previous comment—to which I'm inclined to say your reply is not terribly relevant.

Ambiguous, poorly presented, and technically dense are three different things, though they can all be present in the same place. Only two of them are necessarily bad.


So what exactly is technically dense/ambiguous in this summary presentation (remember, this isn't the only information provided)? Perhaps you'd like to review the full deck of slides rather than the single one picked by the asshole author; I'm particularly interested in the one titled "Damage Results From “Crater” Equations Show Significant Tile Damage". https://www.nasa.gov/pdf/2203main_COL_debris_boeing_030123.p...

The title, "Review of Test Data Indicates Conservatism for Tile Penetration" is completely ambiguous to the point of being nonsensical. It is impossible to know in which direction the author of the slides thinks we should be conservative.

Ironically the slide you point out ("Damage Results From 'Crater' Equations Show Significant Tile Damage") seems to skew our interpretation of the next slide's title and content in the wrong direction. The former slide states that "'Crater' indicates that multiple tiles would be taken down to densified layer". That sounds bad. However, it also tells us that the "program" (assuming this means the Crater model) that generated this alarming prediction was "designed to be conservative". This would seem to downplay any concern generated by this result. Furthermore we are told that "Crater reports damage for test conditions that show no damage", further casting into doubt the predictions of the model.

And then, on the next slide: "Review of Test Data Indicates Conservatism for Tile Penetration". On the previous slide the word "conservatism" was used to tell us that the results from the "Crater" model may be on the high side, i.e. showing a problem where there is none, and that the test data show a much smaller degree of damage. This slide is about test data ("Review of Test Data"). On both slides we are told that compared to test data, the Crater results are inflated: "Crater reports damage for test conditions that show no damage", and "Crater overpredicted penetration [...] significantly".

Where does this leave us? The context of this additional slide makes the presentation even more misleading than the "asshole author", to use your delightful term, thinks it is. The title is not merely ambiguous; we are explicitly nudged toward the wrong interpretation of it. I have to thank you for bringing this additional context to my attention.

The author may in fact be an asshole (though I think your language is inappropriate in context), and his analysis may lack depth (I think it does to some degree), but in the simple matter of this slide being egregiously awful he's totally right. I don't know why you'd choose this hill to die on.


This is a great point. The beginning of the disputed slide continues to build up a point—Crater (the prediction software) is overly conservative. The major caveat that the test data is far afield from the actual situation is buried at the bottom of all that.

Seconding your comment. The presentation isn't ambiguous, it's just written in a detached, conservative style typical to scientific papers.

I wonder if they really would have been any clearer without PowerPoint. The person who wrote that slide seemed fully capable of obscuring the message in any medium.

Your post reads like one of those classic "Guns don't kill people, people kill people" slogans. Sure, but guns make it a whole lot easier. No one is innocent here - there's plenty of blame to go around.

One of the key skills in being a senior engineer is knowing how to communicate clearly with managers and decision-makers who don't have the same level of technical knowledge that you do. Communication is an extremely imperfect and lossy medium - figuring out how to get the most salient message across clearly is a hard-to-master but vital skill. This article shows exactly why.


No; what the article is doing is equivalent to blaming the whole World War II on MP 40 machine pistols. Sure, that SMG killed a lot of people, but it's not the only gun that killed people, it's not the gun that started the war, and it had almost nothing to do with the causes and the course of the conflict.

A PowerPoint presentation did not kill the astronauts. Hell, judging by the full slide deck (elsewhere in the comments), the presentation wasn't even that bad - it was detached, the way scientific papers are detached. Could it be better? Yes. Should it be blamed for this? Not really.


I gotta wonder what business a manager has being in a decision making position if they can't understand the summarized technicals of those below.

Guns don't kill people. People make very specific decisions that lead to someone's death. I find any argument that blames a tool for someone's death to be suspect when a person, or in this case a group of people, met and discussed the issue. This wasn't a broken tool that directly led to the deaths. This is especially problematic given NASA's history with Challenger, previous strikes, and the excellent writing of Richard Feynman on this very subject for this very organization.

A manager, or really anyone who desires to lead, needs to be able to remove the confusion surrounding information. If someone was deliberately withholding information, then that's another problem, but the slide had the critical phrase at the bottom showing this was way outside the test parameters (600x, with some basic math).

I don't deny that the slide was bad and the engineers who put it together failed their profession, but the person at the top needs to be able to get past this.


Actually, despite the headline, the article states it clearly.

>> NASA managers listened to the engineers and their PowerPoint. The engineers felt they had communicated the potential risks. NASA felt the engineers didn’t know what would happen but that all data pointed to there not being enough damage to put the lives of the crew in danger. They rejected the other options and pushed ahead with Columbia re-entering Earth’s atmosphere as normal.

This is it. Not any slideshow or wrong communication of risks, but lack of responsibility.


The mission leader may have signed the death warrant, but the stage for the execution was set by the engineers who wrote that passive-voice nonsense.

It is a little far-fetched to think the audience didn't listen or ask questions about any potential risks.

It's easy to go back after an accident and find multiple points of failure and find things that could have been done better.

My guess is that most people thought it was a minor or acceptable risk. Unfortunately, they were very wrong.

But that doesn't mean the PPT slide is at fault.


> But that doesn't mean the PPT slide is at fault.

I think it's a bit of poetic license to say that the PPT is at fault. But it didn't help. Many mistakes led to an outcome like this one, and this PPT is one of those mistakes. The basic information is there for a very significant point--we know penetration can happen with a sufficiently large or sufficiently fast-moving piece of foam, and this piece of foam is 600 times bigger than anything we tested before. Had this PPT clearly conveyed that point, the chances would have been higher that some decisionmaker would have realized its significance.


> I think it's a bit of poetic license to say that the PPT is at fault. But it didn't help.

Poetic license in journalism doesn't help either. What we got here is pure clickbait, literally accusing something of killing people, when it was just one among many things that were suboptimal and just "didn't help".


I think the PPT slide is at fault for the fact that the most relevant piece of information (this is 600x bigger than we have tested) got lost. The decision, and even the way in which the meeting to make the decision was held, is not on PPT. But the one and only purpose of a PPT slide is to communicate information in a way that makes it easier to understand; otherwise we would just use plain txt files.

A plain txt file here would have been better.


How would a plain txt file have helped? As the person writing the txt/ppt/whatever, it's your job to put the emphasis on the important part: the 600x difference.

In text this means putting that at the beginning, in the title, and in bold. The same is true in PPT: you have to outline the important stuff. It has to be visible at a glance.

File formats don't solve lack of communication and listening skills.


Certainly not. The fact that plain txt would have been better than this was meant to convey how poorly the format worked for the intended purpose (highlighting the most important information).

I wasn't saying a plain txt file was a solution; I was saying that even plain txt, a very low-power format, would have been better than this, and since the ONLY PURPOSE of a PPT is to help you format things, this was a failure. Certainly no format will solve lack of communication, but I see better communication in Hacker News comments than I see in PowerPoints, and I think that says something about the value of PowerPoint.


But it wouldn't have been better.... At best it would have been equally bad.

Yeah, it's clear that the slide was poorly written, but it seems very odd to put the lion's share of the blame on the people who presented accurate information poorly, instead of the people who thought it was okay to merely skim a <150-word slide when seven lives depended on their comprehension.

Why on Earth did they veto even doing a spacewalk to inspect the damage, before concluding it was safe?


I don’t agree. I work in the Danish public sector, which means we handle healthcare related software that could potentially lead to deaths.

Our managers aren’t technical wizards, and especially not in the field of healthcare. They are mostly doctors with a masters in management. Their job is to listen to what everyone has to say, and then make a decision based on that.

Our job is to make sure they don’t miss important points related to subjects they may not fully understand. Burying a “may cause significant damage” on a text-heavy slide like that is, simply put, poor communication.

Those should’ve been the only words on the slide, they should have been in bold red, and they should have said “if you ignore this, people may die”.


I believe the logical reasoning to be: PowerPoint presentations facilitate filling in the blanks without prompting careful weighing and consideration. As a tool, it is more likely to encourage poor thought and communication structure and to muddle high-level intake and review of data.

A more correct communications format might be the traditional longer-form report. A main document frames facts, possibly with graphs and other features used to reach the conclusions and explain why different points are salient. An executive summary preface presents the conclusions of skilled individuals and their core reasoning, which lets non-experts reference the larger report for a better understanding and/or ask questions if they are still unclear.


The engineers aren't blameless. They clearly fucked up as well. There is plenty of blame to go around.

But the managers should have carefully read every word that was placed in front of them. They should have asked questions if anything was at all unclear. This was, ultimately, their responsibility and their decision. They chose to rush it, and people died.

And I'm just baffled as to why they didn't send someone out to inspect the damage visually. The Columbia wasn't carrying a remote arm to quickly and safely maneuver an astronaut to the observation site, but the crew were trained and equipped for emergency EVA. It would have been a minor risk, but nothing compared to the danger of reentry failure.


Spacewalks are neither free nor totally safe. It is possible they calculated that the risk of a spacewalk, and the time it would take to do one, was worse than attempting re-entry. They were probably wrong, but it's not a free operation.

With hindsight, we can say they were definitely wrong.

Only if you look at things deterministically, which is usually the incorrect approach. It was almost certainly wrong. But you are working with incomplete information.

I completely agree. PowerPoint may cause information to be lost, or encourage sloppy speaking skills. But the deaths attributed to "NASA's powerpoint that kills" are squarely on the chuckleheads who didn't sit down, get to brass tacks, and ask the hard questions.

Powerpoint didn't stop NASA employees from asking "What are the chances of a catastrophic result?"

In times of critical issues, you ask plain and direct questions. You will always be given caveats, muddled answers, and "but ifs". As a decisionmaker, it's your job to 'cut the crap'.


> Powerpoint didn't stop NASA employees from asking "What are the chances of a catastrophic result?"

But the bureaucratic morass that Powerpoint embodies does cause people to stop asking questions like this in all forms of industry. You have probably seen it - at much lower stakes - in business, finance, or IT.

The post is picking at the nits of Powerpoint's layout, which, I agree with you, seems silly. But the broader point of blaming meeting culture is quite accurate.


> Powerpoint didn't stop NASA employees from asking "What are the chances of a catastrophic result?"

How so? The whole practice of using Powerpoint, whether in business, other enterprises, education or elsewhere is literally designed around its nature as a persuasion technology that makes true shared deliberation impossible, by ensuring that everyone in the audience has to expend their mental effort to focus on what the Powerpoint slides purport to say. Even Tufte is rather clear about this, and consistently critical about Powerpoint use. There is a very real sense in which Powerpoint stopped the audience from asking the right question.


The slide in the article has the same text, but is a recreation of the original (The Calibri typeface used wasn't part of PowerPoint until 2007).

The original slide can be seen in the full report linked in the article:

https://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=...


Why would someone recreate that slide instead of just using the original? Gah, that's sloppy reporting. Makes me suspect the entire story told, now that you pointed that out.

There's other sloppiness here as well, like treating the orbiter velocity as the determining factor (nine times the speed of a bullet...) when the delta-V between the foam and the orbiter is the more pertinent information.

The author comes across like they have an axe to grind (maybe rightly so), but should make better efforts at getting technical details right in an article about the perils of miscommunicating technical details.


Thanks for pointing this out. I noticed the typo of "hanrd" ("hard") on the re-creation and drew the further conclusion that they weren't even taking the issue seriously enough to pay attention to their slide content. Even though it doesn't change what ultimately happened, it's nice to know that someone was paying at least cursory attention.

Indeed. I missed "hanrd" on my first read through, but "Vaires" and the stray "?" jumped out at me. The original slide is much more readable, and "Flight condition is significantly outside of test database" stands out as almost the conclusion rather than being buried in noise. Talk about a strawman. (Still a bad slide, though.)

So if I'm reading this right, the previous testing was done with a 3 cubic inch foam block at 200 ft/sec impact velocity... and the foam ramp was 1900 cubic inches and going 900 feet/sec at impact.

This is like saying "shooting someone in the head with a BB gun doesn't kill them, so we think shooting them in the head with a shotgun slug should be fine."
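
Back-of-the-envelope on those figures (my own rough arithmetic, assuming the foam density was the same in test and flight, so that mass scales with volume):

```python
# Prior tests vs. the actual strike, using the figures quoted above.
test_vol, test_speed = 3.0, 200.0     # cubic inches, ft/s
real_vol, real_speed = 1900.0, 900.0  # cubic inches, ft/s (CAIB's best estimate was 775 fps)

mass_ratio = real_vol / test_vol                          # ~633x, assuming equal density
energy_ratio = mass_ratio * (real_speed / test_speed)**2  # KE = (1/2) * m * v^2

print(f"~{mass_ratio:.0f}x the mass, ~{energy_ratio:.0f}x the kinetic energy")
# -> ~633x the mass, ~12825x the kinetic energy of anything previously tested
```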

What on earth were these people thinking?


That same linked page by Edward Tufte (of data visualization fame) has a proposed revised slide that would have more adequately conveyed the risk.

As much as I appreciate the sentiment and I never like to miss a chance to pile on PowerPoint, this is really, really missing the point.

As the CAIB report makes clear, the PowerPoint slide was a small symptom of the actual problem of a complex organization gradually accepting more and more risk as “in family” simply because unexplained phenomena hadn’t caused serious issues before (while remaining unexplained). The CAIB report really is a masterpiece (as is Feynman’s appendix to the Challenger report) of understanding how the understanding of risk can be subjugated to organizational pressures over time.


> The CAIB report really is a masterpiece (as is Feynman’s appendix to the Challenger report) of understanding how the understanding of risk can be subjugated to organizational pressures over time.

Agreed. It is a useful document to read for anyone who plans meetings and heavily relies on consensus to see where the problems lay. You need outsiders and first principles thinkers (physicists are good options, as Feynman always demonstrates) to disrupt bureaucratic agreement.


Yeah, I think this is right. Which doesn't detract from the slide being terrible, but just adds a bit more about why the slide was terrible.

Edit: from the summary of chapter 7 of the CAIB report [1]:

"organizational barriers which prevented effective communication of critical safety information and stifled professional differences of opinion;"

http://s3.amazonaws.com/akamai.netstorage/anon.nasa-global/C...


During that final Columbia flight, I remember reading a short news story on the internet on the foam falling off. It even had a short video of the foam falling off. (Somewhat of a novelty way back in 2003.) I watched that video several times.

I remember being concerned, but confident that NASA would figure it out.

Then when I saw the headline that Columbia didn't land, I remember immediately thinking, "Oh no! The foam!". I also remember being puzzled that none of the news stories after the crash mentioned the foam for a long time.

I've tried to go back and find that news story, but I have never been able to find it.


This sort of thing seems to be standard in journalism. For some reason, the articles never seem to ask the obvious questions or make the obvious connections. I've always been so puzzled by this but I guess it's because there's no incentive to speculate.

This is a rehash of the claim Edward Tufte made about the Challenger crash, and it is doing the same thing: take something that was, at most, one of the many contributing factors (but the one of most interest to the person making the claim) and exaggerate its significance out of all proportion. It is not a helpful way to present data if the goal is to understand what went wrong in the hope of avoiding the same mistake in future.

We weren't at the briefing. Surely the briefer could have emphasized these points, rather than just relying on the slide itself to convey the seriousness of the situation. Or, the judgement of the audience to pick up that they hadn't tested for this situation.

I like Tufte as much as anyone else, but he's in the business of selling courses.


The speed listed in the introduction is wildly wrong. The foam could not possibly have hit at 28,968 km/h - that is the approximate orbital speed of the shuttle. At 82 seconds into flight, the speed is about 700 m/s (2500 km/h).

Even using that figure would assume that the foam came to a dead stop instantaneously after detaching from the tank. I wouldn't be surprised if the relative speed was only 1/10th of that, putting the speed of the collision around 250 km/h, less than 1% of the figure stated in the article.


> The speed listed in the introduction is wildly wrong. The foam could not possibly have hit at 28968 km/h

The article said "[a]s the crew rose at 28,968 kilometres per hour the piece of foam collided with one of the tiles". It did not say that it collided with the tiles at 28k km/h, just that it collided with them during the shuttle's acceleration to that speed. So technically it's correct, but I definitely agree that it's a false implication, likely added for dramatic effect.


In the next sentence the author also says the damage was caused by foam "hitting the wing nine times faster than a fired bullet." A bullet goes about 900 m/s, and the "nine times" figure roughly corresponds to orbital velocity. It's clear that the reference is to the speed of the collision.

Actual slide deck. Personally I think the author of this blog is an a-hole to highlight a single slide.

https://www.nasa.gov/pdf/2203main_COL_debris_boeing_030123.p...

Particularly interesting is the slide titled "Damage Results From “Crater” Equations Show Significant Tile Damage"


> They rejected the other options

I didn’t think there were any options. Indeed the managers didn’t think so.

From https://en.wikipedia.org/wiki/Space_Shuttle_Columbia_disaste...

> Throughout the risk assessment process, senior NASA managers were influenced by their belief that nothing could be done even if damage were detected.

However the CAIB determined Atlantis could have been used as a rescue vehicle had NASA acted quickly enough. It also put forth a high risk repair procedure:

https://en.wikipedia.org/wiki/Space_Shuttle_Columbia_disaste...


There were a number of things they could have tried. None of the alternatives (except launching a rescue shuttle) could possibly have turned out worse than the course they chose. Instead of an Apollo 13-style "Failure is not an option" effort, Jon Harpold, then the director of Mission Operations, is reported to have said:

"You know, there is nothing we can do about damage to the thermal protection system. If it has been damaged it's probably better not to know. I think the crew would rather not know. Don't you think it would be better for them to have a happy successful flight and die unexpectedly during entry than to stay on orbit, knowing that there was nothing to be done, until the air ran out?"

https://www.theguardian.com/world/2013/feb/01/columbia-space...


In my life up until now, I'd always believed the best of people, and doubly so of NASA. Seeing that, and knowing it's true, makes me believe that there is evil in this world, and that man is part of it. Whoever and whatever forces allowed him to get where he was with opinions like that, needlessly sacrificing the lives of seven people and being satisfied that he'd made the right decision, need to be brought to account, publicly and with no means of secrecy.

It's not evil, not per se. Evil would be murdering people deliberately. This is incompetence, complacency, and apathy. It's negligence. But it's exactly what we should expect from an agency that knowingly ran an unsurvivable death trap of a launch vehicle for thirty years.

During the early stages of a shuttle launch, prior to SRB separation, the only abort mode available was RTLS--"return to launch site". Unlike Apollo, Soyuz, Dragon, or any other capsule-based spacecraft, which had a launch escape system that could immediately separate the crew capsule from the rest of the launch vehicle and place it in a safe vector to parachute back to the surface, the Shuttle was expected to pitch end-over-end, with the external fuel tank still attached, in an attempt to return to a runway near the launch pad. John Young, the pilot of the first Space Shuttle mission, declined the suggestion to perform a manned test of the RTLS abort mode, stating, "let's not practice Russian roulette." He also noted, "RTLS requires continuous miracles interspersed with acts of God to be successful".

It's not as though a LES would have necessarily saved Challenger, mind you. The explosion was too fast. But if the craft were mounted vertically with the booster stage rather than tandem, and if the abort mode didn't require having a fully intact orbiter and external fuel tank that could perform aerobatic maneuvers at the edge of possibility, there would have at least been a chance. Likewise, such a vertical-stacked design would have completely eliminated the risk of anything like the Columbia disaster.

This isn't to minimize the real, compounding negligence in how the program was executed over the years. But the program was damned to begin with.


> "After showing the astronauts in orbit a video of the foam strike and discussing with them what they thought they knew, mission managers concluded that it was a non-issue and posed no threat to the crew's safe return.

> ...

> "Although the circumstances of the tragedy have been well documented, and Hale insists there was "never any debate about what to tell the crew", his revelation brings new insight to the mindset of some Nasa employees at the time."

The crew knew.


Here is what NASA emailed to the crew about the foam strike:

--- begin quote ---

You guys are doing a fantastic job staying on the timeline and accomplishing great science. Keep up the good work and let us know if there is anything that we can do better from an MCC/POCC standpoint.

There is one item that I would like to make you aware of for the upcoming PAO event on Blue FD 10 and for future PAO events later in the mission. This item is not even worth mentioning other than wanting to make sure that you are not surprised by it in a question from a reporter.

During ascent at approximately 80 seconds, photo analysis shows that some debris from the area of the -Y ET Bipod Attach Point came loose and subsequently impacted the orbiter left wing, in the area of transition from Chine to Main Wing, creating a shower of smaller particles. The impact appears to be totally on the lower surface and no particles are seen to traverse over the upper surface of the wing. Experts have reviewed the high speed photography and there is no concern for RCC or tile damage. We have seen this same phenomenon on several other flights and there is absolutely no concern for entry.

That is all for now. It's a pleasure working with you every day.

--- end quote ---

Now does it sound to you like the crew was fully informed, or like NASA was minimizing the issue?

https://spaceflightnow.com/shuttle/sts107/030630emails/


Actually there was a similar situation during Apollo 13 having to do with possible damage to the heat shield. They had no way to know if the heat shield had been damaged or not by the explosion, and even if they did find out they couldn't have fixed it.

My readings on the Columbia disaster have yielded similar results, unless someone can tell me of a way to repair the tiles on the shuttle. Launching Atlantis early was, as far as I know, determined to be possible only afterwards. At the time they had no idea it could be done, and honestly it sounds like a chance to risk 14 lives instead of 7. Or at least 8; I'm not aware of how many people were actually required to fly the shuttle.


If he actually said that, it’s fucking unconscionable. It’s not his decision to make (or shouldn’t be), it’s the crews’.

What kind of morality underlies such thinking?


Honestly, I could understand that line of thinking if he were talking about ordinary members of the public who are utterly unprepared for a life and death situation. (Not saying I agree with it, just that I understand it.)

But when we're talking about a crew of professionals who literally spend years training to deal with disasters, and who knew exactly what they signed up for when they got in that rocket, I have to squarely side with you. They at least deserved a chance to go down fighting.


> ...nine times faster than a fired bullet...

I don't think so.

A NASA publication [1] gives the SOFI piece impact velocity as about 800 feet/sec. Rifle bullets reach 2.5K - 3K fps muzzle velocity routinely.

[1] https://history.nasa.gov/columbia/Troxell/Columbia%20Web%20S...


I assume the author mistakenly took the orbital velocity instead of the delta-V at impact.

"Death by PowerPoint is a real thing. Sometimes literally."

So tacky. Treating human deaths as a punchline to a joke.


The authors of the slide either lacked the ability to analyze problems, or lacked the ability to communicate their opinions, or were intentionally obscuring their opinions.

Which of these is the more likely explanation?

I've seen experienced engineers incapable of expressing their thoughts clearly. But I've also seen well-established organizations that encourage hiding your opinions behind a wall of bullshit.


Have you considered the possibility that in some cases they are basically forbidden from expressing opinions? These guys were tasked to conduct and summarize their studies, not to opine on the decision to be made; that would violate separation of duties, and an expressed opinion leads to questions of diligence.

Yes, that's what I meant under "intentionally obscuring their opinions" (for example because of the organizational culture, etc).

Powerpoint was one of the problems.

A much worse problem was NASA's management stepping in to block multiple requests for imaging the orbiter.

https://en.wikipedia.org/wiki/Space_Shuttle_Columbia_disaste...


Kinda weird to even hint that the tool used to communicate would be the cause of the disaster. The same words could have been written in LaTeX and presented with a PostScript viewer running in Linux.

They very much could. In terms of language, the whole slide deck in question is essentially a scientific paper in PPT form, and scientific papers are typically written in LaTeX.

I heard one possible reason for underestimating the damage was the move of a NASA (or the contractor's?) lab/facility that dealt with this kind of issue, from California to a cheaper place such as Alabama/Florida(?).

The move was done as a cost-saving measure. Quite a few of the engineers chose not to relocate with the office. When the space shuttle launched and the image was being reviewed, many of the experienced engineers (who could possibly have predicted it correctly) were no longer with the team.

I remember reading the above bit years ago while reading about the incident.


"there is a huge amount of text, more than 100 words"

"PowerPoint briefing slides instead of technical papers"

Would technical papers have fewer than 100 words?

If people can't be bothered to read 13 slides, what would they do with a technical paper?

If people can't be bothered to make their message clear in slides, how would they create a readable technical paper?

I don't like presentations that consist of "monotonously reading [bullets] as we read along" either. But what does PowerPoint have to do with this? I've seen such presentations done with trendy web tools instead; they don't change anything. I've also read full technical papers written in LaTeX that didn't manage to get their point through the unclear writing (and vaguely misused technical jargon buzzwords).


> If people can't be bothered to make their message clear in slides, how would they create a readable technical paper?

Slides are not the same as, or replacements for, technical papers. Even by comparing them, you're making the same mistake as these engineers did. Compared to a paper, the downsides of a presentation are that it is typically constrained to just 30-60 minutes, in front of a group of people who may be checking their phones and only half-engaged. Also, by its very nature, the presenter is speaking, so it's difficult to simultaneously read the slides while listening to the presenter.

The upside of a presentation however, is that you can give it to many people at once, and solicit feedback in real-time.

Accordingly, presentations like this need to be far tighter than technical papers. Much more work needs to be invested in prioritizing the issues you discuss; they should play to the strengths of the form (e.g., utilizing the skills of the audience) and minimize the weaknesses (a short time-frame, and no monopoly on attention).

As a starter, you should never put a single word on a slide that you do not say out loud. There's nothing worse than putting a block of text on a screen and talking about it. The astute audience members will listen to you while reading the text, and in the process internalize neither. Most audience members will just have their eyes glaze over and then check their phones.

Sadly, engineers are often taught how to write technical papers, but not how to give effective technical presentations.


> you're making the same mistake

I assume you mean the "generic you", as it was the article quoting the NASA report that suggested the comparison.


I am not sure if the author and commenters here are specifically condemning Microsoft PowerPoint, or just computerized slide presentations, of which PowerPoint is by far the most common type. Because even if everyone used Apple's Keynote, they could still make a lot of the same mistakes. Even if they drew things out on clear plastic sheets, they could still make the same mistakes.

So is it really Powerpoint specifically? Maybe. I'm open to that possibility. But I'm more sympathetic to the idea that Powerpoint has enabled many more people incapable of creating quality presentations to deliver them anyway.


No, this article is not a criticism of PowerPoint specifically.

I seem to remember it differently: what killed seven people was senior management choosing to ignore warnings about foam debris strikes on the wing during launches. On this one, they knew foam had struck the wing; they chose not to have ground-based telescopes survey the shuttle, chose not to have the crew do an EVA, and chose not to have a rescue shuttle sent up. The title has a nice ring to it all the same.

Tufte makes a good point that the critical information could have been conveyed so much more effectively.

If only the managers had been given a better summary! But I think this is a vast over-simplification.

Even with a crystal-clear summary of the issues, it doesn't always add up to a clear disaster on your hands. Only in hindsight. The shuttle was an incredibly complex system and there were always issues to examine, to fix, to prioritize, to defer. There is just a lot to regularly weigh.


Not the person that made the PowerPoint? Not the team that reviewed it? Was this done by a 10-man team in some tiny start-up? I mean seriously.

Exactly. This article is like one of those BusinessInsider click-baits.

Ironic, the woman responsible for the catastrophe: "...rather than spending the day just listening to keynotes..." https://www.youtube.com/watch?v=E2Pruxom9-8

> "It was impossible to tell how much damage this foam had caused hitting the wing nine times faster than a fired bullet."

This figure is GROSSLY inaccurate.

From: https://history.nasa.gov/columbia/Troxell/Columbia%20Web%20S...

>Eighty-two seconds into STS 107, a sizeable piece of debris struck the left wing of the Columbia. Visual evidence and other sensor data established that the debris came from the bipod ramp area and impacted the wing on the wing leading edge. At this time Columbia was traveling at a speed of about 2300 feet/second (fps) through an altitude of about 65,900 feet. Based on a combination of image analysis and advanced computational methods, the Board determined that a foam projectile with a total weight of 1.67 lb and impact velocity of 775 fps would best represent the debris strike.

So somewhere between 775fps and 2300 fps. For reference, slow and heavy 45 ACP bullets start at around 800fps and up. 7.62x39mm (AK-47) bullets are in the neighborhood of 2300fps. The shuttle was moving as fast as a moderately fast rifle bullet, and the foam likely hit at much less than that; something probably a bit under a subsonic pistol bullet.
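
If you want to check the unit conversions yourself, it's plain arithmetic on the figures from the NASA publication above:

```python
FPS_TO_MS = 0.3048  # feet per second -> meters per second

for label, fps in [("foam impact (CAIB best estimate)", 775),
                   ("shuttle airspeed at the strike", 2300)]:
    ms = fps * FPS_TO_MS
    print(f"{label}: {fps} ft/s = {ms:.0f} m/s = {ms * 3.6:.0f} km/h")

# foam impact (CAIB best estimate): 775 ft/s = 236 m/s = 850 km/h
# shuttle airspeed at the strike: 2300 ft/s = 701 m/s = 2524 km/h
```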

The author is incorrectly assuming that the foam hit the shuttle at orbital velocity, which obviously couldn't be the case because the shuttle was nowhere even close to orbital velocity at the time.


So do they still use PowerPoint like this?

I thought this was going to be a more direct kill, as in maybe a slide that flashed bright colors very quickly and killed people who got seizures from it.

TL;DR: Engineers had found that there was a risk of foam detachment; NASA managers didn't consider that risk significant enough to halt the mission.

In retrospect, it's easy to blame decision makers, but here's the thing: if I told you that your risk of dying was 1 in 103 if you drove your car today, would you still drive? The relative risk of a fatal accident in the Space Shuttle program was 1 in 62.


What a surprise that it was from Boeing. I don't like them at all.


