> I had to look over pictures of parts broken from a crash and have the potential feeling of 'what-if that's my calculation gone wrong'.
Does it inevitably come down to that for someone? I mean, even if it's a detail that a procedure couldn't have caught, someone is responsible for forming good procedures. I suppose there could be several factors. But it seems like ultimately someone is going to be pretty directly responsible.
Just interesting to think about in the context of software engineering and kinda even society at large where an individual’s mistakes tend to get attributed to the group.
"But it seems like ultimately someone is going to be pretty directly responsible."
Or many people, or no one directly. Space missions come with calculated risk. So someone calculates that the risk of this critical part breaking is 0.5%, someone higher up says that's acceptable, and everyone moves on. And then this part indeed breaks and people die.
Who is to blame when the calculation was indeed correct, but 0.5% chances can still happen (and over enough missions, that's a lot)? And economic pressures are real, just like the limits of physics.
See Murphy's Law: "Anything that can go wrong will go wrong." (Eventually, if done again and again.)
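To make the "done again and again" point concrete: assuming independent missions, a per-mission failure probability of 0.5% compounds quickly. A minimal sketch (the 0.5% figure comes from the hypothetical above; it's not from any mission record):

```python
def cumulative_failure(n_missions: int, p: float = 0.005) -> float:
    """Probability of at least one failure across n independent missions,
    each with per-mission failure probability p (here the hypothetical 0.5%)."""
    return 1.0 - (1.0 - p) ** n_missions

# Over a long program, a "small" 0.5% risk becomes close to a coin flip:
for n in (1, 10, 100):
    print(f"{n:3d} missions -> {cumulative_failure(n):.1%} chance of at least one failure")
```

At 100 missions the chance of at least one failure is already around 40%, which is why "the calculation was correct" and "people died" can both be true.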
Astronauts know there is a risk with every mission, as do the engineers, as does management. Still, I cannot imagine why anyone thought it was an acceptable risk to use a 100% oxygen atmosphere on Apollo 1, where three astronauts died in a fire. But that incident indeed changed a lot regarding safety procedures and thinking about safety. Still, some risks remain, and you have to live with that.
I am quite happy though, that in my line of work, the worst that can happen is a browser crash.
Even after the fire, the Apollo spacecraft still used 100% oxygen when in space. The cabin was 60% oxygen / 40% nitrogen at 14.7 psi at launch, reducing to 5 psi on ascent by venting, with the nitrogen then being purged and replaced with 100% oxygen.
Wow, this detail I did not know yet. It was just a reckless rush to the Moon at the time, no matter the cost. Without the deaths, probably nothing would have changed.
For what it's worth, I strongly disagree -- the group as a whole (and especially its leadership) is responsible for the policies it decides to institute and the incentives it allows to exist. For example, in this article's story the author is apparently working >80 hour weeks directly manipulating the $500M spacecraft two weeks before it launches. Do we really think they are "directly responsible" for the described mistake? I think a root cause analysis that placed responsibility on any individual's actions would simply be incorrect -- and worse, would be entirely unconstructive at actually preventing recurrence of similar accidents.
Furthermore, I think this is almost always true of RCAs, which is why blameless post-mortems exist. It's not just to avoid hurting someone's feelings.