Bad Astronomy

Apollo 1 redux: the inevitability of disaster

My friend Jim Oberg has written an article about the Apollo 1 tragedy. He is a space historian, and he knows his stuff. His angle on this was 1) the loss of these people and their missions was due to preventable mistakes, including arrogance at key places in NASA administration, and 2) we need to learn from these mistakes; not about hardware or safety, but about the roles of human beings in decision-making when it comes to spaceflight.

This is very different from the way I wrote my own Apollo 1 article. But I think we both have a point.

When I wrote the essay, I did not mean to exonerate the people who made mistakes. Jim is right; the disasters of Apollo 1, Challenger, and Columbia are all the more tragic because they were preventable. I think all disasters are preventable, and that in every case it is probable that if only someone hadn't been careless, forgetful, greedy, ambitious, or foolish, then lives would have been saved. In these three cases, I agree with Jim, especially in the case of the Shuttles. People in positions to prevent these disasters were told what the problems were, and forged ahead anyway.

Complacency kills just as much as a holed Shuttle wing, or a frozen O-ring.

As Jim wrote about current NASA workers:

They need the consequent inescapable ache of fear and the gnawing of doubt that keeps asking, over and over, if they’ve covered all angles and done all they can. And if their stomachs do not knot up, and mouths go dry, as they confront such decisions – perhaps they need new jobs.

Again, people will die as we explore space. Sometimes these deaths will be due to human error, human stupidity, human weakness. Sometimes things will just happen – Nature is just that way.

We must learn from these mistakes and do what we can to minimize them. No human, or even team of humans, can possibly avoid every potential mistake. What we need, and what Jim advocates, is a system to prevent the preventable mistakes. That may sound like a tautology, but it isn't. If a disaster happens, and people have done all they can to prevent it, that is what it's like to explore. But we also need to make sure that human fallibility is not at the root of the problem.