I'm not saying NASA didn't fuck up by any means. I'd just like to point out that hindsight bias is a very real effect that can have huge consequences when planning for future events: http://lesswrong.com/lw/il/hindsight_bias/
>Viewing history through the lens of hindsight, we vastly underestimate the cost of effective safety precautions. In 1986, the Challenger exploded for reasons traced to an O-ring losing flexibility at low temperature. There were warning signs of a problem with the O-rings. But preventing the Challenger disaster would have required, not attending to the problem with the O-rings, but attending to every warning sign which seemed as severe as the O-ring problem, without benefit of hindsight. It could have been done, but it would have required a general policy much more expensive than just fixing the O-Rings.
>Shortly after September 11th 2001, I thought to myself, and now someone will turn up minor intelligence warnings of something-or-other, and then the hindsight will begin. Yes, I'm sure they had some minor warnings of an al Qaeda plot, but they probably also had minor warnings of mafia activity, nuclear material for sale, and an invasion from Mars.
>Because we don't see the cost of a general policy, we learn overly specific lessons. After September 11th, the FAA prohibited box-cutters on airplanes—as if the problem had been the failure to take this particular "obvious" precaution. We don't learn the general lesson: the cost of effective caution is very high because you must attend to problems that are not as obvious now as past problems seem in hindsight.
The letter demonstrates foresight, not hindsight; to make the point that one would be unlikely to single out this particular letter without the benefit of hindsight, you would need a very long list of equally severe and plausible warnings.
This is the crux. This letter is clearly very ballsy in a corporate context; I doubt there is a stack of similar memos in the Morton Thiokol archives which, if acted upon as a group, would have bogged down the launch. But at a minimum, making this argument requires at least one similar example.
Without that alternate case, languorous management is the most likely culprit.
Yes, there is a serious gap in logic in the linked article. In fact, it appears that if the O-rings had been properly checked and improved, as the letter indicated they must be, the Challenger would not have exploded as it did.
What were the other warning signs which seemed as severe as the O-ring problem without the benefit of hindsight?
Your quote claims that it would have been much more expensive to address all of these things, but what's the support for that statement?
My (extremely limited, to be sure) understanding is that there wasn't anything else that severe. The O-ring problem was dire, and the only reason it didn't get attention was because schedule pressure and go-fever caused management to minimize the problem.
Note that the warning signs mentioned in the quote were not just theoretical test results, but also several cases where fire got where it was not supposed to be during actual flights. Further, there was strong evidence that these incidents were linked to cold weather. "Don't fly when it's really cold" wouldn't have been a particularly expensive mitigation, either, even if you argue that suspending flights until the problem was investigated and fixed was not reasonable. I mean, you're launching from Florida; a restriction not to launch in temperatures under 50F wouldn't have been a terrible burden.
How many other warning signs were as severe as the O-rings and as easy to mitigate? My money is on "zero", in which case this argument is just so much noise.
> My (extremely limited, to be sure) understanding is that there wasn't anything else that severe.
That's my understanding as well. Boisjoly wrote a detailed essay describing the thought process he and his colleagues went through to arrive at their conclusion that the O-rings were a critical problem (the other three authors are a professor and two students at RIT):
The essay is a response to a criticism by Edward Tufte that the engineers did not present the issue properly during the telecon the night before the launch, and that this was the reason that NASA did not agree to scrub the launch. It makes clear that, at least in the minds of the engineers, there was no other issue even close to being this critical. Granted, they were concerned with only one subsystem, the SRBs--but the fact that this issue also dominated the telecon the night before the launch indicates that there was no other issue even close to being that critical in any other subsystem either.
[Edit: It's interesting, btw, that in the essay linked to above, Boisjoly accuses Tufte of precisely the thing Yudkowsky is talking about in his article: hindsight bias. He argues that Tufte did not recognize or allow for the fact that the engineers at the time had limited information, and instead writes as though they knew everything that was known only in hindsight.]
This is a good read.
I do think the Tufte essay is excellent except for one detail. I don't think it's fair that he places blame on the engineers for failing to convince NASA management to stop the launch. It is possible that better designed slides would have helped NASA management grasp the problem. But I think the fault is more so on them.
The premise of "failing to convince management" relies on the notion that "management can do no wrong". For example, if you tell your boss that we're about to drive the bus off a cliff, and he doesn't listen, it's your fault.
This is the way things work in many organizations, it's an old-school approach to management, but to me it is purely dysfunctional. When someone tells me I'm about to drive off a cliff, I say, "Wow, thanks! What did I fail to see that got us into this mess in the first place?"
Aside from Tufte's notion of blame, I think his essay is very instructive.
> It is possible that better designed slides would have helped NASA management grasp the problem. But I think the fault is more so on them.
I completely agree, and AFAIK the Tufte essay does not even mention a critical point in this regard, which the Boisjoly paper does. The engineers had already tried to get all flights stopped until the O-ring issue could be investigated and properly understood, in the August before the Challenger flight. NASA had refused. So the NASA managers were already aware that there was a critical flight risk, and they had already chosen to ignore it. That means it definitely wasn't a case of bringing new information to management's attention in order to drive a decision. It was a case of trying to get them to change, at least in part, a decision they had already made.
The argument about cold temperatures has to be viewed in that light--it was an attempt to find something, in the absence of good, hard data and a solid understanding of what was going on, that would at least get NASA to delay some flights, since they had already refused to delay all flights. In fact, considered solely on engineering grounds, the argument about cold temperatures was fairly weak (as Boisjoly points out in his essay). But it was weak not because cold temperature flights were almost as safe as warm temperature flights; it was weak because warm temperature flights were almost as unsafe as cold temperature flights! (A previous flight with significant blow-by had been made at a temperature of 75 F, and test stand data showed that the O-rings were not sealing completely at any temperature below 100 F.) But the engineers couldn't say that the night before the Challenger flew, because they had already said it back in August and had been ignored.
> I do think the Tufte essay is excellent except for one detail.
I don't think that one detail is the only serious flaw in the essay. As far as I can tell, Boisjoly's criticisms--that Tufte misunderstood the actual issue (it was blow-by, not erosion), and that he misunderstood what information the engineers did and did not have (for example, they didn't have reliable temperature data for many flights)--are valid.
> "Don't fly when it's really cold" wouldn't have been a particularly expensive mitigation, either
The SRBs were not designed to work below 40F, so it would have been very reasonable to postpone launch (yet again). However, Reagan was going to give his State of the Union speech on that same evening, and putting a teacher in space was supposed to be a highlight of the speech...
I think your comment is great, but I have to disagree with one part in particular. I don't agree that it was just "some minor warning" when President Clinton told President Bush explicitly: "In his campaign, Bush had said he thought the biggest security issue was Iraq and a national missile defense. I told him that in my opinion, the biggest security problem was Osama bin Laden."
"Osama Bin Laden is dangerous" is pretty far from actionable intelligence, though. I think it's an example of hindsight bias to think that if President Bush had been more prudent in dealing with Bin Laden then the 9/11 attacks would have been prevented.
When the current Commander in Chief of the United States of America, head of the most powerful military in the world, says "the biggest security problem [is] Osama bin Laden," that's something entirely different.
And frankly, I trust the opinion of Richard Clarke more than I trust yours OR MINE.
To be fair, we spent most of the effort fighting in the wrong country (Iraq), chasing the wrong person (Saddam Hussein), who had nothing to do with orchestrating the attacks and had no viable "weapons of mass destruction", the purported reason for going after him in the first place.
That was 2 years later and had nothing to do with the search for bin Laden, for whom they mounted a huge search effort and kept it up for the next 10 years.
Personally, I'd be concerned about capturing someone who was responsible for ordering the terrorist attacks that killed thousands of people, but I guess everyone's different.
It's not black and white. If President Bush had been more prudent in dealing with Bin Laden, then the 9/11 attacks would certainly have been less likely to succeed.
To believe otherwise is to believe, essentially, in predetermination. Hindsight bias does not imply that things would have turned out the same today, no matter what people did in the past.
Right, what I should have written was "I think it's an example of hindsight bias to think that if President Bush had been more prudent in dealing with Bin Laden then the 9/11 attacks would certainly have been prevented."