Re:Engineers bullied or bamboozled into acquiescen (Score 1)
You should read the reports, particularly Feynman's. The technical people and the management were on diverging paths of bamboozlement.
The management, yes, overrode some of the technical people who said the risk was too high. They estimated the risk as tiny and represented it as tinier still, as if the numbers were only meant to impress rather than to express an actual mathematical quantity.
But they did it because some of the technical people said the risk was not too high. Those engineers deluded themselves into believing it was OK to accept a higher risk because previous trials had not caused a failure: the same syndrome that leads a roulette player to go bust just because he won a few times in a row. True, not all the techs fell prey to it, but enough did that management got the impression that conflicting opinions of the risk existed (everyone on the technical side knew the overall risk was around 1-2 percent; they differed over whether it was acceptable).

When tests failed, engineers raised the "success criterion" to match their data, on the grounds that the failure in that particular trial was not catastrophic, without understanding the root cause of the poor performance or the real odds of a catastrophic failure. As a result, components passed their tests in a condition that would initially have caused a mission scrub because the aggregate risk was too high, and went into service without an assessment of the true risk. The foam thing was another example of "it hasn't killed anyone yet, ignore it".
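To put rough numbers on that self-delusion, here's a back-of-the-envelope sketch in Python. It assumes independent flights, uses the 1-2 percent per-flight figure quoted above, and the flight counts are purely illustrative, not a claim about the actual shuttle manifest:

    # How likely is an unbroken run of successes even when the
    # per-flight failure risk really is 1-2 percent?
    for p in (0.01, 0.02):
        for n in (10, 24):
            print(f"risk {p:.0%} per flight, {n} flights in a row: "
                  f"chance of a clean record = {(1 - p) ** n:.0%}")

The answer comes out in the 60-80 percent range, so a clean record of a couple dozen flights is exactly what you'd expect even if the real risk were 1-2 percent per flight. The streak proves almost nothing.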
After a history of doing things this way, and a history of good luck, it became easier for management to ignore the warnings of higher risk and push forward. The message to take home is to treat every risk seriously and to recognize that a 1% chance of failure is not small: it means that you *will* fail one day if you keep trying.
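The arithmetic behind that last sentence, again just a sketch assuming an independent 1% risk on each attempt:

    # Chance of at least one failure after n attempts at 1% risk each.
    p = 0.01
    for n in (10, 50, 100, 200):
        print(f"{n} attempts: chance of at least one failure = "
              f"{1 - (1 - p) ** n:.0%}")

That prints roughly 10%, 39%, 63%, and 87%. By 100 attempts you are already more likely than not to have had the failure; the only open question is when.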