As a scholar who has examined a number of catastrophic failures, I have been tracking the BP oil spill with great interest. I hope to be able to write an in-depth case study about the failure. With details still coming out about the events leading up to the catastrophe, I'm not yet ready to draw any conclusions. However, I can offer a review of a few of the major conclusions from my research, as well as others' work, on prior catastrophic failures in a broad range of fields stretching from space travel to health care.
1. Catastrophic failures generally do not have a single root cause. They are typically the result of a chain of errors, mistakes, and small failures.
2. People and organizations often downplay ambiguous threats, i.e., warning signs, that crop up in the days, weeks, and months prior to the catastrophe.
3. Organizations often have cultures that don't promote sufficient candor and open dialogue. Thus, people with knowledge about critical risks may not speak up about their concerns regarding a potential failure.
4. People who raise intuitive concerns about certain risks are sometimes dismissed because they lack extensive data to support those concerns.
5. Organizations often overestimate how much human and system redundancy they have in place to protect them from catastrophe.
6. People often underestimate the probability of what they perceive to be extremely low probability events.
7. Cognitive biases often distort managerial judgments, contributing to catastrophe.