Don Norman has written an outstanding article for Fast Company about the false missile alert that caused panic in Hawaii over the weekend. In the aftermath of the incident, we heard that "human error" caused the false alert to be transmitted widely to citizens of the state. Norman challenges that initial conclusion. He writes:
When some error occurs, it is commonplace to look for the reason. In serious cases, a committee is formed which more or less thoroughly tries to determine the cause. Eventually, it will be discovered that a person did something wrong. “Hah,” says the investigation committee. “Human error. Increase the training. Punish the guilty person.” Everyone feels good. The public is reassured. An innocent person is punished, and the real problem remains unfixed. The correct response is for the committee to ask, “What caused the human error? How could that have been prevented?” Find the root cause and then cure that. To me, the most frustrating aspect of these errors is that they result from poor design. Incompetent design. Worse, for decades we have known how proper, human-centered design can prevent them.
Norman points out several egregious design flaws in this alert system. First, why was no confirmation required before the alert was sent? Ideally, he notes, the confirmation should come from a second person working independently of the person who selected the alert message. Second, when the system is operating in test mode, every message should begin with a clear indication that it is only a test. That indication should be in bold! It should be capitalized! It should be crystal clear! Finally, the system should be designed to enable immediate correction; the long delay in retracting the false alert was preventable with better design.
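To make these three safeguards concrete, here is a minimal Python sketch of what a safer dispatch workflow could look like. This is purely illustrative, not how Hawaii's actual system works; every name in it (AlertSystem, Mode, confirm, retract, and so on) is hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional


class Mode(Enum):
    TEST = "TEST"
    LIVE = "LIVE"


@dataclass
class Alert:
    message: str
    mode: Mode
    drafted_by: str
    confirmed_by: Optional[str] = None


class AlertSystem:
    """Hypothetical dispatcher illustrating Norman's three safeguards."""

    def __init__(self, broadcast: Callable[[str], None]):
        self._broadcast = broadcast          # whatever actually transmits the message
        self._last_sent: Optional[Alert] = None

    def confirm(self, alert: Alert, confirmer: str) -> None:
        # Safeguard 1: confirmation must come from a second, independent person.
        if confirmer == alert.drafted_by:
            raise PermissionError("Confirmation must come from someone other than the drafter.")
        alert.confirmed_by = confirmer

    def send(self, alert: Alert) -> None:
        if alert.confirmed_by is None:
            raise RuntimeError("Alert not confirmed by a second person; refusing to send.")
        text = alert.message
        # Safeguard 2: test messages are unmistakably labeled as tests.
        if alert.mode is Mode.TEST:
            text = f"*** THIS IS ONLY A TEST *** {text} *** THIS IS ONLY A TEST ***"
        self._broadcast(text)
        self._last_sent = alert

    def retract(self, operator: str) -> None:
        # Safeguard 3: a one-step, immediate correction path.
        if self._last_sent is not None:
            self._broadcast(f"CORRECTION: please disregard the previous alert. ({operator})")


# Example: the TEST banner and the retraction path cost one call each.
system = AlertSystem(broadcast=print)
alert = Alert("Ballistic missile threat inbound.", Mode.TEST, drafted_by="operator_a")
system.confirm(alert, confirmer="operator_b")
system.send(alert)            # goes out wrapped in TEST banners
system.retract("supervisor")  # immediate correction, no committee required
```

The point of the sketch is that none of these safeguards is exotic: each is a few lines of checking in the dispatch path, which is exactly why Norman calls their absence a design failure rather than a human one.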
In sum, you can look at any failure in two contrasting ways. You can examine it individualistically: it was human error. Or you can examine it systemically: what systems, procedures, and situational factors contributed to the poor actions or decisions? The systemic approach is far more likely to lead to learning, improvement, and future accident prevention, as we have shown in our own research on tragic accidents such as the Space Shuttle Columbia disaster.
1 comment:
Exactly. Processes and procedures should be in place to minimize opportunities for human error. Dr. Deming demonstrated this years ago.