A friend of mine from ITLP posted a great item about managing risk, in response to an op-ed essay, Drilling for Certainty, by David Brooks in the New York Times. Since we often discuss balancing risk with the need for change, I thought I'd repost Tom Holub's comments here:
Brooks makes six important points after observing "Humans are not great at measuring and responding to risk when placed in situations too complicated to understand."
As I read through this editorial, my thinking was challenged by this list as well as by Brooks' quote from Malcolm Gladwell's 1996 New Yorker essay: "We have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life." So, in the words Brooks uses to close the essay, "It's a challenge for people living in an imponderably complex technical society."
The haunting question for me is simply: What are we going to do about it?
This article mentions Richard Feynman's scathing report on risk assessment related to the Challenger disaster; it certainly bears reading if you haven't seen it already:
It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures [1:100] come from the working engineers, and the very low figures [1:100,000] from management. What are the causes and consequences of this lack of agreement? Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask "What is the cause of management's fantastic faith in the machinery?"

Of course, Feynman's suggestion that the failure rate would be closer to 1 in 100 than 1 in 100,000 now seems correct; the Columbia disaster was 78 missions after Challenger, and various others have had dangerous complications.
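Feynman's arithmetic is easy to check. Here's a minimal back-of-the-envelope sketch, assuming each launch is an independent trial with a fixed failure probability (a simplification, not a real reliability model): a launch a day for 300 years is roughly 109,500 flights, and the Shuttle program ultimately flew about 135 missions.

```python
# Rough check of Feynman's numbers.
# Assumption: each launch is an independent trial with a fixed failure
# probability -- a deliberate simplification, not a reliability model.

daily_launches_300_years = 300 * 365   # ~109,500 hypothetical flights
actual_shuttle_flights = 135           # total missions the program flew

for p in (1 / 100_000, 1 / 100):
    expected_300y = daily_launches_300_years * p
    expected_program = actual_shuttle_flights * p
    print(f"failure rate 1 in {int(1 / p):,}: "
          f"~{expected_300y:.1f} expected losses over 300 years of daily flights, "
          f"~{expected_program:.2f} expected losses over {actual_shuttle_flights} actual flights")
```

At 1 in 100,000 you'd expect about one loss even flying daily for 300 years; at 1 in 100 you'd expect on the order of one to two losses over the flights actually flown. The program lost two orbiters.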
We have also found that certification criteria used in Flight Readiness Reviews often develop a gradually decreasing strictness. The argument that the same risk was flown before without failure is often accepted as an argument for the safety of accepting it again. Because of this, obvious weaknesses are accepted again and again, sometimes without a sufficiently serious attempt to remedy them, or to delay a flight because of their continued presence.
I suppose the leadership lesson is that the courageous action may be not to soldier forward against all odds, but to advocate for caution when the risks have not been properly assessed. On the other hand, change-averse folks often use the existence of unquantified risks to argue for the status quo, even though the status quo frequently carries high levels of known risk of its own.