Monday, December 3, 2018

Nuclear Safety Culture: Lessons from Factfulness by Hans Rosling

This book* is about biases that prevent us from making fact-based decisions.  It is based on the author’s world-wide work as a doctor and public health researcher.  We saw it on Bill Gates’ 2018 summer reading list.

Rosling discusses ten instincts (or reasons) why our individual worldviews (or mental models) are systematically wrong and prevent us from seeing situations as they truly are and making fact-based decisions about them.

Rosling mostly addresses global issues but the same instincts can affect our approach to work-related decision making from the enterprise level down to the individual.  We briefly discuss each instinct and highlight how it may hinder us from making good decisions during everyday work and one-off investigations.

The gap instinct

This is “that irresistible temptation we have to divide all kinds of things into two distinct and often conflicting groups, with an imagined gap—a huge chasm of injustice—in between.” (p. 26)  This is reinforced by our “strong dramatic instinct toward binary thinking . . .” (p. 42)  The gap instinct can apply to our thinking about safety, e.g., in the Safety I mental model there is acceptable performance and intolerable performance, with no middle ground and no normal transitions back and forth.  Rosling notes that usually there is no clear cleavage between two groups, even if it seems like that from the averages.  We saw this in Dekker's analysis of health provider data (reviewed Oct. 29, 2018) where both favorable and unfavorable patient outcomes exhibited the same negative work process traits.

The negativity instinct

This is “our tendency to notice the bad more than the good.” (p. 51)  We do not perceive improvements that are “too slow, too fragmented, or too small one-by-one to ever qualify as news.” (p. 54)  “There are three things going on here: the misremembering of the past [erroneously glorifying the “good old days”]; selective reporting by journalists and activists; and the feeling that as long as things are bad it’s heartless to say they are getting better.” (p. 70)  Truth be told, we don’t see much of this instinct inside the nuclear world, where facilities with long-standing cultural problems (i.e., bad) constantly report progress (i.e., getting better) while their cultural conditions remain unacceptable.

The straight line instinct

This is the expectation that a line of data will continue straight into the future.  Most of you have technical training or exposure and know that accurate extrapolations can take many shapes, including straight lines, S-bends, asymptotes, humps, or exponential growth.

The fear instinct

“[F]ears are hardwired deep in our brains for obvious evolutionary reasons.” (p. 105)  “The media cannot resist tapping into our fear instinct. It is such an easy way to grab our attention.” (p. 106)  Rosling observes that hundreds of elderly people who fled Fukushima to escape radiation ended up dying “because of the mental and physical stresses of the evacuation itself or of life in evacuation shelters.” (p. 114)  In other words, they fled something frightening (a perceived risk) and ended up in danger (a real risk).  How often does fear, e.g., fear of bad press, enter into your organization’s decision making?

The size instinct

We overweight things that look big to us.  “It is instinctive to look at a lonely number and misjudge its importance.  It is also instinctive . . . to misjudge the importance of a single instance or an identifiable victim.” (p. 125)  Does the nuclear industry overreact to some single instances?

The generalization instinct

“[T]he generalization instinct makes ‘us’ think of ‘them’ as all the same.” (p. 140)  At the macro level, this is where the bad “isms” exist: racism, sexism, ageism, classism, etc.  But your coworkers may practice generalization on a more subtle, micro level.  How many people do you work with who think the root cause of most incidents is human error?  Or somewhat more generously, human error, inadequate procedures and/or equipment malfunctions—but not the larger socio-technical system?  Do people jump to conclusions based on an inadequate or incorrect categorization of a problem?  Are categories, rather than facts, used as explanations?  Are vivid examples used to over-glamorize alleged progress or over-dramatize poor outcomes?

The destiny instinct

“The destiny instinct is the idea that innate characteristics determine the destinies of people, countries, religions, or cultures.” (p. 158)  Culture includes deep-seated beliefs, where feelings can be disguised as facts.  Does your work culture assume that some people are naturally bad apples?

The single perspective instinct

This is a preference for single causes and single solutions.  It is the fundamental weakness of Safety I, where the underlying attitude is that problems arise from individuals who need to be better controlled.  Rosling advises us to “Beware of simple ideas and simple solutions. . . . Welcome complexity.” (p. 189)  We agree.

The blame instinct

“The blame instinct is the instinct to find a clear, simple reason for why something bad has happened. . . . when things go wrong, it must be because of some bad individual with bad intentions. . . . This undermines our ability to solve the problem, or prevent it from happening again, . . . To understand most of the world’s significant problems we have to look beyond a guilty individual and to the system.” (p. 192)  “Look for causes, not villains. When something goes wrong don’t look for an individual or a group to blame. Accept that bad things can happen without anyone intending them to.  Instead spend your energy on understanding the multiple interacting causes, or system, that created the situation.  Look for systems, not heroes.” (p. 204)  We totally agree with Rosling’s endorsement of a systems approach.

The urgency instinct

“The call to action makes you think less critically, decide more quickly, and act now.” (p. 209)  In a true emergency, people will fall back on their training (if any) and hope for the best.  However, in most situations, you should seek more information.  Beware of data that is relevant but inaccurate, or accurate but irrelevant.  Be wary of predictions that fail to acknowledge that the future is uncertain.

Our Perspective

The series of decisions an organization makes is a visible artifact of its culture, and its decision-making process internalizes that culture.  Because of this linkage, we have long been interested in how organizations and individuals can make better decisions, where “better” means fact- and reality-based and consistent with the organization’s mission and espoused values.

We have reviewed many works that deal with decision making.  This book adds value because it is based on the author’s research and observations around the world; it is not based on controlled studies in a laboratory or observations in a single organization.  It uses very good graphics to illustrate various data sets, including changes, e.g., progress, over time.

Rosling believed “it has never been easier or more important for business leaders and employees to act on a fact-based worldview.” (p. 228)  His book is engagingly written and easy to read.  It is Rosling’s swan song; he died in 2017.

Bottom line: Rosling advocates for robust decision making, accurate mental models, and a systems approach.  We like it.


*  H. Rosling, O. Rosling and A.R. Rönnlund, Factfulness, 1st ed. ebook (New York: Flatiron, 2018).
