Thursday, November 20, 2014

The McKinsey Quarterly at 50 Years

The Quarterly’s mission is to help define the senior management agenda; this anniversary issue* is focused on McKinsey’s vision for the future of management. (p. 1)  The issue is organized around several themes (strategy, productivity, etc.) but we’re interested in how it addresses culture.  The word appears in several articles, but usually in passing or in a way not readily applied to nuclear safety culture.  There were, however, a few interesting tidbits.  

One article focused on artificial intelligence as a sweeping technological change with exponential impacts on business.  An interviewee opined that the current senior management culture, based on domain expertise, will need to give way to one that is data-driven.  “[D]ata expertise is at least as important [as domain expertise] and will become exponentially more important.  So this is the trick.  Data will tell you what’s really going on, whereas domain expertise will always bias you toward the status quo, and that makes it very hard to keep up with these disruptions.” (p. 73)  Does the culture of the nuclear industry ignore or undervalue disruptions of all types because they may threaten the status quo?

McKinsey’s former managing director listed several keys to corporate longevity, including “creating a culture of dissatisfaction with current performance, however good” and “focus[ing] relentlessly on values . . . A company’s values are judged by actions and behavior, not words and mission statements.” (pp. 121-22)  The first point reinforces the concept of a learning organization; the second reinforces the belief that behavior, e.g., the series of decisions made in an organization, is culture-in-action.  Any design for a strong safety culture should consider both.

Lou Gerstner (the man who saved IBM) also had something to say about values in action: “The rewards system is a powerful driver of behavior and therefore culture. Teamwork is hard to cultivate in a world where employees are paid solely on their individual performance.” (p. 126)  We have long argued that executive compensation schemes that pay more for production or cost control than for safety send an accurate, although inappropriate, signal of what’s really important throughout the organization.

Finally, management guru Tom Peters had some comments about leadership.  “If you take a leadership job, you do people.  Period.  It’s what you do. It’s what you’re paid to do.  People, period.  Should you have a great strategy?  Yes, you should.  How do you get a great strategy?  By finding the world’s greatest strategist, not by being the world’s greatest strategist.  You do people.  Not my fault.  You chose it.  And if you don’t get off on it, do the world a favor and get the hell out before dawn, preferably without a gilded parachute.  But if you want the gilded parachute, it’s worth it to get rid of you.” (p. 93)  Too simplistic?  Probably, but the point that senior managers have to spend significant time identifying, developing and keeping the most qualified people is well-taken.

Our Perspective

None of this is groundbreaking news.  But in a world awash in technology innovations and “big data” it’s interesting that one of the world’s foremost management consulting practices still sees a major role for culture in management’s future.


*  McKinsey Quarterly, no. 3 (2014).

Monday, November 17, 2014

NRC Chairman Macfarlane's Speech to the National Press Club

As you know, Chairman Allison Macfarlane will be leaving the NRC and starting a new academic job in January.  Today she made a relatively lengthy speech* reviewing her tenure at NRC.  Her remarks touched on all the NRC’s major work areas, including the following comments on safety culture (SC).

In the area of current plant performance, she expressed a concern that the lowest performing plants seem to stay in that category for extended periods rather than fixing their problems and moving on.  She said, “Poor management is easy to spot from the lack of safety culture and other persistent problems at plants.  I believe that solid leadership from the top – and not just attention to the bottom line – is necessary to ensure consistent plant performance.” (p. 5)  While we believe leadership is a necessary (but not sufficient) condition for success, her general observation is similar to what we saw back in the “problem plant” era of the 1990s.  A significant difference is that there are far fewer plants in trouble these days.

Under new plant construction she observed that “today’s component manufacturers have had to adjust their safety culture practices to accommodate the rigorous, often unique, requirements presented by nuclear construction.  Some parts of the industry continue to struggle with these issues.” (p. 5)

At the NRC’s Regulatory Information Conference (RIC) back in March, three entities (two plants, one contractor) that have been in trouble because of SC issues made presentations detailing their problems and corrective actions.  We reviewed their RIC presentations on April 25, 2014.

Our Perspective

As a matter of course, speeches like this emphasize the positive and the progress made, but it is interesting to note how many activities the NRC has its fingers in.  It’s worth flipping through the pages just to reinforce that perspective.


*  Prepared Remarks of Chairman Allison M. Macfarlane, National Press Club, Washington, DC (Nov. 17, 2014).

Monday, November 3, 2014

A Life In Error by James Reason

Most of us associate psychologist James Reason with the “Swiss Cheese Model” of defense in depth or possibly the notion of a “just culture.”  But his career has been more than two ideas; he has literally spent his professional life studying errors, their causes and contexts.  A Life In Error* is an academic memoir, recounting his study of errors starting with the individual and ending up with the organization (the “system”) including its safety culture (SC).  This post summarizes relevant portions of the book and provides our perspective.  It is going to read like a sub-titled movie on fast-forward, but there are a lot of particulars packed into this short (124 pgs.) book.

Slips and Mistakes 

People make plans and take action; consequences follow.  Errors occur when the intended goals are not achieved.  The plan may be adequate but the execution faulty because of slips (absent-mindedness) or trips (clumsy actions).  A plan that was inadequate to begin with is a mistake, which is usually more subtle than a slip and may go undetected for long periods of time if no obviously bad consequences occur. (pp. 10-12)  A mistake arises from higher-level mental activity than a slip.  Both slips and mistakes can take “strong but wrong” forms, where schema** that were effective in prior situations are selected even though they are not appropriate in the current situation.

Absent-minded slips can occur from misapplied competence where a planned routine is sidetracked into an unplanned one.  Such diversions can occur, for instance, when one’s attention is unexpectedly diverted just as one reaches a decision point and multiple schema are both available and actively vying to be chosen. (pp. 21-25)  Reason’s recipe for absent-minded errors is one part cognitive under-specification, e.g., insufficient knowledge, and one part the existence of an inappropriate response primed by prior, recent use and the situational conditions. (p. 49) 

Planning Biases 

The planning activity is subject to multiple biases.  An individual planner’s database may be incomplete or shaped by past experiences rather than future uncertainties, with greater emphasis on past successes than failures.  Planners can underestimate the influence of chance, overweight data that is emotionally charged, be overly influenced by their theories, misinterpret sample data or miss covariations, suffer hindsight bias or be overconfident.***  Once a plan is prepared, planners may focus only on confirmatory data and are usually resistant to changing the plan.  Planning in a group is subject to “groupthink” problems including overconfidence, rationalization, self-censorship and an illusion of unanimity.  (pp. 56-62)

Errors and Violations 

Violations are deliberate acts to break rules or procedures, although bad outcomes are not generally intended.  Violations arise from various motivational factors including the organizational culture.  Types of violations include corner-cutting to avoid clumsy procedures, necessary violations to get the job done because the procedures are unworkable, adjustments to satisfy conflicting goals and one-off actions (such as turning off a safety system) when faced with exceptional circumstances.  Violators perform a type of cost:benefit analysis biased by the fact that benefits are likely immediate while costs, if they occur, are usually somewhere in the future.  In Reason’s view, the proper course for the organization is to increase the perceived benefits of compliance, not increase the costs (penalties) for violations.  (There is a hint of the “just culture” here.)
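To make the violator’s biased calculus concrete, here is a minimal sketch (our illustration, not anything from Reason’s book; the numbers and the discounting scheme are assumptions).  An immediate benefit is weighed at face value, while a possible future cost is shrunk by both its probability and the psychological discounting of distant consequences.

```python
# Toy model of the biased cost:benefit calculus described above.
# All values are illustrative assumptions, not data from the book.

def perceived_net_benefit(benefit_now, future_cost, p_cost,
                          years_until_cost, discount_rate=0.30):
    """Immediate benefit counts at face value; a possible future cost
    is reduced by its probability and by steep temporal discounting."""
    discounted_cost = (p_cost * future_cost /
                       (1 + discount_rate) ** years_until_cost)
    return benefit_now - discounted_cost

# Cutting a corner saves 1 unit of effort now; the violation might
# contribute, years later, to an event costing 100 units.
print(perceived_net_benefit(benefit_now=1.0, future_cost=100.0,
                            p_cost=0.02, years_until_cost=5))
```

Undiscounted, the expected cost (0.02 × 100 = 2.0) outweighs the benefit (1.0), but heavy discounting of a distant, uncertain cost makes the shortcut look attractive (about +0.46), which is exactly the bias Reason describes.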

Organizational Accidents 

Major accidents (TMI, Chernobyl, Challenger) have three common characteristics: contributing factors that were latent in the system, multiple levels of defense, and an unforeseen combination of latent factors and active failures (errors and/or violations) that defeated the defenses.  This is the well-known Swiss Cheese Model with the active failures opening short-lived holes and latent failures creating longer-lasting but unperceived holes.
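The model lends itself to a simple numerical illustration.  Below is a minimal Monte Carlo sketch (our construction, not code from the book) in which each defensive layer independently has some probability of a “hole” on a given demand; an accident requires the holes to line up across every layer.  Widening just two layers, as latent conditions do, raises the accident rate by orders of magnitude even though each layer still works most of the time.

```python
import random

# Minimal Swiss Cheese Model sketch; layer probabilities are assumed
# for illustration. An accident requires a hole in EVERY layer.

def accident_rate(hole_probs, trials=1_000_000, seed=42):
    """Fraction of demands on which all layers' holes line up."""
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in hole_probs)
        for _ in range(trials)
    )
    return accidents / trials

healthy  = [0.01, 0.01, 0.01, 0.01]  # four robust layers: ~1e-8 per demand
degraded = [0.20, 0.20, 0.01, 0.01]  # latent conditions widen two layers: ~4e-6

print(accident_rate(healthy))   # almost certainly 0 observed in 1M trials
print(accident_rate(degraded))  # a handful of accidents, ~400x the healthy rate
```

The point is not the particular numbers but the structure: no single layer fails outright, yet the combination of long-lasting latent holes and short-lived active ones occasionally defeats all the defenses at once.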

Organizational accidents are low frequency, high severity events with causes that may date back years.  In contrast, individual accidents are more frequent but have limited consequences; they arise from slips, trips and lapses.  This is why organizations can have a good industrial accident record while they are on the road to a large-scale disaster, e.g., BP at Texas City. 

Organizational Culture 

Certain industries, including nuclear power, have defenses-in-depth distributed throughout the system but are vulnerable to something that is equally widespread.  According to Reason, “The most likely candidate is safety culture.  It can affect all elements in a system for good or ill.” (p. 81)  An inadequate SC can undermine the Swiss Cheese Model: there will be more active failures at the “sharp end”; more latent conditions created and sustained by management actions and policies, e.g., poor maintenance, inadequate equipment or downgraded training; and the organization will be reluctant to deal proactively with known problems. (pp. 82-83)

Reason describes a “cluster of organizational pathologies” that make an adverse system event more likely: blaming sharp-end operators, denying the existence of systemic inadequacies, and a narrow pursuit of production and financial goals.  He goes on to list some of the drivers of blame and denial.  The list includes: accepting human error as the root cause of an event; the hindsight bias; evaluating prior decisions based on their outcomes; shooting the messenger; belief in a bad apple but not a bad barrel (the system); failure to learn; a climate of silence; workarounds that compensate for systemic inadequacies; and normalization of deviance.  (pp. 86-92)  Whew!

Our Perspective 

Reason teaches us that the essence of understanding errors is nuance.  At one end of the spectrum, some errors are totally under the purview of the individual, at the other end they reside in the realm of the system.  The biases and issues described by Reason are familiar to Safetymatters readers and echo in the work of Dekker, Hollnagel, Kahneman and others.  We have been pounding the drum for a long time on the inadequacies of safety analyses that ignore systemic issues and corrective actions that are limited to individuals (e.g., more training and oversight, improved procedures and clearer expectations).

The book is densely packed with the work of a career.  One could easily use the contents to develop a SC assessment or self-assessment.  We did not report on the chapters covering research into absent-mindedness, Freud and medical errors (Reason’s current interest) but they are certainly worth reading.

Reason says this book is his valedictory: “I have nothing new to say and I’m well past my prime.” (p. 122)  We hope not.


*  J. Reason, A Life In Error: From Little Slips to Big Disasters (Burlington, VT: Ashgate, 2013).

**  Knowledge structures in long-term memory. (p. 24)

***  This will ring familiar to readers of Daniel Kahneman.  See our Dec. 18, 2013 post on Kahneman’s Thinking, Fast and Slow.