Friday, December 28, 2012

Uh-oh, Delays at Vogtle

This Wall Street Journal article* reports that the new Vogtle units may be in construction schedule trouble. The article notes that the new, modular construction techniques being employed were expected to save time and dollars but may be having the opposite effect. In addition, and somewhat incredibly, the independent monitor is citing design changes as another cause of delays. Thought that lesson had been learned a hundred times in the nuclear industry.

Then there is the inevitable finger pointing:

“The delays and cost pressures have created friction between the construction partners and utility companies that will serve as the plant's owners, escalating into a series of lawsuits totaling more than $900 million.”

The Vogtle situation also serves as a reminder that nuclear safety culture (NSC) applies to the construction phase, though, to our recollection, there was not much discussion of construction during the NRC’s policy statement development process. The escalating schedule and cost pressures at Vogtle also remind us how significant such pressures can be in a “massive, complex, first-of-a-kind project” (to quote the Westinghouse spokesman). These conditions will challenge construction workers and managers who may not possess the same level of NSC experience or awareness as nuclear operating organizations.

* R. Smith, “New Nuclear Plant Hits Some Snags,” Wall Street Journal online (Dec. 23, 2012).

Thursday, December 20, 2012

The Logic of Failure by Dietrich Dörner

This book was mentioned in a nuclear safety discussion forum so we figured this was a good time to revisit Dörner's 1989 tome.* Below we provide a summary of the book followed by our assessment of how it fits into our interest in decision making and the use of simulations in training.

Dörner's work focuses on why people fail to make good decisions when faced with problems and challenges. In particular, he is interested in the psychological needs and coping mechanisms people exhibit. His primary research method is observing test subjects interact with simulation models of physical sub-worlds, e.g., a malfunctioning refrigeration unit, an African tribe of subsistence farmers and herdsmen, or a small English manufacturing city. He applies his lessons learned to real situations, e.g., the Chernobyl nuclear plant accident.

He proposes a multi-step process for improving decision making in complicated situations, then describes each step in detail, along with the problems people can create for themselves while executing it. These problems generally consist of tactics people adopt to preserve their sense of competence and control at the expense of successfully achieving overall objectives. Although the steps are discussed in series, he recognizes that, at any point, one may have to loop back through a previous step.

Goal setting

Goals should be concrete and specific to guide future steps. The relationships between and among goals should be specified, including dependencies, conflicts and relative importance. When people don't do this, they can become distracted by obvious or unimportant (although potentially achievable) goals, or by peripheral issues they know how to address rather than the important issues that should be resolved. Facing performance failure, they may attempt to turn failure into success with doublespeak or blame unseen forces.

Formulate models and gather information

Good decision-making requires an adequate mental model of the system being studied—the variables that comprise the system and the functional relationships among them, which may include positive and negative feedback loops. The model's level of detail should be sufficient to understand the interrelationships among the variables the decision maker wants to influence. Unsuccessful test subjects were inclined to adopt a “reductive hypothesis,” which unreasonably reduces the model to a single key variable, or to rely on overgeneralization.

Information gathered is almost always incomplete and the decision maker has to decide when he has enough to proceed. The more successful test subjects asked more questions and made fewer decisions (than the less successful subjects) in the early time periods of the sim.

Predict and extrapolate

Once a model is formulated, the decision maker must attempt to determine how the values of variables will change over time in response to his decisions or internal system dynamics. One problem is predicting that outputs will change in a linear fashion, even as the evidence grows for a non-linear, e.g., exponential, relationship. An exponential variable may suddenly grow dramatically, then equally suddenly reverse course when the limits on growth (resources) are reached. Internal time delays mean that the effects of a decision are not visible until some time in the future. Faced with poor results, unsuccessful test subjects exhibit “massive countermeasures, ad hoc hypotheses that ignore the actual data, underestimations of growth processes, panic reactions, and ineffectual frenetic activity.” (p. 152) Successful subjects made an effort to understand the system's dynamics, kept notes (history) on system performance and tried to anticipate what would happen in the future.
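Dörner's point about misjudging non-linear growth is easy to demonstrate numerically. Below is a minimal sketch of our own devising (the growth rate and capacity figures are arbitrary assumptions, not taken from the book): resource-limited (logistic) growth looks deceptively gradual at first, so a linear forecast made from the early periods wildly misses what the system actually does.

```python
# Toy illustration of Dörner's linear-extrapolation trap (our own sketch).

def logistic_growth(pop, rate, capacity):
    """One time step of resource-limited growth."""
    return pop + rate * pop * (1 - pop / capacity)

pop, history = 10.0, []
for _ in range(30):
    history.append(pop)
    pop = logistic_growth(pop, rate=0.5, capacity=10_000)

# A linear forecast drawn from the early, gently sloping periods...
slope = history[4] - history[3]
linear_guess = history[4] + slope * 20   # forecast for period 25

print(f"linear forecast for period 25: {linear_guess:,.0f}")
print(f"actual value at period 25:     {history[24]:,.0f}")
```

The linear forecast lands in the hundreds while the actual trajectory has already surged close to the resource ceiling, which is the kind of surprise Dörner's test subjects repeatedly failed to anticipate.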

Plan and execute actions, check results and adjust strategy

“The essence of planning is to think through the consequences of certain actions and see whether those actions will bring us closer to our desired goal.” (p. 153) Easier said than done in an environment with too many alternative courses of action and too little time. In rapidly evolving situations, it may be best to create rough plans and delegate as many implementing decisions as possible to subordinates. A major risk is thinking that planning has been so complete that the unexpected cannot occur. A related risk is the reflexive use of historically successful strategies. “As at Chernobyl, certain actions carried out frequently in the past, yielding only the positive consequences of time and effort saved and incurring no negative consequences, acquire the status of an (automatically applied) ritual and can contribute to catastrophe.” (p. 172)

In the sims, unsuccessful test subjects often exhibited “ballistic” behavior—they implemented decisions but paid no attention to, i.e., did not learn from, the results. Successful subjects watched for the effects of their decisions, made adjustments and learned from their mistakes.

Dörner identified several characteristics of people who tended to end up in a failure situation. They failed to formulate their goals, didn't recognize goal conflict or set priorities, and didn't correct their errors. (p. 185) Their ignorance of interrelationships among system variables and the longer-term repercussions of current decisions set the stage for ultimate failure.


Dörner's insights and models have informed our thinking about human decision-making behavior in demanding, complicated situations. His use and promotion of simulation models as learning tools was one starting point for Bob Cudlin's work in developing a nuclear management training simulation program. Like Dörner, we see simulation as a powerful tool to “observe and record the background of planning, decision making, and evaluation processes that are usually hidden.” (pp. 9-10)

However, this book does not cover the entire scope of our interests. Dörner is a psychologist interested in individuals; group behavior is beyond his range. He alludes to normalization of deviance but his references appear limited to the flouting of safety rules rather than a more pervasive process of slippage. More importantly, he does not address behavior that arises from the system itself, in particular adaptive behavior as an open system reacts to and interacts with its environment.

In our view, Dörner's suggestions may help the individual decision maker avoid common pitfalls and achieve locally optimum answers. On the downside, following Dörner's prescription might lead the decision maker to an unjustified confidence in his overall system management abilities. In a truly complex system, no one knows how the entire assemblage works. It's sobering to note that even in Dörner's closed,** relatively simple models many test subjects still had a hard time developing a reasonable mental model, and some failed completely.

This book is easy to read and Dörner's insights into the psychological traps that limit human decision making effectiveness remain useful.

* D. Dörner, The Logic of Failure: Recognizing and Avoiding Error in Complex Situations, trans. R. and R. Kimber (Reading, MA: Perseus Books, 1998). Originally published in German in 1989.

** One simulation model had an external input.

Wednesday, December 12, 2012

“Overpursuit” of Goals

We return to a favorite subject, the impact of goals and incentives on safety culture and performance. Interestingly, this subject comes up in an essay by Oliver Burkeman, “The Power of Negative Thinking,”* which may seem unusual since most people think of goals and the achievement of goals as the product of a positive approach. Traditional business thinking is to set hard, quantitative goals, the bigger the better. But the future is inherently uncertain while such goals generally are not. The counterintuitive argument suggests the most effective way to address future performance is to focus on worst case outcomes. Burkeman observes that “...rigid goals may encourage employees to cut ethical corners” and “Focusing on one goal at the expense of all other factors also can distort a corporate mission or an individual life…” and result in “...the ‘overpursuit’ of goals…” Case in point: yellow jerseys.

This raises some interesting points for nuclear safety. First, we would remind our readers of Snowden’s Cynefin decision context framework, specifically his “complex” space, which is where nuclear safety decisions typically reside. In this environment there are many interacting causes and effects, making it difficult or impossible to pursue specific goals along defined paths. Clearly an uncertain landscape. As Simon French argues: “Decision support will be more focused on exploring judgement and issues, and on developing broad strategies that are flexible enough to accommodate changes as the situation evolves.”** This suggests the pursuit of specific, aspirational goals may be misguided or counterproductive.

Second, safety performance goals are hard to identify anyway. Is it the absence of bad outcomes? Or the maintenance of, say, a “strong” safety culture - whatever that is. One indication of the elusiveness of safety goals is their absence as targets in incentive programs. So there is probably little likelihood of overemphasizing safety performance as a goal. But is the same true for operational type goals such as capacity factor, refuel outage durations, and production costs? Can an overly strong focus on such short term goals, often associated with stretching performance, lead to overpursuit? What if large financial incentives are attached to the achievement of the goals?

The answer is not: “Safety is our highest priority”. More likely it is an approach that considers the complexity and uncertainty of nuclear operating space and the potential for hard goals to cut both ways. It might value how a management team prosecutes its responsibilities more than the outcome itself.

* O. Burkeman, “The Power of Negative Thinking,” Wall Street Journal online (Dec. 7, 2012).

** S. French, “Cynefin: repeatability, science and values,” Newsletter of the European Working Group “Multiple Criteria Decision Aiding,” series 3, no. 17 (Spring 2008) p. 2. We posted on Cynefin and French's paper here.

Wednesday, December 5, 2012

Drift Into Failure by Sidney Dekker

Sidney Dekker's Drift Into Failure* is a noteworthy effort to provide new insights into how accidents and other bad outcomes occur in large organizations. He begins by describing two competing world views, the essentially mechanical view of the world spawned by Newton and Descartes (among others), and a view based on complexity in socio-technical organizations and a systems approach. He shows how each world view biases the search for the “truth” behind how accidents and incidents occur.

Newtonian-Cartesian (N-C) Vision

Isaac Newton and René Descartes were leading thinkers during the dawn of the Age of Reason. Newton used the language of mathematics to describe the world while Descartes relied on the inner process of reason. Both believed there was a single reality that could be investigated, understood and explained through careful analysis and thought—complete knowledge was possible if investigators looked long and hard enough. The assumptions and rules that started with them, and were extended by others over time, have been passed on and most of us accept them, uncritically, as common sense, the most effective way to look at the world.

The N-C world is ruled by invariant cause-and-effect; it is, in fact, a machine. If something bad happens, then there was a unique cause or set of causes. Investigators search for these broken components, which could be physical or human. It is assumed that a clear line exists between the broken part(s) and the overall behavior of the system. The explicit assumption of determinism leads to an implicit assumption of time reversibility—because system performance can be predicted from time A if we know the starting conditions and the functional relationships of all components, then we can start from a later time B (the bad outcome) and work back to the true causes. (p. 84) Root cause analysis and criminal investigations are steeped in this world view.

In this view, decision makers are expected to be rational people who “make decisions by systematically and consciously weighing all possible outcomes along all relevant criteria.” (p. 3) Bad outcomes are caused by incompetent or worse, corrupt decision makers. Fixes include more communications, training, procedures, supervision, exhortations to try harder and criminal charges.

Dekker credits Newton et al. for giving man the wherewithal to probe Nature's secrets and build amazing machines. However, the Newtonian-Cartesian vision is not the only way to view the world, especially the world of complex, socio-technical systems. That requires a new model, with different concepts and operating principles.

The Complex System


The sheer number of parts does not make a system complex, only complicated. A truly complex system is open (it interacts with its environment), has components that act locally and don't know the full effects of their actions, is constantly making decisions to maintain performance and adapt to changing circumstances, and has non-linear interactions (small events can cause large results) because of multipliers and feedback loops. Complexity is a result of the ever-changing relationships between components. (pp. 138-144)

Adding to the myriad information confronting a manager or observer, system performance is often optimized at the edge of chaos, where competitors are perpetually vying for relative advantage at an affordable cost.** The system is constantly balancing its efforts between exploration (which will definitely incur costs but may lead to new advantages) and exploitation (which reaps benefits of current advantages but will likely dissipate over time). (pp. 164-165)
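The exploration/exploitation balance Dekker invokes is the classic “multi-armed bandit” trade-off from decision theory. The sketch below is our own illustration, not from the book (the payoff values and the 10% exploration rate are arbitrary assumptions): an agent that never explores can lock onto an inferior option, while one that spends a small fraction of its effort exploring usually earns more over time.

```python
# Epsilon-greedy bandit: a toy model of exploration vs. exploitation.
import random

def epsilon_greedy(true_payoffs, epsilon, rounds=5000, seed=42):
    """Average reward earned over noisy payoffs with a given exploration rate."""
    rng = random.Random(seed)
    estimates = [0.0] * len(true_payoffs)   # running payoff estimates
    counts = [0] * len(true_payoffs)
    total = 0.0
    for _ in range(rounds):
        if rng.random() < epsilon:          # explore: incur a cost now...
            arm = rng.randrange(len(true_payoffs))
        else:                               # ...or exploit the current best guess
            arm = max(range(len(true_payoffs)), key=lambda a: estimates[a])
        reward = true_payoffs[arm] + rng.gauss(0, 0.1)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean
        total += reward
    return total / rounds

payoffs = [0.3, 0.5, 0.8]                   # hypothetical option values
print(f"never explore: {epsilon_greedy(payoffs, epsilon=0.0):.2f}")
print(f"explore 10%:   {epsilon_greedy(payoffs, epsilon=0.1):.2f}")
```

With no exploration the agent settles on the first option it samples and never learns a better one exists, which mirrors Dekker's point that exploitation reaps current advantages that dissipate over time.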

The most important feature of a complex system is that it adapts to its environment over time in order to survive. And its environment is characterized by resource scarcity and competition. There is continuous pressure to maintain production and increase efficiency (and their visible artifacts: output, costs, profits, market share, etc) and less visible outputs, e.g., safety, will receive less attention. After all, “Though safety is a (stated) priority, operational systems do not exist to be safe. They exist to provide a service or product . . . .” (p. 99) And the cumulative effect of multiple adaptive decisions can be an erosion of safety margins and a changed response of the entire system. Such responses may be beneficial or harmful—a drift into failure.

Drift by a complex system exhibits several characteristics. First, as mentioned above, it is driven by environmental factors. Second, drift occurs in small steps so changes can be hardly noticed, and even applauded if they result in local performance improvement; “. . . successful outcomes keep giving the impression that risk is under control” (p. 106) as a series of small decisions whittle away at safety margins. Third, these complex systems contain unruly technology (think deepwater drilling) where uncertainties exist about how the technology may be ultimately deployed and how it may fail. Fourth, there is significant interaction with a key environmental player, the regulator, and regulatory capture can occur, resulting in toothless oversight.
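To make the “small steps” mechanism concrete, here is a deliberately simple numerical sketch of our own (the 2% trim and the hazard formula are invented assumptions, not Dekker's): each locally reasonable reduction of the safety margin is invisible on its own, yet compounded over many decisions the hazard grows enormously.

```python
# Toy model of drift: many tiny, individually "successful" margin trims.

def drift_sim(periods=200, trim=0.02):
    """Compound small margin reductions; return final margin and hazard history."""
    margin = 1.0                         # normalized safety margin
    hazards = []
    for _ in range(periods):
        hazards.append(0.001 / margin)   # hazard rises as margin shrinks
        margin *= (1 - trim)             # each "success" invites another 2% trim
    return margin, hazards

margin, hazards = drift_sim()
print(f"margin left after 200 small trims: {margin:.3f}")
print(f"hazard grew by a factor of about {hazards[-1] / hazards[0]:.0f}")
```

No single trim looks alarming (each is 2%), yet the margin ends up below 2% of its original value. That is the arithmetic behind “successful outcomes keep giving the impression that risk is under control.”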

“Drifting into failure is not so much about breakdowns or malfunctioning of components, as it is about an organization not adapting effectively to cope with the complexity of its own structure and environment.” (p. 121) Drift and occasionally accidents occur because of ordinary system functioning, normal people going about their regular activities making ordinary decisions “against a background of uncertain technology and imperfect information.” Accidents, like safety, can be viewed as an emergent system property, i.e., they are the result of system relationships but cannot be predicted by examining any particular system component.

Managers' roles

Managers should not try to transform complex organizations into merely complicated ones, even if it's possible. Complexity is necessary for long-term survival as it maximizes organizational adaptability. The question is how to manage in a complex system. One key is increasing the diversity of personnel in the organization. More diversity means less group think and more creativity and greater capacity for adaptation. In practice, this means validation of minority opinions and encouragement of dissent, reflecting on the small decisions as they are made, stopping to ponder why some technical feature or process is not working exactly as expected and creating slack to reduce the chances of small events snowballing into large failures. With proper guidance, organizations can drift their way to success.


Amoral and criminal behavior certainly exist in large organizations but bad outcomes can also result from normal system functioning. That's why the search for culprits (bad actors or broken parts) may not always be appropriate or adequate. This is a point Dekker has explored before, in Just Culture (briefly reviewed here) where he suggests using accountability as a means to understand the system-based contributors to failure and resolve those contributors in a manner that will avoid recurrence.

Application to Nuclear Safety Culture

A commercial nuclear power plant or fleet is probably not a complete complex system. It interacts with environmental factors but in limited ways; it's certainly not directly exposed to the Wild West competition of say, the cell phone industry. Group think and normalization of deviance*** are constant threats. The technology is reasonably well-understood but changes, e.g., uprates based on more software-intensive instrumentation and control, may be invisibly sanding away safety margin. Both the industry and the regulator would deny regulatory capture has occurred but an outside observer may think the relationship is a little too cozy. Overall, the fit is sufficiently good that students of safety culture should pay close attention to Dekker's observations.

In contrast, the Hanford Waste Treatment Plant (Vit Plant) is almost certainly a complex system and this book should be required reading for all managers in that program.


Drift Into Failure is not a quick read. Dekker spends a lot of time developing his theory, then circling back to further explain it or emphasize individual pieces. He reviews incidents (airplane crashes, a medical error resulting in patient death, software problems, public water supply contamination) and descriptions of organization evolution (NASA, international drug smuggling, “conflict minerals” in Africa, drilling for oil, terrorist tactics, Enron) to illustrate how his approach results in broader and arguably more meaningful insights than the reports of official investigations. Standing on the shoulders of others, especially Diane Vaughan, Dekker gives us a rich model for what might be called the “banality of normalization of deviance.” 

* S. Dekker, Drift Into Failure: From Hunting Broken Components to Understanding Complex Systems (Burlington VT: Ashgate 2011).

** See our Sept. 4, 2012 post on Cynefin for another description of how the decisions an organization faces can suddenly slip from the Simple space to the Chaotic space.

*** We have posted many times about normalization of deviance, the corrosive organizational process by which yesterday's “unacceptable” becomes today's “good enough.”

Thursday, November 29, 2012

The Mouse Runs Up the Clock (at Massey Energy)

We are all familiar with the old nursery rhyme: “Hickory, dickory, dock, the mouse ran up the clock.”  This may be an apt description for the rising waters of federal criminal prosecution in the Massey coal mine explosion investigation.  As reported in the Nov. 28, 2012 Wall Street Journal,* the former president of one of the Massey operating units unrelated to the Upper Big Branch mine has agreed to plead guilty to felony conspiracy charges including directing employees to violate safety laws.  The former president is cooperating with prosecutors (in other words, look out above) and as noted in the Journal article, “The expanded probe ‘strongly suggests’ prosecutors are ‘looking at top management’…"   Earlier this year, a former superintendent at the Upper Big Branch pleaded guilty to conspiracy charges. 

Federal prosecutors allege that safety rules were routinely violated to maximize profits.  As stated in the Criminal Information against the former president, “Mine safety and health laws were routinely violated at the White Buck Mines and at other coal mines owned by Massey, in part because of a belief that consistently following those laws would decrease coal production.” (Criminal Information, p. 4)**  The Information goes on to state:  “Furthermore, the issuance of citations and orders by MSHA [Mine Safety and Health Administration], particularly certain kinds of serious citations and orders, moved the affected mine closer to being classified as a mine with a pattern or potential pattern of violations.  That classification would have resulted in increased scrutiny of the affected mine by MSHA…” (Crim. Info. p.5)  Thus it is alleged that not only production priorities - the core objective of many businesses - but even the potential for increased scrutiny by a regulatory authority was sufficient to form the basis for a conspiracy. 

Every day managers and executives in high risk businesses make decisions to sustain and/or improve production and to minimize the exposure of the operation to higher levels of regulatory scrutiny.  The vast majority of those decisions are legitimate and don’t compromise safety or inhibit regulatory functions.  Extreme examples that do violate safety and legal requirements, such as the Massey case, are easy to spot.  But one might begin to wonder what exactly is the boundary separating legitimate pursuit of these objectives and decisions or actions that might (later) be interpreted as having the intent to compromise safety or regulation?  How important is perception to drawing the boundary - where the context can frame a decision or action in markedly different colors?  Suppose in the Massey situation, the former president instead of providing advance warnings and (apparently) explicitly tolerating safety violations, had limited the funding of safety activities, or just squeezed total budgets?  Same or different? 

*  K. Maher, "Mine-Safety Probe Expands," Wall Street Journal online (Nov. 28, 2012); may be available only to subscribers.

**  U.S. District Court Southern District of West Virginia, “Criminal Information for Conspiracy to Defraud the United States: United States of America v. David C. Hughart” (Nov. 28, 2012).

Tuesday, November 20, 2012

BP/Deepwater Horizon: Upping the Stakes

Anyone who thought safety culture and safety decision making was an institutional artifact, or mostly a matter of regulatory enforcement, might want to take a close look at what is happening on the BP/Deepwater Horizon front these days. Three BP employees have been criminally indicted - and two of those indictments bear directly on safety in operational decisions. The indictments of the well-site leaders, the most senior BP personnel on the platform, accuse them of causing the deaths of 11 crewmen aboard the Deepwater Horizon rig in April 2010 through gross negligence, primarily by misinterpreting a crucial pressure test that should have alerted them that the well was in trouble.*

The crux of the matter relates to the interpretation of a pressure test to determine whether the well had been properly sealed prior to being temporarily abandoned. Apparently BP’s own investigation found that the men had misinterpreted the test results.

The indictment states, “The Well Site Leaders were responsible for...ensuring that well drilling operations were performed safely in light of the intrinsic danger and complexity of deepwater drilling.” (Indictment p.3)

The following specific actions are cited as constituting gross negligence: “...failed to phone engineers onshore to advise them ...that the well was not secure; failed to adequately account for the abnormal readings during the testing; accepted a nonsensical explanation for the abnormal readings, again without calling engineers onshore to consult…” (Indictment p.7)

The willingness of federal prosecutors to advance these charges should (and perhaps are intended to) send a chill down every manager’s spine in high risk industries. While gross negligence is a relatively high standard, and may or may not be provable in the BP case, the actions cited in the indictment may not sound all that extraordinary - failure to consult with onshore engineers, failure to account for “abnormal” readings, accepting a “nonsensical” explanation. Whether this amounts to “reckless” or willful disregard for a known risk is a matter for the legal system. As an article in the Wall Street Journal notes, “There were no federal rules about how to conduct such a test at the time. That has since changed; federal regulators finalized new drilling rules last week that spell out test procedures.”**

The indictment asserts that the men violated the “standard of care” applicable to the deepwater oil exploration industry. One might ponder what federal prosecutors think the “standard of care” is for the nuclear power generation industry.

Clearly the well site leaders made a serious misjudgment - one that turned out to have catastrophic consequences. But then consider the statement by the Assistant Attorney General, that the accident was caused by “BP’s culture of privileging profit over prudence.” (WSJ article)   Are there really a few simple, direct causes of this accident or is this an example of a highly complex system failure? Where does culpability for culture lie?  Stay tuned.

* U.S. District Court Eastern District of Louisiana, “Superseding Indictment for Involuntary Manslaughter, Seaman's Manslaughter and Clean Water Act: United States of America v. Robert Kaluza and Donald Vidrine,” Criminal No. 12-265.

** T. Fowler and R. Gold, “Engineers Deny Charges in BP Spill,” Wall Street Journal online (Nov. 18, 2012).

Thursday, November 1, 2012

Practice Makes Perfect

In this post we call attention to a recent article from The Wall Street Journal* that highlights an aspect of safety culture “learning” that may not be appreciated with approaches currently in vogue in the nuclear industry.  The gist of the article is that, just as practice is useful in mastering complex, physically challenging activities, it may also have value in honing the skills inherent in complex socio-technical issues.

“Research has established that fast, simple feedback is almost always more effective at shaping behavior than is a more comprehensive response well after the fact. Better to whisper "Please use a more formal tone with clients, Steven" right away than to lecture Steven at length on the wherefores and whys the next morning.”

Our sense is that current efforts to instill safety culture norms and values tend toward after-the-fact lectures and “death by PowerPoint” approaches.  As the article correctly points out, it is “shaping behavior” that should be the goal, something that benefits from feedback, and “An explicit request can normalize the idea of ‘using’ rather than passively "taking" feedback.”

It’s not a long article so we hope readers will just go ahead and click on the link below.

*  Lemov, D., “Practice Makes Perfect—And Not Just for Jocks and Musicians,” Wall Street Journal online (Oct. 26, 2012).

Monday, October 29, 2012

Nuclear Safety Culture Research

This is a subject that has been on our minds for some time.  Many readers may have eagerly jumped to this post to learn about the latest on research into nuclear safety culture (NSC) issues.  Sorry, you will be disappointed just as we were.  The painful and frankly inexplicable conclusion is that there is virtually no research in this area.  How come?

There is the oft-quoted 2002 comment by then ACRS Chairman, Dr. George Apostolakis:

"For the last 20 to 25 years this agency [the NRC] has started research projects on organizational-managerial issues that were abruptly and rudely stopped because, if you do that, the argument goes, regulations follow. So we don't understand these issues because we never really studied them."*

A principal focus of this blog has been to bring to the attention of our readers relevant information from academic and research sources.  We cover a wide range of topics where we see a connection to nuclear safety culture.  Thus we continually monitor additions to the science of NSC through papers, presentations, books, etc.  In doing so we have come to realize there is, and has been, very little relevant research specifically addressing nuclear safety culture.  Even a search of secondary sources, i.e., the references contained in primary research documents, indicates a near vacuum of NSC-specific research.  This is in contrast to the oil and chemical industries and the U.S. manned space program.  In an August 2, 2010 post we described research by Dr. Stian Antonsen of the Norwegian University of Science and Technology on “...whether it is possible to ‘predict’ if an organization is prone to having major accidents on the basis of safety culture assessments” [short answer: No].

Returning to the September 2012 DOE Nuclear Safety Workshop (see our Oct. 8, 2012 post), where nuclear safety culture was a major agenda item, we observe the only reference in all the presentations to actual research was from the results of an academic study of 17 offshore platform accidents to identify “cultural causal factors”. (See Mark Griffon’s presentation, slide 17.)

With regard to the manned space program, recall the ambitious MIT study to develop a safety culture simulation model for NASA and various independent studies, perhaps most notably Diane Vaughan's The Challenger Launch Decision.  We have posted on each of these.

One study we did locate that is on topic is an empirical analysis of the use of safety culture surveys in the Millstone engineering organization performed by Professor John Carroll of MIT.  He found “their [surveys'] use for assessing and measuring safety [to be] problematic…”**  It strikes us as curious that the nuclear industry, which has so strongly embraced culture surveys, hasn't followed that with basic research to establish the legitimacy and limits of their application.

To further test the waters for applicable research, we reviewed the research plans of major nuclear organizations.  The NRC Strategic Plan Fiscal Years 2008-2013 (Updated 2012)*** cites two goals in this area, neither of which addresses substantive nuclear safety culture issues:

Promote awareness of the importance of a strong safety culture and individual accountability of those engaged in regulated activities. (p.9)

Ensure dissemination of the Safety Culture Policy Statement to all of the regulated community. [Supports Safety Implementation Strategy 7] (p.12)

DOE’s 2010 Nuclear Energy Research and Development Roadmap identifies the following “major challenges”:

- Aging and degradation of system structures and components, such as reactor core internals, reactor pressure vessels, concrete, buried pipes, and cables.
- Fuel reliability and performance issues.
- Obsolete analog instrumentation and control technologies.
- Design and safety analysis tools based on 1980s vintage knowledge bases and computational capabilities.****

The goals of these nuclear research programs speak for themselves.  Now compare to the following from the Chemical Safety Board Strategic Plan:

“Safety Culture continues to be cited in investigations across many industry sectors including the Presidential Commission Report on Deepwater Horizon, the Fukushima Daiichi incident, and the Defense Nuclear Facilities Safety Board’s recommendation for the Hanford Waste Treatment and Immobilization Plant. A potential study would consider issues such as how safety culture is defined, what makes an effective safety culture, and how to evaluate safety culture.”*****

And this from the VTT Technical Research Centre of Finland, the largest energy-sector research unit in Northern Europe:

Man, Organisation and Society – in this area, safety management in a networked operating environment, and the practices for developing nuclear safety competence and safety culture have a key role in VTT's research. The nuclear specific know-how and the combination of competencies in behavioural sciences and fields of technology made possible by VTT's multidisciplinary expertise are crucial to supporting the safe use of nuclear power.#

We invite our readers to bring to our attention any NSC-specific research of which they may be aware.

*  J. Mangels and J. Funk, “Davis-Besse workers' repair job hardest yet,” Cleveland Plain Dealer (Dec. 29, 2002).  Retrieved Oct. 29, 2012.

**    J.S. Carroll, "Safety Culture as an Ongoing Process: Culture Surveys as Opportunities for Inquiry and Change," work paper (undated) p.23, later published in Work and Stress 12 (1998), pp. 272-284.

***  NRC "Strategic Plan: Fiscal Years 2008–2013" (Feb. 2012) published as NUREG-1614, Vol. 5.

****  DOE, "Nuclear Energy Research and Development Roadmap" (April 2010) pp. 17-18. 

*****  CSB, "2012-2016 US Chemical Safety Board Strategic Plan" (June 2012) p. 17.

#  “Nuclear power plant safety research at VTT,” Public Service Review: European Science and Technology 15 (July 13, 2012).  Retrieved Oct. 29, 2012.

Friday, October 26, 2012

Communicating Change

One of our readers suggested we look at Communicating Change* by T.J. and Sandar Larkin.  The Larkins are consultants, so I was somewhat skeptical of finding any value for safety culture, but they have significant experience and cite enough third-party references (think: typical Wikipedia item) to give the book some credibility.

The book presents three principles for effectively communicating change, i.e., delivering a top-down message that ultimately results in better performance or acceptance of necessary innovations, workplace disruptions or future unknowns.

Transmit the message through the first-line supervisors.  They will be the ones who have to explain the message and implement the changes on a day-to-day basis after the executives board their plane and leave.  Senior management initiatives to communicate directly with workers undermine supervisors’ influence.

Communicate face-to-face.  Do not rely on newsletters, videos, mass e-mail and other one-way communication techniques; the message is too easily ignored or misunderstood.  Face-to-face, from the boss, may be even more important in the age of social media where people can be awash in a sea of (often conflicting) information.

Make changes relevant to each work area, i.e., give the supervisor the information, training and tools necessary to explain exactly what will change for the local work group, e.g., different performance standards, methods, equipment, etc.

That’s it, although the book goes on for almost 250 pages to justify the three key principles and explain how they might be implemented.  (The book is full of examples and how-to instructions.)

Initially I thought this approach was too simplistic, i.e., it wouldn’t help anyone facing the challenge of trying to change safety-related behavior.  But simple can cut through the clutter of well-meaning but complicated change programs, one size fits all media and training, and repetitive assessments.

This book is not the complete answer but it does provide a change agent with some perspective on how one might go about getting the individual contributors (trade, technical or professional) at the end of the food chain to understand, respond to and eventually internalize required behavioral changes. 

Please contact us if you have a suggestion for a resource that you’d like us to review.

 *  T. Larkin and S. Larkin, Communicating Change: Winning Employee Support for New Business Goals (New York: McGraw-Hill, 1994).

Wednesday, October 17, 2012

NRC Non-Regulation of Safety Culture: Third Quarter Update

On March 17 we published a post on NRC safety culture (SC) related activities with individual licensees since the SC policy statement was issued in June 2011.  On July 3, we published an update for second quarter 2012 activities.  This post highlights selected NRC actions during the third quarter, July through September 2012.

Our earlier posts mentioned Browns Ferry, Fort Calhoun and Palisades as plants where the NRC was undertaking significant SC-related activities.  It looks like none of those plants has resolved its SC issues and, at the current rate of progress, I’m sure we’ll be reporting on all of them for quite a while.

Browns Ferry

As we reported earlier, this plant’s SC problems have existed for years.  On August 23, TVA management submitted its Integrated Improvement Plan Summary* to address NRC inspection findings that have landed the plant in column 4 (next to worst) of the NRC’s Action Matrix.  TVA’s analysis of its SC and operational performance problems included an independent SC assessment.  TVA’s overall analysis identified fifteen “fundamental problems” and two bonus issues; for SC improvement efforts, the problems and issues were organized into five focus areas: Accountability, Operational Decision Making (Risk Management), Equipment Reliability, Fire Risk Reduction and the Corrective Action Program (CAP).

The NRC published its mid-cycle review of Browns Ferry on September 4.  In the area of SC, the report noted the NRC had “requested that [the Substantive Cross-Cutting Issue in the CAP] be addressed during your third party safety culture assessment which will be reviewed as part of the Independent NRC Safety Culture Assessment per IP 95003. . . .”**

Fort Calhoun

SC must be addressed to the NRC’s satisfaction prior to plant restart.   The Omaha Public Power District (OPPD) published its Integrated Performance Improvement Plan on July 9.***  The plan includes an independent safety culture assessment to be performed by an organization “that is nationally recognized for successful performance of behavior-anchored nuclear safety culture assessments.” (p. 163)  Subsequent action items will focus on communicating SC principles, assessment results, SC improvement processes and SC information.

The NRC and OPPD met on September 11, 2012 to discuss NRC issues and oversight activities, and OPPD’s performance improvement plan, ongoing work and CAP updates.  OPPD reported that a third-party SC assessment had been completed and corrective actions were being implemented.****


Palisades

The NRC continues to express its concerns over Palisades’ SC.  The best example is NRC’s August 30 letter***** requesting a laundry list of information related to Palisades’ independent SC assessment and management's reaction to same, including corrective actions, interim actions in place or planned to mitigate the effects of the SC weaknesses, compliance issues with NRC regulatory requirements or commitments, and the assessment of the SC at Entergy’s corporate offices. (p. 5)

The NRC held a public meeting with Palisades on September 12, 2012 to discuss the plant’s safety culture.  Plant management’s slides are available in ADAMS (ML12255A042).  We won’t review them in detail here but management's Safety Culture Action Plan includes the usual initiatives for addressing identified SC issues (including communication, training, CAP improvement and backlog reduction) and a new buzz phrase, Wildly Important Goals.

Other Plants

NRC supplemental inspections can require licensees to assess “whether any safety culture component caused or significantly contributed to” some performance issue.#  NRC inspection reports note the extent and adequacy of the licensee’s assessment, often performed as part of a root cause analysis.  Plants that had such requirements laid on them or had SC contributions noted in inspection reports during the third quarter included Brunswick, Hope Creek, Limerick, Perry, Salem, Waterford and Wolf Creek.

One other specific SC action arose from the NRC’s alternative dispute resolution (ADR) process at Entergy’s James A. FitzPatrick plant.  As part of an NRC Confirmatory Order following ADR, Entergy was told to add a commitment to maintain the SC monitoring processes at Entergy’s nine commercial nuclear power plants.##

The Bottom Line

None of this is a surprise.  Even the new Chairman tells it like it is: “In the United States, we have . . . incorporated a safety culture assessment into our oversight program . . . .”###  Also not a surprise: that particular statement was not included in the NRC’s press release publicizing the Chairman’s comments.  Isn’t “assessment” part of “regulation”?

Given the attention we pay to the issue of regulating SC, one may infer that we object to it.  We don’t.  What we object to is the back-door approach currently being used and the NRC’s continued application of the Big Lie technique to claim that they aren’t regulating SC.

*  P.D. Swafford (TVA) to NRC, “Integrated Improvement Plan Summary” (Aug. 23, 2012)  ADAMS ML12240A106.  TVA has referred to this plan in various presentations at NRC public and Commission meetings.

**  V.M. McCree (NRC) to J.W. Shea (TVA), “Mid Cycle Assessment Letter for Browns Ferry Nuclear Plant Units 1, 2, and 3” (Sept. 4, 2012)  ADAMS ML12248A296.

***  D.J. Bannister (OPPD) to NRC, “Fort Calhoun Station Integrated Performance Improvement Plan Rev. 3” (July 9, 2012)  ADAMS ML12192A204.

**** NRC, “09/11/2012 Meeting Summary with Omaha Public Power District” (Sept. 25, 2012)  ADAMS ML12269A224.

*****  J.B. Giessner (NRC) to A. Vitale (Entergy), “Palisades Nuclear Plant – Notification of NRC Supplemental Inspection . . . and Request for Information” (Aug. 30, 2012)  ADAMS ML12243A409.

#  The scope of NRC Inspection Procedure 95001 includes “Review licensee’s evaluation of root and contributing causes. . . ,” which may include SC; IP 95002’s scope includes “Determine if safety culture components caused or significantly contributed to risk significant performance issues” and IP 95003’s scope includes “Evaluate the licensee’s third-party safety culture assessment and conduct a graded assessment of the licensee’s safety culture based on evaluation results.”  See IMC 2515 App B, "Supplemental Inspection Program" (Aug. 18, 2011)  ADAMS ML111870266.

##  M. Gray (NRC) to M.J. Colomb (Entergy), “James A. FitzPatrick Nuclear Power Plant - NRC Integrated Inspection Report 05000333/2012003” (Aug. 7, 2012)  ADAMS ML12220A278.

###  A.M. Macfarlane, “Assessing Progress in Worldwide Nuclear Safety,” remarks to International Nuclear Safety Group Forum, IAEA, Vienna, Austria (Sept. 17, 2012), p. 3 ADAMS ML12261A373; NRC Press Release No. 12-102, “NRC Chairman Says Safety Culture Critical to Improving Safety; Notes Fukushima Progress in United States” (Sept. 17, 2012) ADAMS ML12261A391.

Monday, October 8, 2012

DOE Nuclear Safety Workshop

The DOE held a Nuclear Safety Workshop on September 19-20, 2012.  Safety culture (SC) was the topic at two of the technical breakout sessions, one with outside (non-DOE) presenters and the other with DOE-related presenters.  Here’s our take on the outsiders’ presentations.

Chemical Safety Board (CSB)

This presentation* introduced the CSB and its mission and methods.  The CSB investigates chemical accidents and makes recommendations to prevent recurrences.  It has no regulatory authority. 

Its investigations focus on improving safety, not assigning blame.  The CSB analyzes systemic factors that may have contributed to an accident and recognizes that “Addressing the immediate cause only prevents that exact accident from occurring again.” (p. 5) 

The agency analyzes how safety systems work in real life and “why conditions or decisions leading to an accident were seen as normal, rational, or acceptable prior to the accident.” (p. 6)  They consider organizational and social causes, including “safety culture, organizational structure, cost pressures, regulatory gaps and ineffective enforcement, and performance agreements or bonus structures.” (ibid.)

The presentation included examples of findings from CSB investigations into the BP Texas City and Deepwater Horizon incidents.  The CSB’s SC model is adapted from the Schein construct.  What’s interesting is their set of artifacts includes many “soft” items such as complacency, normalization of deviance, management commitment to safety, work pressure, and tolerance of inadequate systems.

This is a brief and informative presentation, and well worth a look.  Perhaps because the CSB is unencumbered by regulatory protocol, it seems freer to go where the evidence leads it when investigating incidents.  We are impressed by their approach.

NRC

The NRC presentation** reviewed the basics of the Reactor Oversight Process (ROP) and then drilled down into how potential SC issues are identified and addressed.  Within the ROP, “. . . a safety culture aspect is assigned if it is the most significant contributor to an inspection finding.” (p.12)  After such a finding, the NRC may perform an SC assessment (per IP 95002) or request the licensee to perform one, which the NRC then reviews (per IP 95003).

This presentation is bureaucratic but provides a useful road map.  Looking at the overall approach, it is even more disingenuous for the NRC to claim that it doesn’t regulate SC.


IAEA

There was nothing new here.  This infomercial for IAEA*** covered the basic history of SC and reviewed contents of related IAEA documents, including laundry lists of desired organizational attributes.  The three-factor IAEA SC figure presented is basically the Schein model, with different labels.  The components of a correct culture change initiative are equally recognizable: communication, continuous improvement, trust, respect, etc.

The presentation had one repeatable quote: “Culture can be seen as something we can influence, rather than something we can control” (p. 10)


SC conferences and workshops are often worthless but sometimes one does learn things.  In this case, the CSB presentation was refreshingly complete and the NRC presentation was perhaps more revealing than the presenter intended.

*  M.A. Griffon, U.S. Chemical Safety and Hazards Investigation Board, “CSB Investigations and Safety Culture,” presented at the DOE Nuclear Safety Workshop (Sept. 19, 2012). 

**  U. Shoop, NRC, “Safety Culture in the U.S. Nuclear Regulatory Commission’s Reactor Oversight Process,” presented at the DOE Nuclear Safety Workshop (Sept. 19, 2012).

***  M. Haage, IAEA, “What is Safety Culture & Why is it Important?” presented at the DOE Nuclear Safety Workshop (Sept. 19, 2012).

Friday, October 5, 2012

The Corporate Culture Survival Guide by Edgar Schein

Our September 21, 2012 post introduced a few key elements of Prof. Edgar Schein’s “mental model” of organizational culture.  Our focus in that post was to decry how Schein’s basic construct of culture had been adopted by the nuclear industry but then twisted to fit company and regulatory desires for simple-minded mechanisms for assessing culture and cultural interventions.

In this post, we want to expand on Schein’s model of what culture is, how it can be assessed, and how its evolution can be influenced by management initiatives.  Where appropriate, we will provide our perspective based on our beliefs and experience.  All the quotes below come from Schein’s The Corporate Culture Survival Guide.*

What is Culture?

Schein’s familiar model shows three levels of culture: artifacts, espoused values and underlying assumptions.  In his view, the real culture is the bottom level: “Culture is the shared tacit assumptions of a group that have been learned through coping with external tasks and dealing with internal relationships.” (p. 217)  The strength of an organization’s culture is a function of the intensity of shared experiences and the relative success the organization has achieved.  “Culture . . . influences how you think and feel as well as how you act.” (p. 75)  Culture is thus a product of social learning. 

Our view does not conflict with Schein’s.  In our systems approach, culture is a variable that provides context for, but does not solely determine, organizational and individual decisions. 

How can Culture be Assessed?


Surveys

“You cannot use a survey to assess culture.” (p. 219)  The specific weaknesses of surveys are discussed elsewhere (pp. 78-80) but his bottom line is good enough for us.  We agree completely.


Interviews

Individual interviews can be used when interviewees would be inhibited in a group setting but Schein tries to avoid them in favor of group interviews because the latter are more likely to correctly identify the true underlying assumptions.

In contrast, the NEI and IAEA safety culture evaluation protocols use interviews extensively, and we’ve commented on them here and here.

Group discussion 

Schein’s recommended method for deciphering a company’s culture is a facilitated group exercise that attempts to identify the deeper (real) assumptions that drive the creation of artifacts by looking at conflicts between the artifacts and the espoused values. (pp. 82-87)   

How can Culture be Influenced?

In Schein’s view, culture cannot be directly controlled but managers can influence and evolve a culture.  In fact, “Managing cultural evolution is one of the primary tasks of leadership.” (p. 219)

His basic model for cultural change is creating the motivation to change, followed by learning and then internalizing new concepts, meanings and standards. (p. 106)  This can be a challenging effort; resistance to change is widespread, especially if the organization has been successful in the past.  Implementing change involves motivating people to change by increasing their survival anxiety or guilt, then promoting new ways of thinking, which can lead to learning anxiety (fear of loss or failure).  Learning anxiety can be ameliorated by increasing the learner’s psychological safety through multiple steps, including training, role models and consistent systems and structures.  Our promotion of simulation is based on our belief that simulation can provide a platform for learners to practice new behaviors in a controlled and forgiving setting.

If time is of the essence or major transformational change is necessary, then the situation requires the removal and replacement of the key cultural carriers.  Replacement of management team members has often occurred at nuclear plants to address perceived performance/culture issues.

Schein says employees can be coerced into behaving differently but they will only internalize the new ways of doing business if the new behavior leads to better outcomes.  That may be true but we tend toward a more pragmatic approach and agree with Commissioner Apostolakis when he said: “. . . we really care about what people do and maybe not why they do it . . . .”

Bottom Line

Prof. Schein has provided a powerful model for visualizing organizational culture and we applaud his work.  Our own modeling efforts incorporate many of his factors, although not always in the same words.  In addition, we consider other factors that influence organizational behavior and feed back into culture, e.g., the priorities and resources provided by a corporate parent.

*  E.H. Schein, The Corporate Culture Survival Guide, new and revised ed. (San Francisco: Jossey-Bass, 2009).  

Friday, September 21, 2012

SafetyMatters and the Schein Model of Culture

A reader recently asked: “Do you subscribe to Edgar Schein's culture model?”  The short-form answer is a qualified “Yes.”  Prof. Schein has developed significant and widely accepted insights into the structure of organizational culture.  In its simplest form, his model of culture has three levels: the organization’s (usually invisible) underlying beliefs and assumptions, its espoused values, and its visible artifacts such as behavior and performance.  He describes the responsibility of management, through its leadership, to articulate the espoused values with policies and strategies and thus shape culture to align with management’s vision for the organization.  Schein’s is a useful mental model for conceptualizing culture and management responsibilities.*     

However, we have issues with the way some people have applied his work to safety culture.  For starters, there is the apparent belief that these levels are related in a linear fashion, more particularly, that management by promulgating and reinforcing the correct values can influence the underlying beliefs, and together they will guide the organization to deliver the desired behaviors, i.e., the target level of safety performance.  This kind of thinking has problems.

First, it’s too simplistic.  Safety performance doesn’t arise only because of management’s espoused values and what the rest of the organization supposedly believes.  As discussed in many of our posts, we see a much more complex, multidimensional and interactive system that yields outcomes which reflect, in greater or lesser terms, desired levels of safety.  We have suggested that it is the totality of such outcomes that is representative of the safety culture in fact.** 

Second, it leads to attempts to measure and influence safety culture that are often ineffective and even misleading.  We wonder whether the heavy emphasis on values and leadership attitudes and behaviors - or traits - that the Schein model encourages creates a form versus substance trap.  This emphasis carries over to safety culture surveys - currently the linchpin for identifying and “correcting” deficient safety culture - and even doubles down by measuring the perception of attitudes and behaviors.  While attitudes and behaviors may in fact have a beneficial effect on the organizational environment in which people perform - we view them as good habits - we are not convinced they are the only determinants of the actions, decisions and choices made by the organization.  Is it possible that this approach creates an organization more concerned with how it looks and how it is perceived than with what it does?  If everyone is checking their safety likeness in the cultural mirror, might this distract from focusing on how and why actual safety-related decisions are being made?

We think there is good support for our skepticism.  For every significant safety event in recent years - the BP refinery fire, the Massey coal mine explosion, the shuttle disasters, the Deepwater Horizon oil rig explosion, and the many instances of safety culture issues at nuclear plants - the organization and its senior management had been espousing the belief that “safety is the highest priority.”  Clearly that was more illusion than reality.

To give a final upward thrust to the apple cart, we don’t think that the current focus on nuclear safety culture is primarily about culture.  Rather we see “safety culture” more as a proxy for management’s safety performance - and perhaps a back door for the NRC to regulate while disclaiming same.*** 

*  We have mentioned Prof. Schein in several prior blog posts: June 26, 2012, December 8, 2011, August 11, 2010, March 29, 2010, and August 17, 2009.

**  This past year we have posted several times on decisions as one type of visible result (artifact) of the many variables that influence organizational behavior.  In addition, please revisit two of Prof. Perin’s case studies, summarized here.  They describe well-intentioned people, who probably would score well on a safety culture survey, who made plant problems much worse through a series of decisions that had many more influences than management’s entreaties and staff’s underlying beliefs.

***  Back in 2006, the NRC staff proposed to enhance the ROP to more fully address safety culture, saying that “Safety culture includes . . . features that are not readily visible such as basic assumptions and beliefs of both managers and individuals, which may be at the root cause of repetitive and far-reaching safety performance problems.”  It wouldn’t surprise us if that’s an underlying assumption at the agency.  See L.A. Reyes to the Commissioners, SECY-06-0122 “Policy Issue Information: Safety Culture Initiative Activities to Enhance the Reactor Oversight Process and Outcomes of the Initiatives” (May 24, 2006) p. 7 ADAMS ML061320282.  

Tuesday, September 4, 2012

More on Cynefin

Bob Cudlin recently posted on the work of David Snowden, a decision theorist and originator of the Cynefin decision construct.  Snowden’s Cognitive Edge website has a lot of information related to Cynefin, perhaps too much to swallow at once.  For those who want an introduction to the concepts, focusing on their implications for decision-making, we suggest a paper “Cynefin: repeatability, science and values”* by Prof. Simon French.

In brief, the Cynefin model divides decision contexts into four spaces: Known (or Simple), Knowable (or Complicated), Complex and Chaotic.  Knowledge about cause-and-effect relationships (and thus, appropriate decision making approaches) differs for each space.  In the Simple space, cause-and-effect is known and rules or processes can be established for decision makers; “best” practices are possible.  In the Complicated space, cause-and-effect is generally known but individual decisions require additional data and analysis, perhaps with probabilistic attributes; different practices may achieve equal results.  In the Complex space, cause-and-effect may only be identified after an event takes place so decision making must work on broad, flexible strategies that can be adjusted as a situation evolves; new practices emerge.  In the Chaotic space, there are no applicable analysis methods so decision makers must try things, see what happens and attempt to stabilize the situation; a novel (one-off) practice obtains.   

The model in the 2008 French paper is not in complete accord with the Cynefin model currently described by Snowden but French’s description of the underlying considerations for decision makers remains useful.  French’s paper also relates Cynefin to the views of other academics in the field of decision making.  

For an overview of Cynefin in Snowden’s own words, check out “The Cynefin Framework” on YouTube.  There he discusses a fifth space, Disorder, which is basically where a decision maker starts when confronted with a new decision situation.  Importantly, a decision maker will instinctively try to frame the decision in the Cynefin decision space most familiar to the decision maker based on personal history, professional experience, values and preference for action. 

In addition, Snowden describes the boundary between the Simple and Chaotic as the “complacent zone,” a potentially dangerous place.  In the Simple space, the world appears well-understood but as near-misses and low-signal events are ignored, the system can drift toward the boundary and slip into the Chaotic space where a crisis can arise and decision makers risk being overwhelmed.

Both decision maker bias and complacency present challenges to maintaining a strong safety culture.  The former can lead to faulty analysis of problems, forcing complex issues with multiple interactive causes through a one-size-fits-all solution protocol.  The latter can lead to disasters, great and small.  We have posted many times on the dangers of complacency.  To access those posts, click “complacency” in the Labels box.

*  S. French, “Cynefin: repeatability, science and values,” Newsletter of the European Working Group “Multiple Criteria Decision Aiding,” series 3, no. 17 (Spring 2008).  Thanks to Bill Mullins for bringing this paper to our attention. 

Thursday, August 30, 2012

Failure to Learn

In this post we call your attention to a current research paper* and a Wall Street Journal summary article** that shed some light on how people make decisions to protect against risk.  The specific subject of the research is the response to imminent risk of house damage from hurricanes.  As the author of the paper states, “The purpose of this paper is to attempt to resolve the question of whether there are, in fact, inherent limits to our ability to learn from experience about the value of protection against low-probability, high-consequence, events.” (p.3)  Also of interest is how the researchers used several simulations to gain insight and quantify how the decisions compared to optimal risk mitigation.

Are these results directly applicable to nuclear safety decisions?  We think not.  But they are far from irrelevant.  They illustrate the value of careful and thoughtful research into the how and why of decisions, the impact of the decision environment and the opportunities for learning to produce better decisions.  They also raise the question: Where is the nuclear industry on this subject?  Nuclear managers routinely make what are probably the most safety-significant decisions of any industry.  But how good are those decisions, and what determines their quality?  The industry might contend that the emphasis on safety culture (meaning values and traits) is the sine qua non for assuring decisions that adequately reflect safety.  Bad decision?  Must have been bad culture.  Reiterate culture, assume better decisions follow.  Is this right, or is safety culture the wrong blanket - or just too small a blanket - to cover a decision process evolving from a complex adaptive system?

The basic construct for the first simulation was a contest among participants (college students) with the potential to earn a small cash bonus based on achieving certain performance results.  Each participant was made the owner of a house in a coastal area subject to hurricane intrusion.  During the simulation animation, a series of hurricanes would materialize in the ocean and approach land.  The position, track and strength of each hurricane were continuously updated.  Prior to landfall, participants had the choice of purchasing protection against damage for that specific storm, either partial or full protection.  The objective was to maximize total net assets, i.e., the value of the house less any uncompensated damage and less the cost of any purchased protection.

While the first simulation focused on recurrent short-term mitigation decisions, in the second simulation participants had the option to purchase protection that would last at least the full season but had to be purchased before a storm occurred.  (A comprehensive description of the simulation and test data are provided in the referenced paper.)

The results indicated that participants significantly under-protected their homes, incurring losses well above what a “rational” approach to purchasing protection would have produced.  While part of the losses was due to purchasing protection unnecessarily, most was due to under-protection.  The main driver, according to the researchers, appeared to be that participants over-relied on their most recent experience instead of an objective assessment of current risk.  In other words, if in a prior hurricane they experienced no damage, either because of the storm’s track or because they had purchased protection, they were less inclined to purchase protection for the next hurricane.
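The recency-driven under-protection the researchers describe can be illustrated with a toy model.  The sketch below is ours, not the paper’s (the dollar amounts, hit probability and the two decision rules are illustrative assumptions): a “rational” owner buys protection whenever expected damage exceeds the premium, while a “recency” owner buys only after being burned by the previous storm.

```python
import random

def simulate_season(n_storms, strategy, house_value=100_000,
                    protection_cost=2_000, damage=20_000,
                    hit_prob=0.3, seed=0):
    """Toy version of the per-storm protection choice described above.

    strategy(last_hit, last_protected) -> True to buy protection for
    the coming storm.  Returns net assets: house value less protection
    costs and uncompensated damage.  All numbers are illustrative.
    """
    rng = random.Random(seed)
    assets = house_value
    last_hit = last_protected = False
    for _ in range(n_storms):
        protect = strategy(last_hit, last_protected)
        if protect:
            assets -= protection_cost
        hit = rng.random() < hit_prob
        if hit and not protect:
            assets -= damage          # uncompensated damage
        last_hit, last_protected = hit, protect
    return assets

# "Rational" rule: buy whenever expected damage (0.3 * 20,000 = 6,000)
# exceeds the 2,000 premium -- with these numbers, that is every storm.
def rational(last_hit, last_protected):
    return 0.3 * 20_000 > 2_000

# "Recency" rule: buy only if the previous storm caused damage.
def recency(last_hit, last_protected):
    return last_hit and not last_protected

print(simulate_season(20, rational))   # always protected
print(simulate_season(20, recency))    # result depends on storm sequence
```

With these numbers the rational owner ends every 20-storm run with exactly 60,000 (twenty premiums paid, no uncompensated damage), while the recency owner’s outcome varies with the storm sequence and is typically lower, echoing the paper’s finding that recent experience is a poor substitute for an objective risk assessment.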

The simulations reveal limits on participants’ ability to improve their decisions in what was, in essence, a trial-and-error environment.  Feedback occurred after each storm, but participants did not necessarily use it in an optimal manner “due to a tendency to excessively focus on the immediate disutility of cost outlays” (p. 10).  In any event, it is clear that the nuclear safety decision making environment is “not ideal for learning—…[since] feedback is rare and noisy…” (p. 5).  In fact, most feedback in nuclear operations might appear to be affirming, since decisions to take short-term risks rarely result in bad outcomes.  It is an environment more susceptible to complacency than to learning.

The author concludes with a final question: can non-optimal decision making, such as that observed in the simulations, be overcome?  He writes, “This may be difficult since the psychological mechanisms that lead to the biases may be hard-wired; as long as we remain present-focused, prone to chasing short-term rewards and avoiding short term punishment, it is unlikely that individuals and institutions will learn to undertake optimal levels of protective investment by experience alone. The key, therefore, is introducing decision architectures that allow individuals to overcome these biases through, for example, creative use of defaults…” (pp. 30-31)
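The paper’s closing suggestion about “creative use of defaults” can be made concrete with a small, hypothetical sketch (ours, not the author’s): if protection renews automatically unless the owner opts out, the same present-focused inertia that previously left houses exposed now works to keep them covered.

```python
def buys_protection(owner_acts: bool, default_is_protected: bool) -> bool:
    """Whether protection is in force for the coming storm.

    owner_acts -- True if the owner actively reverses the default.
    A present-focused owner usually does nothing (owner_acts=False),
    so the default determines the outcome.
    """
    return default_is_protected != owner_acts  # acting flips the default

# Opt-out architecture: a passive owner stays protected.
print(buys_protection(owner_acts=False, default_is_protected=True))   # True
# Opt-in architecture: the same passive owner stays exposed.
print(buys_protection(owner_acts=False, default_is_protected=False))  # False
```

The point of the sketch is that the decision architecture, not the owner’s (unchanged) psychology, is what determines the outcome.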

*  R.J. Meyer, “Failing to Learn from Experience about Catastrophes: The Case of Hurricane Preparedness,” The Wharton School, University of Pennsylvania Working Paper 2012-05 (March 2012).

** C. Shea, “Failing to Learn From Hurricane Experience, Again and Again,” Wall Street Journal (Aug. 17, 2012).

Tuesday, August 28, 2012

Confusion of Properties and Qualities

In this post we highlight a provocative and, we believe, accurate criticism of the approach taken by many management scientists in focusing on behaviors as the determinant of desired outcomes.  The source is Dave Snowden, a Welsh lecturer, consultant and researcher in the field of knowledge management.  For those of you interested in finding out more about him, the website for Cognitive Edge, founded by Snowden, contains an abundance of accessible content.

Snowden is a proponent of applying complexity science to inform managers’ decision making and actions.  He is perhaps best known for developing the Cynefin framework, which is designed to help managers understand their operational context based on four archetypes: simple, complicated, complex and chaotic.  In considering the archetypes, one can see how various aspects of nuclear operations might fit within the simple or complicated domains, where tools such as best practices and root cause analysis are applicable.  But one can also see the limitations of those domains in more complex situations, particularly those involving the nuanced safety decisions that are at the heart of nuclear safety culture.  Snowden describes “complex adaptive systems” as ones where the system and its participants evolve together through ongoing interaction and influence, and system behavior is “emergent” from that process.  Perhaps most provocative for nuclear managers is his contention that complex adaptive systems are “non-causal” in nature, meaning one shouldn’t think in terms of linear cause and effect and shouldn’t expect root cause analysis to provide the needed insight into system failures.
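As a quick reference, the four archetypes can be laid out alongside the decision sequence and practice type Snowden pairs with each in his published framework; the sketch itself is our own summary, not something from the lecture discussed here.

```python
# Snowden's Cynefin domains, each paired with the decision sequence
# and practice type from his published framework.
CYNEFIN = {
    "simple":      ("sense -> categorize -> respond", "best practice"),
    "complicated": ("sense -> analyze -> respond",    "good practice"),
    "complex":     ("probe -> sense -> respond",      "emergent practice"),
    "chaotic":     ("act -> sense -> respond",        "novel practice"),
}

def describe(domain: str) -> str:
    sequence, practice = CYNEFIN[domain]
    return f"{domain}: {sequence} ({practice})"

# Root cause analysis fits the "complicated" domain...
print(describe("complicated"))
# ...while nuanced safety decisions may sit in the "complex" domain,
# where practice emerges rather than being analyzed in advance.
print(describe("complex"))
```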

With all that said, we want to focus on a quote from one of Snowden’s 2008 lectures, “Complexity Applied to Systems.”*  At approximately the 15:00 mark, he comments on a “fundamental error of logic” he calls “confusion of properties and qualities.”  He says:

“...all of management science, they observe the behaviors of people who have desirable properties, then try to achieve those desirable properties by replicating the behaviors”.

By way of a pithy illustration Snowden says, “...if I go to France and the first ten people I see are wearing glasses, I shouldn’t conclude that all Frenchmen wear glasses.  And I certainly shouldn’t conclude if I put on glasses, I will become French.”

For us, Snowden’s observation connects directly to the approach being implemented across the nuclear enterprise.  Think about the common definitions of safety culture adopted by the NRC and industry.  The NRC definition specifies “... the core values and behaviors…” and “Experience has shown that certain personal and organizational traits are present in a positive safety culture. A trait, in this case, is a pattern of thinking, feeling, and behaving that emphasizes safety, particularly in goal conflict situations, e.g., production, schedule, and the cost of the effort versus safety.”**

INPO defines safety culture as “An organization's values and behaviors – modeled by its leaders and internalized by its members…”***

In keeping with these definitions, the NRC and industry rely heavily on the results of safety culture surveys to identify areas in need of improvement.  These surveys overwhelmingly focus on whether nuclear personnel are “modeling” the definitional traits, values and behaviors.  This falls squarely in the realm Snowden describes: looking to replicate behaviors in hopes of achieving the desired culture and results.  Most often, identified deficiencies are addressed with retraining to reinforce the desired safety culture traits.  But what seems to be lacking is a determination of why the traits were not exhibited in the first place.  Follow-up surveys may be conducted periodically, again to measure compliance with traits.  This recipe is considered sufficient until the next time a licensee’s decisions or actions come under suspicion.

Bottom Line

The nuclear enterprise - NRC and industry - appears to be locked into a simplistic, linear view of safety culture.  Values and traits produce desired behaviors; desired behaviors produce appropriate safety management.  Bad results?  Go back to values and traits and retrain.  Have management reiterate that safety is their highest priority.  Put up more posters.

But what if Snowden’s concept of complex adaptive systems is really the applicable model, and the safety management system is a far more complex, continuously self-evolving process?  It is a question well worth pondering - one that may have far more impact than many of the hardware-centric issues currently being pursued.

Footnote: Snowden is an immensely informative and entertaining lecturer, and many of his lectures are available as podcasts on the Cognitive Edge website and as YouTube videos.  They could easily provide stimulating input to safety culture training sessions.

*  Podcast available at 

**  NRC Safety Culture Policy Statement (June 14, 2011).

***  INPO Definition of Safety Culture (2004).