Tuesday, April 17, 2018

Nuclear Safety Culture: Insights from Principles by Ray Dalio

Book cover
Ray Dalio is the billionaire founder/builder of Bridgewater Associates, an investment management firm.  Principles* catalogs his policies, practices and lessons learned for understanding reality and making decisions to achieve goals in that reality.  The book appears to cover every possible aspect of managerial and organizational behavior.  Our plan is to focus on two topics near and dear to us—decision making and culture—for ideas that could help strengthen nuclear safety culture (NSC).  We will then briefly summarize some of Dalio’s other thoughts on management.  Key concepts are shown in italics.

Decision Making

We’ll begin with Dalio’s mental model of reality.  Reality is a system of universal cause-effect relationships that repeat and evolve like a perpetual motion machine.  The system dynamic is driven by evolution (“the single greatest force in the universe” (p. 142)) which is the process of adaptation.

Because many situations repeat themselves, principles (policies or rules) advance the goal of making decisions in a systematic, repeatable way.  Any decision situation has two major steps: learning (obtaining and synthesizing data about the current situation) and deciding what to do.  Logic, reason and common sense are the primary decision making mechanisms, supported by applicable existing principles and tools, e.g., expected value calculations or evidence-based decision making tools.  The lessons learned from each decision situation can be incorporated into existing or new principles.  Practicing the principles develops good habits, i.e., automatic, reflexive behavior in the specified situations.  Ultimately, the principles can be converted into algorithms that can be computerized and used to support the human decision makers.
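As a toy illustration of the principles-to-algorithms idea, an expected value calculation of the sort Dalio mentions reduces to a few lines of code.  The sketch below is our own minimal construction, not anything from the book; the options, probabilities and payoffs are hypothetical.

```python
# Minimal sketch of a principle converted into an algorithm: choose the option
# with the highest expected value.  Options, probabilities and payoffs are hypothetical.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

options = {
    "upgrade equipment": [(0.7, 500), (0.3, -200)],   # likely gain, modest downside
    "defer maintenance": [(0.9, 100), (0.1, -1000)],  # small gain likely, rare large loss
}

for name, outcomes in options.items():
    print(f"{name}: EV = {expected_value(outcomes):+.0f}")

best = max(options, key=lambda name: expected_value(options[name]))
print("Highest expected value:", best)  # upgrade equipment (EV +290 vs. -10)
```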

Believability weighting can be applied during the decision making process to obtain data or opinions about solutions.  Believable people can be anyone in the organization but are limited to those “who 1) have repeatedly and successfully accomplished the thing in question, and 2) . . . can logically explain the cause-effect relationships behind their conclusions.” (p. 371)  Believability weighting supplements and challenges responsible decision makers but does not overrule them.  Decision makers can also make use of thoughtful disagreement where they seek out brilliant people who disagree with them to gain a deeper understanding of decision situations.
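Believability weighting lends itself to a similarly simple sketch: treat each person's view as a score weighted by their believability.  The weights and scores below are invented for illustration; Bridgewater's actual tools are certainly far more elaborate.

```python
# Sketch of believability-weighted aggregation of opinions.  In Dalio's scheme a
# person's weight would reflect their track record on the question at hand and
# their ability to explain their reasoning; these numbers are hypothetical.
def believability_weighted_score(opinions):
    """opinions: list of (believability_weight, score) pairs; returns the weighted average."""
    total_weight = sum(w for w, _ in opinions)
    return sum(w * s for w, s in opinions) / total_weight

# Three people rate a proposed action from 0 (bad) to 10 (good).
opinions = [
    (0.9, 3),  # veteran who has repeatedly done this successfully
    (0.5, 8),  # manager with partial relevant experience
    (0.2, 9),  # newcomer with little track record
]
print(f"Weighted score: {believability_weighted_score(opinions):.2f}")  # 5.31
```

Note that the weighted result (5.31) sits well below the simple average (6.67), pulled down by the skeptical veteran, which is exactly the point of the technique.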

The organization needs a process to get beyond disagreement.  After all discussion, the responsible party exercises his/her decision making authority.  Ultimately, those who disagree have to get on board (“get in sync”) and support the decision or leave the organization.

The two biggest barriers to good decision making are ego and blind spots.  Radical open-mindedness recognizes that the search for what’s true and for the best answer is more important than the need of any specific person, no matter their position in the organization, to be right.

Culture

Organizations and the individuals who populate them should also be viewed as machines.  Both are imperfect but capable of improving. The organization is a machine made up of culture and people that produces outcomes that provide feedback from which learning can occur.  Mistakes are natural but it is unacceptable to not learn from them.  Every problem is an opportunity to improve the machine.  

People are generally imperfect machines.  People are more emotional than logical.   They suffer from ego (subconscious drivers of thoughts) and blind spots (failure to see weaknesses in themselves).  They have different character attributes.  In short, people are all “wired” differently.  A strong culture with clear principles is needed to get and keep everyone in sync with each other and in pursuit of the organization’s goals.

Mutual adjustment takes place when people interact with culture.  Because people are different and the potential to change their wiring is low,** it is imperative to select new employees who will embrace the existing culture.  If they can’t or won’t, or lack ability, they have to go.  Even with its stringent hiring practices, about a third of Bridgewater’s new hires are gone within eighteen months.

Human relations are built on meaningful relationships, radical truth and tough love.  Meaningful relationships means people give more consideration to others than themselves and exhibit genuine caring for each other.  Radical truth means you are “transparent with your thoughts and open-mindedly accepting the feedback of others.” (p. 268)  Tough love recognizes that criticism is essential for improvement towards excellence; everyone in the organization is free to criticize any other member, no matter their position in the hierarchy.  People have an obligation to speak up if they disagree. 

“Great cultures bring problems and disagreements to the surface and solve them well . . .” (p. 299)  The culture should support a five-step management approach: Have clear goals, don’t tolerate problems, diagnose problems when they occur, design plans to correct the problems, and do what’s necessary to implement the plans, even if the decisions are unpopular.  The culture strives for excellence so it’s intolerant of folks who aren’t excellent and goal achievement is more important than pleasing others in the organization.

More on Management 


Dalio’s vision for Bridgewater is “an idea meritocracy in which meaningful work and meaningful relationships are the goals and radical truth and radical transparency are the ways of achieving them . . .” (p. 539)  An idea meritocracy is “a system that brings together smart, independent thinkers and has them productively disagree to come up with the best possible thinking and resolve their disagreements in a believability-weighted way . . .” (p. 308)  Radical truth means “not filtering one’s thoughts and one’s questions, especially the critical ones.” (ibid.)  Radical transparency means “giving most everyone the ability to see most everything.” (ibid.)

A person is a machine operating within a machine.  One must be one’s own machine designer and manager.  In managing people and oneself, take advantage of strengths and compensate for weaknesses via guardrails and soliciting help from others.  An example of a guardrail is assigning a team member whose strengths balance another member’s weaknesses.  People must learn from their own bad decisions so self-reflection after making a mistake is essential.  Managers must ascertain whether mistakes are evidence of a weakness, and whether that weakness calls for compensatory action or, if it is intolerable, termination.  Because values, abilities and skills are the drivers of behavior, management should have a full profile for each employee.

Governance is the system of checks and balances in an organization.  No one is above the system, including the founder-owner.  In other words, senior managers like Dalio can be subject to the same criticism as any other employee.

Leadership in the traditional sense (“I say, you do”) is not so important in an idea meritocracy because the optimal decisions arise from a group process.  Managers are seen as decision makers, system designers and shapers who can visualize a better future and then build it.   Leaders “must be willing to recruit individuals who are willing to do the work that success requires.” (p. 520)

Our Perspective

We recognize international investment management is way different from nuclear power management so some of Dalio’s principles can only be applied to the nuclear industry in a limited way, if at all.  One obvious example of a lack of fit is the area of risk management.  The investing environment is extremely competitive with players evolving rapidly and searching for any edge.  Timely bets (investments) must be made under conditions where the risk of failure is many orders of magnitude greater than what is acceptable in the nuclear industry.  Other examples include the relentless, somewhat ruthless, pursuit of goals and a willingness to jettison people that is foreign to the utility world.

But we shouldn’t throw the baby out with the bathwater.  While Dalio’s approach may be too extreme for wholesale application in your environment, it does provide a comparison (note we don’t say “standard”) for your organization’s performance.  Does your decision making process measure up to Dalio’s in terms of robustness, transparency and the pursuit of truth?  Does your culture really strive for excellence (and eliminate those who don’t share that vision) or is it an effort constrained by hierarchical, policy or political realities?

This is a long book but it’s easy to read and key points are repeated often.  Not all of it is novel; many of the principles are based on observations or techniques that have been around for a while and should be familiar to you.  For example, ideas about how human minds work are drawn, in part, from Daniel Kahneman; an integrated hierarchy of goals looks like Management by Objectives; and a culture that doesn’t automatically punish people for making mistakes or tolerable errors sounds like a “just culture” albeit with some mandatory individual learning attached.

Bottom line: Give this book a quick look.  It can’t hurt and might help you get a clearer picture of how your own organization actually operates.



*  R. Dalio, Principles (New York: Simon & Schuster, 2017).  This book was recommended to us by a Safetymatters reader.  Please contact us if you have any material you would like us to review.

**  A person’s basic values and abilities are relatively fixed, although skills may be improved through training.

Friday, December 1, 2017

Nuclear Safety Culture: Focus on Decision Making



McKinsey Five Fifty cover
We have long held that decision making (DM) is a key artifact reflecting the state of a nuclear organization’s safety culture.

The McKinsey Quarterly (MQ) has packaged a trio of articles* on DM.  Their first purpose is identifying and countering the different biases that lead to sub-optimal, even disastrous decisions.  (When specific biases are widely spread in an organization, they are part of its culture.)  A second purpose is to describe the attributes of more fair, robust and effective DM processes.  The articles’ specific topics are (1) the behavioral science that underlies DM, (2) a method for categorizing and processing decisions and (3) a case study of a major utility that changed its decision culture. 

“The case for behavioral strategy” (MQ, March 2010)

This article covers the insights from psychology that can be used to fashion a robust DM process.  The authors demonstrate the need for process improvement by reporting their survey research results showing over 50 percent of the variability in decisional results (i.e., performance) was determined by the quality of the DM process while less than 10 percent was caused by the quality of the underlying analysis.

There are plenty of cognitive biases that can affect human DM.  The authors discuss several of them and strategies for counteracting them, as summarized below.

  • False pattern recognition (e.g., saliency (overweighting recent or memorable events), confirmation, inaccurate analogies): require alternative explanations for the data in the analysis, articulate participants’ relevant experiences (which can reveal the basis for their biases), identify similar situations for comparative analysis.
  • Bias for action: explicitly consider uncertainty in the input data and the possible outcomes.
  • Stability (anchoring to an initial value, loss aversion): establish stretch targets that can’t be achieved by business as usual.
  • Silo thinking: involve a diverse group in the DM process and define specific decision criteria before discussions begin.
  • Social (conformance to group views): create genuine debate through a diverse set of decision makers, a climate of trust and depersonalized discussions.


The greatest problem arises from biases that create repeatable patterns which become undesirable cultural traits.  DM process designers must identify the types of biases that arise in their organization’s DM, specify debiasing techniques that will work in their organization, and embed those techniques in formal DM procedures.

An attachment to the article identifies and defines 17 specific biases.  Much of the seminal research on DM biases was performed by Daniel Kahneman who received a Nobel prize for his efforts.  We have reviewed Prof. Kahneman’s work on Safetymatters; see our Nov. 4, 2011 and Dec. 18, 2013 posts or click on the Kahneman label. 

“Untangling your organization’s decision making” (MQ, June 2017)

While this article is aimed at complex, global organizations, there are lessons here for nuclear organizations (typically large bureaucracies) because all organizations have become victims of over-abundant communications, with too many meetings and low-value e-mail threads distracting members from paying attention to making good decisions.

The authors posit four types of decisions an organization faces, plotted on a 2x2 matrix (the consultant’s best friend) with scope and impact (broad or narrow) on one axis and level of familiarity (infrequent or frequent) on the other.  A different DM approach is proposed for each quadrant. 

Big-bet decisions are infrequent and have broad impact.  Recommendations include (1) ensure there’s an executive sponsor, (2) break down the mega-decision into manageable parts for analysis (and reassemble them later), (3) use a standard DM approach for all the parts and (4) establish a mechanism to track effectiveness during decision implementation.

The authors observe that some decisions turn out to be “bet the company” ones without being recognized as such.  There are examples of this in the nuclear industry.  For details, see our June 18, 2013 post on Kewaunee (had only an 8-year power purchase agreement), Crystal River (tried to cut through the containment using in-house expertise) and SONGS (installed replacement steam generators with an unproven design).

Cross-cutting decisions are more frequent and have broad impact.  Some decisions at a nuclear power plant fall into this category.  They need to have the concurrence and support of the Big 3 stakeholders (Operations, Engineering and Maintenance).  Silo attitudes are an omnipresent threat to success in making these kinds of decisions.  The key is to get the stakeholders to agree on the main process steps and document them in a plain-English procedure that defines the calendar, handoffs and decisions.  Governing policy should establish the DM bodies and their authority, and define shared performance metrics to measure success.

Delegated decisions are frequent and low-risk.  They can be effectively handled by an individual or working team, with limited input from others.  The authors note “The role-modeling of senior leaders is invaluable, but they may be reluctant” to delegate.  We agree.  In our experience, many nuclear managers were hesitant to delegate as many decisions as they could have to subordinates.  Their fear of being held accountable for a screw-up was just too great.  However, their goal should have been to delegate all decisions except those for which they alone had the capabilities and accountability.  Subordinates need appropriate training and explicit authority to make their decisions and they need to be held accountable by higher-level managers.  The organization needs to establish a clear policy defining when and how a decision should be elevated to a more senior decision maker. 

Ad hoc decisions are infrequent and low-risk; they were deliberately omitted from the article. 
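To make the 2x2 categorization concrete, here is a minimal sketch that maps a decision’s scope and familiarity to one of the four quadrants.  It is our own construction for illustration, not code from the article.

```python
# Sketch of the McKinsey 2x2 decision categorization: scope/impact (broad or narrow)
# on one axis, familiarity (infrequent or frequent) on the other.
def decision_type(broad_impact: bool, frequent: bool) -> str:
    if broad_impact and not frequent:
        return "big-bet"        # infrequent, broad impact
    if broad_impact and frequent:
        return "cross-cutting"  # frequent, broad impact
    if frequent:
        return "delegated"      # frequent, low-risk
    return "ad hoc"             # infrequent, low-risk

print(decision_type(broad_impact=True, frequent=False))   # big-bet
print(decision_type(broad_impact=False, frequent=True))   # delegated
```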

“A case study in combating bias” (MQ, May 2017)

This is an interview with a senior executive of a German utility that invested €10 billion in conventional power projects, investments that failed when the political-economic environment evolved in a direction opposite to their assumptions.  In their postmortem, they realized they had succumbed to several cognitive biases, including status quo, confirmation, champion and sunflower.  The sunflower bias (groups aligning with their leaders) stretched far down the organizational hierarchy so lower-level analysts didn’t dare to suggest contrary assumptions or outcomes.

The article describes how the utility made changes to their DM practices to promote awareness of biases and implement debiasing techniques, e.g., one key element is the official designation of “devil’s advocates” in DM groups.  Importantly, training emphasizes that biases are not some personal defect but “just there,” i.e., part of the culture.  The interviewee noted that the revised process is very time-intensive so it is utilized only for the most important decisions facing each user group.

Our Perspective 

The McKinsey content describes executive level, strategic DM but many of the takeaways are equally applicable to decisions made at the individual, department and inter-department level, where a consistent approach is perhaps even more important in maintaining or improving organizational performance.

The McKinsey articles come in one of their Five Fifty packages, with a summary you can review in five minutes and the complete articles that may take fifty minutes total.  You should invest at least the smaller amount.


*  “Better Decisions,” McKinsey Quarterly Five Fifty.  Retrieved Nov. 28, 2017.

Tuesday, November 21, 2017

Any Lessons for Nuclear Safety Culture from VW’s Initiative to Improve Its Compliance Culture?

VW Logo (Source: Wikipedia)
The Wall Street Journal (WSJ) recently published an interview* with the head of the new compliance department in Volkswagen’s U.S. subsidiary.  The new executive outlined the department’s goals and immediate actions related to improving VW’s compliance culture.  They will all look familiar to you, including a new organization (headed by a former consultant) reporting directly to the CEO and with independent access to the board; mandatory compliance training; a new code of conduct; and developing a questioning attitude among employees.  One additional attribute deserves a brief expansion.  VW aims to improve employees’ decision making skills.  We’re not exactly sure what that means but if it includes providing more information about corporate policies and legal, social and regulatory expectations (in other words, the context of decisions) then we approve.

Our Perspective 


These interventions could be from a first generation nuclear safety culture (NSC) handbook on efforts to demonstrate management interest and action when a weak culture is recognized.  Such activities are necessary but definitely not sufficient to strengthen culture.  Some specific shortcomings follow.

First, the lack of reflection.  When asked about the causes of VW’s compliance failures, the executive said “I can’t speculate on the failures . . .”  Well, she should have had something to say on the matter, even party line bromides.  We’re left with the impression she doesn’t know, or care, about the specific and systemic causes of VW’s “Dieselgate” problems that are costing the company tens of billions of dollars.  After all, this interview was in the WSJ, available to millions of critical readers, not some trade rag.

Second, the trust issue.  VW wants employees who can be trusted by the organization, presumably to do “the right thing” as they go about their business.  That’s OK but it’s even more important to have senior managers who can be trusted to do the right thing.  This is especially relevant for VW because it’s pretty clear the cheating problems were tolerated, if not explicitly promoted, by senior management; in other words, there was a top-down issue in addition to lower-level employee malfeasance.

Next, the local nature of the announced interventions.  The new compliance department is for VW-USA only.  The Volkswagen Group of America includes one assembly plant, sales and maintenance support functions, test centers and VW’s consumer finance entity.  It’s probably safe to say that VW’s most important decisions regarding corporate practices and product engineering are made in Wolfsburg, Lower Saxony and not Herndon, Virginia.

Finally, the elephant in the room.  There is no mention of VW’s employee reward and recognition system or the senior management compensation program.  We have long argued that employees focus on actions that will secure their jobs (and perhaps lead to promotions) while senior managers focus on what they’re being paid to accomplish.  For the latter group in the nuclear industry, that’s usually production with safety as a should-do but with little, if any, money attached.  We don’t believe VW is significantly different.

Bottom line: If this WSJ interview is representative of the auto industry’s understanding of culture, then once again nuclear industry thought leaders have a more sophisticated and complete grasp of cultural dynamics and nuances.

We have commented before on the VW imbroglio.  See our Dec. 20, 2015 and May 31, 2016 posts or click on the VW label.


*B. DiPietro, “Working to Change Compliance Culture at Volkswagen,” Wall Street Journal (Nov. 16, 2017).

Monday, October 30, 2017

Nuclear Safety Culture Under Assault: DNFSB Chairman Proposes Eliminating the Board


DNFSB headquarters
The Center for Public Integrity (CPI) recently published a report* that disclosed a private letter** from Sean Sullivan, the Chairman of the Defense Nuclear Facilities Safety Board (DNFSB) to the Director of the Office of Management and Budget in which the chairman proposed abolishing or downsizing the DNFSB.  The CPI is highly critical of the chairman’s proposals; support for their position includes a list of the safety improvements in the Department of Energy (DOE) complex that have resulted from DNFSB recommendations and the safety challenges that DOE facilities continue to face.

The CPI also cites a 2014 National Nuclear Security Administration (NNSA, the DOE sub-organization that oversees the nuclear weapons facilities) internal report that describes NNSA’s own safety culture weaknesses, e.g., lack of a questioning attitude toward contractor management’s performance claims, with respect to its oversight of the Los Alamos National Laboratory.

The CPI believes the chairman is responding to pressure from the private contractors who actually manage DOE facilities to reduce outside interference in, and oversight of, contractor activities.  That’s certainly plausible.  The contractors get paid regardless of their level of performance, and very little of that pay is tied to safety performance.  DNFSB recommendations and reports can be thorns in the sides of contractor management.

The Sullivan Letter

The primary proposal in the Sullivan letter is to abolish the DNFSB because the DOE has developed its own “robust regulatory structure” and oversight capabilities via the Office of Enterprise Assessments.  That’s a hollow rationale; the CPI report discusses the insufficiency of DOE’s own assessments.  If outright elimination is not politically doable, then DNFSB personnel could be transferred to DOE, sustaining the appearance of independent oversight, and then be slowly absorbed into the larger DOE organization.  That is not a path to increased public confidence and looks like being assimilated by the Borg.***  The savings that could be realized from abolishing the DNFSB are estimated at $31 million, a number lost in the decimal dust of DOE’s $30+ billion budget.

Sullivan mentions but opposes transferring the DNFSB’s oversight responsibilities to the Nuclear Regulatory Commission.  Why?  Because the NRC is not only independent, it has enforcement powers which would be inappropriate for defense nuclear facilities and might compromise national security.  That’s a red herring but we’ll let it go; we don’t think oversight of defense facilities really meshes with the NRC’s mission.

His secondary proposal is to downsize the DNFSB workforce, especially its management structure, and transfer most of the survivors to specific defense facilities.  While we think DNFSB needs more resources, not fewer, it would be better if more DNFSB personnel were located in the field, keeping track of and reporting on DOE and contractor activities.

Our Perspective

Safetymatters first became interested in the DNFSB when we saw the growing mess at the Waste Treatment Plant (WTP, aka the Vit Plant) in Hanford, WA.  It was the DNFSB who forced the DOE and its WTP contractors to confront and remediate serious nuclear safety culture (NSC) problems.  We have published multiple reports on the resultant foot-dragging by DOE in its responses to DNFSB Recommendation 2011-1 which addressed safety conscious work environment (SCWE) problems at Hanford and other DOE facilities.  Click on the DOE label to see our offerings on WTP, other DOE facilities and the overall DOE complex.
 
We have reported on the NSC problems at the Waste Isolation Pilot Plant (WIPP) in New Mexico.  The DNFSB has played an important role in attempting to get DOE and the WIPP contractor to strengthen their safety practices.  Click the WIPP label to see our WIPP-related posts. 

We have also covered a report on the DNFSB’s own organizational issues, including board members’ meddling in day-to-day activities, weak leadership and too-frequent organizational changes.  See our Feb. 6, 2015 post for details.

DNFSB’s internal issues notwithstanding, the board plays an indispensable role in strengthening NSC and safety practices throughout the DOE complex.  They should be given greater authority (which won’t happen), stronger leadership and additional resources.

Bottom line: Sullivan’s proposal is just plain nuts.  He’s a Republican appointee so maybe he’s simply offering homage to his ultimate overlord.
  

*  P. Malone and R.J. Smith, “GOP chair of nuclear safety agency secretly urges Trump to abolish it,” The Center for Public Integrity (Oct. 19, 2017).  Retrieved Oct. 26, 2017.

**  S. Sullivan (DNFSB) to J.M. Mulvaney (Office of Management and Budget), no subject specified but described as an “initial high-level draft of [an] Agency Reform Plan” (June 29, 2017).  Available from the CPI in html and pdf format.  Retrieved Oct. 26, 2017.

***  The Borg is an alien group entity in Star Trek that forcibly assimilates other beings.  See Wikipedia for more information.

Monday, October 16, 2017

Nuclear Safety Culture: A Suggestion for Integrating “Just Culture” Concepts

All of you have heard of “Just Culture” (JC).  At heart, it is an attitude toward investigating and explaining errors that occur in organizations in terms of “why” an error occurred, including systemic reasons, rather than focusing on identifying someone to blame.  How might JC be applied in practice?  A paper* by Shem Malmquist describes how JC concepts could be used in the early phases of an investigation to mitigate cognitive bias on the part of the investigators.

The author asserts that “cognitive bias has a high probability of occurring, and becoming integrated into the investigators’ subconscious during the early stages of an accident investigation.”

He recommends that, from the get-go, investigators categorize all pertinent actions that preceded the error as an error (unintentional act), at-risk behavior (intentional but for a good reason) or reckless (conscious disregard of a substantial risk or intentional rule violation). (p. 5)  For errors or at-risk actions, the investigator should analyze the system, e.g., policies, procedures, training or equipment, for deficiencies; for reckless behavior, the investigator should determine what system components, if any, broke down and allowed the behavior to occur. (p. 12).  Individuals should still be held responsible for deliberate actions that resulted in negative consequences.
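A minimal sketch of this triage step appears below.  The yes/no questions and their wording are our reading of Malmquist’s categories, offered only to illustrate the logic; this is not code from the paper.

```python
# Sketch of the Just Culture triage step: classify each pertinent action, then
# direct the investigation accordingly.  Question wording is ours, not the paper's.
def classify_action(intentional: bool, conscious_disregard_or_violation: bool) -> str:
    if not intentional:
        return "error"      # unintentional act
    if conscious_disregard_or_violation:
        return "reckless"   # conscious disregard of risk or intentional rule violation
    return "at-risk"        # intentional, but for a perceived good reason

def investigation_focus(category: str) -> str:
    if category in ("error", "at-risk"):
        return "Analyze the system (policies, procedures, training, equipment) for deficiencies."
    return "Determine what system components, if any, broke down and allowed the behavior."

cat = classify_action(intentional=True, conscious_disregard_or_violation=False)
print(cat, "->", investigation_focus(cat))
# at-risk -> Analyze the system (policies, procedures, training, equipment) for deficiencies.
```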

Adding this step to a traditional event chain model will enrich the investigation and help keep investigators from going down the rabbit hole of following chains suggested by their own initial biases.

Because JC is added to traditional investigation techniques, Malmquist believes it might be more readily accepted than other approaches for conducting more systemic investigations, e.g., Leveson’s System Theoretic Accident Model and Processes (STAMP).  Such approaches are complex, require lots of data and implementing them can be daunting for even experienced investigators.  In our opinion, these models usually necessitate hiring model experts who may be the only ones who can interpret the ultimate findings—sort of like an ancient priest reading the entrails of a sacrificial animal.  Snide comment aside, we admire Leveson’s work and reviewed it in our Nov. 11, 2013 post.

Our Perspective

This paper is not some great new insight into accident investigation but it does describe an incremental step that could make traditional investigation methods more expansive in outlook and robust in their findings.

The paper also provides a simple introduction to the works of authors who cover JC or decision-making biases.  The former category includes Reason and Dekker and the latter one Kahneman, all of whom we have reviewed here at Safetymatters.  For Reason, see our Nov. 3, 2014 post; for Dekker, see our Aug. 3, 2009 and Dec. 5, 2012 posts; for Kahneman, see our Nov. 4, 2011 and Dec. 18, 2013 posts.

Bottom line: The parts describing and justifying the author’s proposed approach are worth reading.  You are already familiar with much of the contextual material he includes.  


*  S. Malmquist, “Just Culture Accident Model – JCAM” (June 2017).

Friday, October 6, 2017

WANO and NEA to Cooperate on Nuclear Safety Culture

World Nuclear News Oct. 4, 2017
According to an item* in World Nuclear News, the World Association of Nuclear Operators (WANO) and the Organisation for Economic Co-operation and Development’s Nuclear Energy Agency (NEA) signed a memorandum of understanding to cooperate on "the further development of approaches, practices and methods in order to proactively strengthen global nuclear safety."

One objective is to “enhance the common understanding of nuclear safety culture challenges . . .”  In addition, the parties have identified safety culture (SC) as a "fundamental subject of common interest" and plan to launch a series of "country-specific discussions to explore the influence of national culture on the safety culture".

Our Perspective

As usual, the press release touts all the benefits that are going to flow from the new relationship.  We predict the flow will be at best a trickle based on what we’ve seen from the principals over the years.  Following is our take on the two entities.

WANO is an association of the world's nuclear power operators.  Their objective is to exchange safety knowledge and operating experience among its members.  We have mentioned WANO in several Safetymatters posts, including Jan. 23, 2015, Jan. 7, 2015, Jan. 21, 2014 and May 1, 2010.  Their public contributions are generally shallow and insipid.  WANO may be effective at facilitating information sharing but it has no real authority over operators.  It is, however, an overhead cost for the economically uncompetitive commercial nuclear industry. 

NEA is an intergovernmental agency that facilitates cooperation among countries with nuclear technology infrastructures.  In our March 3, 2016 post we characterized NEA as an “empty suit” that produces cheerleading and blather.  We stand by that assessment.  In Safetymatters’ history, we have come across only one example of NEA adding value—when they published a document that encouraged regulators to take a systems view of SC.  See our Feb. 10, 2016 post for details.

No one should expect this new arrangement to lead to any breakthroughs in SC theory or insights into SC practice.  It will lead to meetings, conferences, workshops and boondoggles.  One hopes it doesn’t indirectly raise the industry’s costs or, more importantly, distract WANO from its core mission of sharing safety information and operating experience across the international nuclear industry. 


*  “WANO, NEA enhance cooperation in nuclear safety,” World Nuclear News (Oct. 4, 2017).

Tuesday, September 26, 2017

“New” IAEA Nuclear Safety Culture Self-Assessment Methodology

IAEA report cover
The International Atomic Energy Agency (IAEA) touted its safety culture (SC) self-assessment methodology at the Regulatory Cooperation Forum held during the recent IAEA 61st General Conference.  Their press release* refers to the methodology as “new” but it’s not exactly fresh from the factory.  We assume the IAEA presentation was based on a publication titled “Performing Safety Culture Self-assessments”** which was published in June 2016 and which we reviewed on Aug. 1, 2016.  We encourage you to read our full review; it is too lengthy to reasonably summarize in this post.  Suffice it to say the publication includes some worthwhile SC information and descriptions of relevant SC assessment practices but it also exhibits some execrable shortcomings.


*  IAEA, “New IAEA Self-Assessment Methodology and Enhancing SMR Licensing Discussed at Regulatory Cooperation Forum” (Sept. 22, 2017).

**  IAEA, “Performing Safety Culture Self-assessments,” Safety Reports Series no. 83 (Vienna: IAEA, 2016).

Thursday, August 10, 2017

Nuclear Safety Culture: The Threat of Bureaucratization

We recently read Sidney Dekker’s 2014 paper* on the bureaucratization of safety in organizations.  It’s interesting because it describes a very common evolution of organizational practices, including those that affect safety, as an organization or industry becomes more complicated and formal over time.  Such evolution can affect many types of organizations, including nuclear ones.  Dekker’s paper is summarized below, followed by our perspective on it. 

The process of bureaucratization is straightforward; it involves hierarchy (creating additional layers of organizational structure), specialized roles focusing on “safety related” activities, and the application of rules for defining safety requirements and the programs to meet them.  In the safety space, the process has been driven by multiple factors, including legislation and regulation, contracting and the need for a uniform approach to managing large groups of organizations, and increased technological capabilities for collection and analysis of data.

In a nutshell, bureaucracy means greater control over the context and content of work by people who don’t actually have to perform it.  The risk is that as bureaucracy grows, technical expertise and operational experience may be valued less.

This doesn’t mean bureaucracy is a bad thing.  In many environments, bureaucratization has led to visible benefits, primarily a reduction in harmful incidents.  But it can lead to unintended, negative consequences including:

  • Myopic focus on formal performance measures (often quantitative) and “numbers games” to achieve the metrics and, in some cases, earn financial bonuses,
  • An increasing inability to imagine, much less plan for, truly novel events because of the assumption that everything bad that might happen has already been considered in the PRA or the emergency plan.  (Of course, these analyses/documents are created by siloed specialists who may lack a complete understanding of how the socio-technical system works or what might actually be required in an emergency.  Fukushima anyone?),
  • Constraints on organizational members’ creativity and innovation, and a lack of freedom that can erode problem ownership, and
  • Interest, effort and investment in sustaining, growing and protecting the bureaucracy itself.
Our Perspective

We realize reading about bureaucracy is about as exciting as watching a frog get boiled.  However, Dekker does a good job of explaining how the process of bureaucratization takes root and grows and the benefits that can result.  He also spells out the shortcomings and unintended consequences that can accompany it.

The commercial nuclear world is not immune to this process.  Consider all the actors who have their fingers in the safety pot and realize how few of them are actually responsible for designing, maintaining or operating a plant.  Think about the NRC’s Reactor Oversight Process (ROP) and the licensees’ myopic focus on keeping a green scorecard.  Importantly, the Safety Culture Policy Statement (SCPS), being an “expectation,” resists the bureaucratic imperative to over-specify.  Instead, the SCPS is an adjustable cudgel the NRC uses to tap or bludgeon wayward licensees into compliance.  Foreign interest in regulating nuclear safety culture will almost certainly lead to its increased bureaucratization.

Bureaucratization is clearly evident in the public nuclear sector (looking at you, Department of Energy) where contractors perform the work and government overseers attempt to steer the contractors toward meeting production goals and safety standards.  As Dekker points out, managing, monitoring and controlling operations across an organizational network of contractors and sub-contractors tends to be so difficult that bureaucratized accountability becomes the accepted means to do so.

We have presented Dekker’s work before, primarily his discussion of a “just culture” (reviewed Aug. 3, 2009) that tries to learn from mishaps rather than simply isolating and perhaps punishing the human actor(s) and “drift into failure” (reviewed Dec. 5, 2012) where a socio-technical system can experience unacceptable performance caused by systemic interactions while functioning normally.  Stakeholders can mistakenly believe the system is completely safe because no errors have occurred while in reality the system can be slipping toward an incident.  Both of these attributes should be considered in your mental model of how your organization operates.

Bottom line: This is an academic paper in a somewhat scholarly journal, in other words, not a quick and easy read.  But it’s worth a look to get a sense of how the tentacles of formality can wrap themselves around an organization.  In the worst case, they can stifle the capabilities the organization needs to successfully react to unexpected events and environmental changes.


*  S.W.A. Dekker, “The bureaucratization of safety,” Safety Science 70 (2014), pp. 348–357.  We saw this paper on Safety Differently, a website that publishes essays on safety.  Most of the site’s content appears related to industries with major industrial safety challenges, e.g., mining.