Friday, December 1, 2017

Nuclear Safety Culture: Focus on Decision Making



McKinsey Five Fifty cover
We have long held that decision making (DM) is a key artifact reflecting the state of a nuclear organization’s safety culture.

The McKinsey Quarterly (MQ) has packaged a trio of articles* on DM.  Their first purpose is to identify and counter the different biases that lead to sub-optimal, even disastrous, decisions.  (When specific biases are widespread in an organization, they are part of its culture.)  A second purpose is to describe the attributes of fairer, more robust and more effective DM processes.  The articles’ specific topics are (1) the behavioral science that underlies DM, (2) a method for categorizing and processing decisions and (3) a case study of a major utility that changed its decision culture. 

“The case for behavioral strategy” (MQ, March 2010)

This article covers the insights from psychology that can be used to fashion a robust DM process.  The authors support the case for process improvement with survey research showing that over 50 percent of the variability in decision outcomes (i.e., performance) was determined by the quality of the DM process, while less than 10 percent was attributable to the quality of the underlying analysis. 

There are plenty of cognitive biases that can affect human DM.  The authors discuss several of them and strategies for counteracting them, as summarized below.


  • False pattern recognition (e.g., saliency (overweight recent or memorable events), confirmation, inaccurate analogies): Require alternative explanations for the data in the analysis, articulate participants’ relevant experiences (which can reveal the basis for their biases), identify similar situations for comparative analysis.
  • Bias for action: Explicitly consider uncertainty in the input data and the possible outcomes.
  • Stability (anchoring to an initial value, loss aversion): Establish stretch targets that can’t be achieved by business as usual.
  • Silo thinking: Involve a diverse group in the DM process and define specific decision criteria before discussions begin.
  • Social (conformance to group views): Create genuine debate through a diverse set of decision makers, a climate of trust and depersonalized discussions.


The greatest problem arises from biases that create repeatable patterns that become undesirable cultural traits.  DM process designers must identify the types of biases that arise in their organization’s DM, specify debiasing techniques that will work in that organization, and embed those techniques in formal DM procedures, as sketched below.
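
To make the idea of embedding debiasing techniques in a formal DM procedure concrete, here is a minimal sketch (our illustration, not something from the McKinsey article) of a pre-decision checklist that maps observed biases to the countermeasures listed above.  The bias labels, the checklist contents and the function name are all hypothetical.

```python
# A minimal sketch (illustrative, not from the McKinsey article) of embedding
# debiasing prompts in a formal decision-making procedure as a checklist.
# Bias labels and countermeasures are condensed from the list above.

DEBIASING_CHECKLIST = {
    "false pattern recognition": "Require alternative explanations for the data; "
                                 "identify similar situations for comparison.",
    "bias for action": "Explicitly consider uncertainty in inputs and outcomes.",
    "stability": "Set stretch targets that business as usual cannot meet.",
    "silo thinking": "Involve a diverse group; fix decision criteria up front.",
    "social conformance": "Create genuine debate with diverse decision makers, "
                          "a climate of trust and depersonalized discussion.",
}

def pre_decision_review(biases_observed):
    """Return the countermeasures a decision group should apply before deciding."""
    return [DEBIASING_CHECKLIST[b] for b in biases_observed if b in DEBIASING_CHECKLIST]

# Example: a group prone to anchoring (stability) and groupthink (social conformance).
for step in pre_decision_review(["stability", "social conformance"]):
    print(step)
```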

An attachment to the article identifies and defines 17 specific biases.  Much of the seminal research on DM biases was performed by Daniel Kahneman, who received a Nobel Prize for his efforts.  We have reviewed Prof. Kahneman’s work on Safetymatters; see our Nov. 4, 2011 and Dec. 18, 2013 posts or click on the Kahneman label. 

“Untangling your organization’s decision making” (MQ, June 2017)

While this article is aimed at complex, global organizations, there are lessons here for nuclear organizations (typically large bureaucracies) because all organizations have become victims of over-abundant communications, with too many meetings and low-value e-mail threads distracting members from making good decisions.

The authors posit four types of decisions an organization faces, plotted on a 2x2 matrix (the consultant’s best friend) with scope and impact (broad or narrow) on one axis and level of familiarity (infrequent or frequent) on the other.  A different DM approach is proposed for each quadrant. 
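
The quadrant logic can be expressed compactly.  The following sketch is our own illustration of the article’s 2x2 matrix, not code from McKinsey; the function name and boolean parameters are hypothetical.

```python
# A minimal sketch of the article's 2x2 decision matrix: scope/impact (broad or
# narrow) on one axis, familiarity (frequent or infrequent) on the other.
# The quadrant names follow the article; everything else is illustrative.

def categorize_decision(broad_impact: bool, frequent: bool) -> str:
    """Map a decision onto one of the four quadrants described in the article."""
    if broad_impact and not frequent:
        return "big-bet"        # infrequent, broad impact
    if broad_impact and frequent:
        return "cross-cutting"  # frequent, broad impact
    if not broad_impact and frequent:
        return "delegated"      # frequent, narrow impact / low risk
    return "ad hoc"             # infrequent, narrow impact / low risk

# Example: a recurring decision that touches Operations, Engineering and Maintenance.
print(categorize_decision(broad_impact=True, frequent=True))  # cross-cutting
```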

Big-bet decisions are infrequent and have broad impact.  Recommendations include (1) ensure there’s an executive sponsor, (2) break down the mega-decision into manageable parts for analysis (and reassemble them later), (3) use a standard DM approach for all the parts and (4) establish a mechanism to track effectiveness during decision implementation.

The authors observe that some decisions turn out to be “bet the company” ones without being recognized as such.  There are examples of this in the nuclear industry.  For details, see our June 18, 2013 post on Kewaunee (had only an 8-year PPA), Crystal River (tried to cut through the containment using in-house expertise) and SONGS (installed replacement steam generators with an unproven design). 

Cross-cutting decisions are more frequent and have broad impact.  Some decisions at a nuclear power plant fall into this category.  They need to have the concurrence and support of the Big 3 stakeholders (Operations, Engineering and Maintenance).  Silo attitudes are an omnipresent threat to success in making these kinds of decisions.  The key is to get the stakeholders to agree on the main process steps and define them in a plain-English procedure that defines the calendar, handoffs and decisions.  Governing policy should establish the DM bodies and their authority, and define shared performance metrics to measure success. 

Delegated decisions are frequent and low-risk.  They can be effectively handled by an individual or working team, with limited input from others.  The authors note “The role-modeling of senior leaders is invaluable, but they may be reluctant” to delegate.  We agree.  In our experience, many nuclear managers were hesitant to delegate as many decisions as they could have to subordinates.  Their fear of being held accountable for a screw-up was just too great.  However, their goal should have been to delegate all decisions except those for which they alone had the capabilities and accountability.  Subordinates need appropriate training and explicit authority to make their decisions and they need to be held accountable by higher-level managers.  The organization needs to establish a clear policy defining when and how a decision should be elevated to a more senior decision maker. 

Ad hoc decisions are infrequent and low-risk; they were deliberately omitted from the article. 

“A case study in combating bias” (MQ, May 2017)

This is an interview with a senior executive of a German utility that invested €10 billion in conventional power projects, investments that failed when the political-economic environment evolved in a direction opposite to their assumptions.  In their postmortem, they realized they had succumbed to several cognitive biases, including status quo, confirmation, champion and sunflower.  The sunflower bias (groups aligning with their leaders) stretched far down the organizational hierarchy so lower-level analysts didn’t dare to suggest contrary assumptions or outcomes.

The article describes how the utility changed its DM practices to promote awareness of biases and implement debiasing techniques; e.g., one key element is officially designating “devil’s advocates” in DM groups.  Importantly, training emphasizes that biases are not some personal defect but “just there,” i.e., part of the culture.  The interviewee noted that the revised process is very time-intensive, so it is utilized only for the most important decisions facing each user group. 

Our Perspective 

The McKinsey content describes executive-level, strategic DM, but many of the takeaways are equally applicable to decisions made at the individual, department and inter-department levels, where a consistent approach is perhaps even more important in maintaining or improving organizational performance.

The McKinsey articles come in one of their Five Fifty packages, with a summary you can review in five minutes and the complete articles that may take fifty minutes total.  You should invest at least the smaller amount.


*  “Better Decisions,” McKinsey Quarterly Five Fifty.  Retrieved Nov. 28, 2017.

Tuesday, November 21, 2017

Any Lessons for Nuclear Safety Culture from VW’s Initiative to Improve Its Compliance Culture?

VW Logo (Source: Wikipedia)
The Wall Street Journal (WSJ) recently published an interview* with the head of the new compliance department in Volkswagen’s U.S. subsidiary.  The new executive outlined the department’s goals and immediate actions related to improving VW’s compliance culture.  They will all look familiar to you, including a new organization (headed by a former consultant) reporting directly to the CEO and with independent access to the board; mandatory compliance training; a new code of conduct; and developing a questioning attitude among employees.  One additional attribute deserves a brief expansion.  VW aims to improve employees’ decision making skills.  We’re not exactly sure what that means but if it includes providing more information about corporate policies and legal, social and regulatory expectations (in other words, the context of decisions) then we approve.

Our Perspective 


These interventions could be from a first generation nuclear safety culture (NSC) handbook on efforts to demonstrate management interest and action when a weak culture is recognized.  Such activities are necessary but definitely not sufficient to strengthen culture.  Some specific shortcomings follow.

First, the lack of reflection.  When asked about the causes of VW’s compliance failures, the executive said “I can’t speculate on the failures . . .”  Well, she should have had something to say on the matter, even party line bromides.  We’re left with the impression she doesn’t know, or care, about the specific and systemic causes of VW’s “Dieselgate” problems that are costing the company tens of billions of dollars.  After all, this interview was in the WSJ, available to millions of critical readers, not some trade rag.

Second, the trust issue.  VW wants employees who can be trusted by the organization, presumably to do “the right thing” as they go about their business.  That’s OK but it’s even more important to have senior managers who can be trusted to do the right thing.  This is especially relevant for VW because it’s pretty clear the cheating problems were tolerated, if not explicitly promoted, by senior management; in other words, there was a top-down issue in addition to lower-level employee malfeasance.

Next, the local nature of the announced interventions.  The new compliance department is for VW-USA only.  The Volkswagen Group of America includes one assembly plant, sales and maintenance support functions, test centers and VW’s consumer finance entity.  It’s probably safe to say that VW’s most important decisions regarding corporate practices and product engineering are made in Wolfsburg, Lower Saxony and not Herndon, Virginia.

Finally, the elephant in the room.  There is no mention of VW’s employee reward and recognition system or the senior management compensation program.  We have long argued that employees focus on actions that will secure their jobs (and perhaps lead to promotions) while senior managers focus on what they’re being paid to accomplish.  For the latter group in the nuclear industry, that’s usually production with safety as a should-do but with little, if any, money attached.  We don’t believe VW is significantly different.

Bottom line: If this WSJ interview is representative of the auto industry’s understanding of culture, then once again nuclear industry thought leaders have a more sophisticated and complete grasp of cultural dynamics and nuances.

We have commented before on the VW imbroglio.  See our Dec. 20, 2015 and May 31, 2016 posts or click on the VW label.


*  B. DiPietro, “Working to Change Compliance Culture at Volkswagen,” Wall Street Journal (Nov. 16, 2017).

Monday, October 30, 2017

Nuclear Safety Culture Under Assault: DNFSB Chairman Proposes Eliminating the Board


DNFSB headquarters
The Center for Public Integrity (CPI) recently published a report* that disclosed a private letter** from Sean Sullivan, the Chairman of the Defense Nuclear Facilities Safety Board (DNFSB), to the Director of the Office of Management and Budget, in which the chairman proposed abolishing or downsizing the DNFSB.  The CPI is highly critical of the chairman’s proposals; support for its position includes a list of the safety improvements in the Department of Energy (DOE) complex that have resulted from DNFSB recommendations and the safety challenges that DOE facilities continue to face.

The CPI also cites a 2014 National Nuclear Security Administration (NNSA, the DOE sub-organization that oversees the nuclear weapons facilities) internal report that describes NNSA’s own safety culture weaknesses, e.g., lack of a questioning attitude toward contractor management’s performance claims, with respect to its oversight of the Los Alamos National Laboratory.

The CPI believes the chairman is responding to pressure from the private contractors who actually manage DOE facilities to reduce outside interference in, and oversight of, contractor activities.  That’s certainly plausible.  The contractors get paid regardless of their level of performance, and very little of that pay is tied to safety performance.  DNFSB recommendations and reports can be thorns in the sides of contractor management.

The Sullivan Letter

The primary proposal in the Sullivan letter is to abolish the DNFSB because the DOE has developed its own “robust regulatory structure” and oversight capabilities via the Office of Enterprise Assessments.  That’s a hollow rationale; the CPI report discusses the insufficiency of DOE’s own assessments.  If outright elimination is not politically doable then DNFSB personnel could be transferred to DOE, sustaining the appearance of independent oversight, and then be slowly absorbed into the larger DOE organization.  That is not a path to increased public confidence and looks like being assimilated by the Borg.***  The savings that could be realized from abolishing the DNFSB is estimated at $31 million, a number lost in the decimal dust of DOE’s $30+ billion budget.

Sullivan mentions but opposes transferring the DNFSB’s oversight responsibilities to the Nuclear Regulatory Commission.  Why?  Because the NRC is not only independent, it also has enforcement powers, which would be inappropriate for defense nuclear facilities and might compromise national security.  That’s a red herring but we’ll let it go; we don’t think oversight of defense facilities really meshes with the NRC’s mission.

His secondary proposal is to downsize the DNFSB workforce, especially its management structure, and transfer most of the survivors to specific defense facilities.  While we think the DNFSB needs more resources, not fewer, we agree it would be better if more DNFSB personnel were located in the field, keeping track of and reporting on DOE and contractor activities.

Our Perspective

Safetymatters first became interested in the DNFSB when we saw the growing mess at the Waste Treatment Plant (WTP, aka the Vit Plant) in Hanford, WA.  It was the DNFSB who forced the DOE and its WTP contractors to confront and remediate serious nuclear safety culture (NSC) problems.  We have published multiple reports on the resultant foot-dragging by DOE in its responses to DNFSB Recommendation 2011-1 which addressed safety conscious work environment (SCWE) problems at Hanford and other DOE facilities.  Click on the DOE label to see our offerings on WTP, other DOE facilities and the overall DOE complex.
 
We have reported on the NSC problems at the Waste Isolation Pilot Plant (WIPP) in New Mexico.  The DNFSB has played an important role in attempting to get DOE and the WIPP contractor to strengthen their safety practices.  Click the WIPP label to see our WIPP-related posts. 

We have also covered a report on the DNFSB’s own organizational issues, including board members’ meddling in day-to-day activities, weak leadership and too-frequent organizational changes.  See our Feb. 6, 2015 post for details.

DNFSB’s internal issues notwithstanding, the board plays an indispensable role in strengthening NSC and safety practices throughout the DOE complex.  They should be given greater authority (which won’t happen), stronger leadership and additional resources.

Bottom line: Sullivan’s proposal is just plain nuts.  He’s a Republican appointee so maybe he’s simply offering homage to his ultimate overlord.
  

*  P. Malone and R.J. Smith, “GOP chair of nuclear safety agency secretly urges Trump to abolish it,” The Center for Public Integrity (Oct. 19, 2017).  Retrieved Oct. 26, 2017.

**  S. Sullivan (DNFSB) to J.M. Mulvaney (Management and Budget), no subject specified but described as an “initial high-level draft of [an] Agency Reform Plan” (June 29, 2017).  Available from the CPI in html and pdf format.  Retrieved Oct. 26, 2017.

***  The Borg is an alien group entity in Star Trek that forcibly assimilates other beings.  See Wikipedia for more information.

Monday, October 16, 2017

Nuclear Safety Culture: A Suggestion for Integrating “Just Culture” Concepts

All of you have heard of “Just Culture” (JC).  At heart, it is an approach to investigating and explaining organizational errors that asks “why” an error occurred, including systemic reasons, rather than focusing on identifying someone to blame.  How might JC be applied in practice?  A paper* by Shem Malmquist describes how JC concepts could be used in the early phases of an investigation to mitigate cognitive bias on the part of the investigators.

The author asserts that “cognitive bias has a high probability of occurring, and becoming integrated into the investigators subconscious during the early stages of an accident investigation.” 

He recommends that, from the get-go, investigators categorize all pertinent actions that preceded the error as an error (unintentional act), at-risk behavior (intentional but for a good reason) or reckless (conscious disregard of a substantial risk or intentional rule violation). (p. 5)  For errors or at-risk actions, the investigator should analyze the system, e.g., policies, procedures, training or equipment, for deficiencies; for reckless behavior, the investigator should determine what system components, if any, broke down and allowed the behavior to occur. (p. 12)  Individuals should still be held responsible for deliberate actions that resulted in negative consequences.
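
The categorization step can be pictured as a small decision rule.  The sketch below is our illustration of the concept described above, not code from Malmquist’s paper; the enum values and function name are hypothetical.

```python
# A minimal sketch of the Just Culture categorization step applied at the start
# of an investigation.  The categories follow the paper's description; the rest
# is illustrative.

from enum import Enum

class Behavior(Enum):
    ERROR = "error"        # unintentional act
    AT_RISK = "at-risk"    # intentional, but for a perceived good reason
    RECKLESS = "reckless"  # conscious disregard of a substantial risk or rule violation

def investigation_focus(behavior: Behavior) -> str:
    """Return where the investigator should look next for each category."""
    if behavior in (Behavior.ERROR, Behavior.AT_RISK):
        # Look for systemic deficiencies: policies, procedures, training, equipment.
        return "analyze the system for deficiencies"
    # Reckless behavior: ask which system components, if any, broke down and allowed
    # it, while still holding the individual responsible for the deliberate act.
    return "determine which system components broke down and allowed the behavior"

print(investigation_focus(Behavior.AT_RISK))  # analyze the system for deficiencies
```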

Adding this step to a traditional event chain model will enrich the investigation and help keep investigators from going down the rabbit hole of following chains suggested by their own initial biases.

Because JC is added to traditional investigation techniques, Malmquist believes it might be more readily accepted than other approaches for conducting more systemic investigations, e.g., Leveson’s System Theoretic Accident Model and Processes (STAMP).  Such approaches are complex, require lots of data and implementing them can be daunting for even experienced investigators.  In our opinion, these models usually necessitate hiring model experts who may be the only ones who can interpret the ultimate findings—sort of like an ancient priest reading the entrails of a sacrificial animal.  Snide comment aside, we admire Leveson’s work and reviewed it in our Nov. 11, 2013 post.

Our Perspective

This paper is not some great new insight into accident investigation but it does describe an incremental step that could make traditional investigation methods more expansive in outlook and robust in their findings.

The paper also provides a simple introduction to the works of authors who cover JC or decision-making biases.  The former category includes Reason and Dekker and the latter one Kahneman, all of whom we have reviewed here at Safetymatters.  For Reason, see our Nov. 3, 2014 post; for Dekker, see our Aug. 3, 2009 and Dec. 5, 2012 posts; for Kahneman, see our Nov. 4, 2011 and Dec. 18, 2013 posts.

Bottom line: The parts describing and justifying the author’s proposed approach are worth reading.  You are already familiar with much of the contextual material he includes.  


*  S. Malmquist, “Just Culture Accident Model – JCAM” (June 2017).

Friday, October 6, 2017

WANO and NEA to Cooperate on Nuclear Safety Culture

World Nuclear News Oct. 4, 2017
According to an item* in World Nuclear News, the World Association of Nuclear Operators (WANO) and the Organisation for Economic Co-operation and Development’s Nuclear Energy Agency (NEA) signed a memorandum of understanding to cooperate on "the further development of approaches, practices and methods in order to proactively strengthen global nuclear safety."

One objective is to “enhance the common understanding of nuclear safety culture challenges . . .”  In addition, the parties have identified safety culture (SC) as a "fundamental subject of common interest" and plan to launch a series of "country-specific discussions to explore the influence of national culture on the safety culture".

Our Perspective

As usual, the press release touts all the benefits that are going to flow from the new relationship.  We predict the flow will be at best a trickle based on what we’ve seen from the principals over the years.  Following is our take on the two entities.

WANO is an association of the world's nuclear power operators.  Their objective is to exchange safety knowledge and operating experience among its members.  We have mentioned WANO in several Safetymatters posts, including Jan. 23, 2015, Jan. 7, 2015, Jan. 21, 2014 and May 1, 2010.  Their public contributions are generally shallow and insipid.  WANO may be effective at facilitating information sharing but it has no real authority over operators.  It is, however, an overhead cost for the economically uncompetitive commercial nuclear industry. 

NEA is an intergovernmental agency that facilitates cooperation among countries with nuclear technology infrastructures.  In our March 3, 2016 post we characterized NEA as an “empty suit” that produces cheerleading and blather.  We stand by that assessment.  In Safetymatters’ history, we have come across only one example of NEA adding value—when they published a document that encouraged regulators to take a systems view of SC.  See our Feb. 10, 2016 post for details.

No one should expect this new arrangement to lead to any breakthroughs in SC theory or insights into SC practice.  It will lead to meetings, conferences, workshops and boondoggles.  One hopes it doesn’t indirectly raise the industry’s costs or, more importantly, distract WANO from its core mission of sharing safety information and operating experience across the international nuclear industry. 


*  “WANO, NEA enhance cooperation in nuclear safety,” World Nuclear News (Oct. 4, 2017).

Tuesday, September 26, 2017

“New” IAEA Nuclear Safety Culture Self-Assessment Methodology

IAEA report cover
The International Atomic Energy Agency (IAEA) touted its safety culture (SC) self-assessment methodology at the Regulatory Cooperation Forum held during the recent IAEA 61st General Conference.  Their press release* refers to the methodology as “new” but it’s not exactly fresh from the factory.  We assume the IAEA presentation was based on a publication titled “Performing Safety Culture Self-assessments,”** which was published in June 2016 and which we reviewed on Aug. 1, 2016.  We encourage you to read our full review; it is too lengthy to reasonably summarize in this post.  Suffice it to say the publication includes some worthwhile SC information and descriptions of relevant SC assessment practices, but it also exhibits some execrable shortcomings.


*  IAEA, “New IAEA Self-Assessment Methodology and Enhancing SMR Licensing Discussed at Regulatory Cooperation Forum” (Sept. 22, 2017).

**  IAEA, “Performing Safety Culture Self-assessments,” Safety Reports Series no. 83 (Vienna: IAEA, 2016).

Thursday, August 10, 2017

Nuclear Safety Culture: The Threat of Bureaucratization

We recently read Sidney Dekker’s 2014 paper* on the bureaucratization of safety in organizations.  It’s interesting because it describes a very common evolution of organizational practices, including those that affect safety, as an organization or industry becomes more complicated and formal over time.  Such evolution can affect many types of organizations, including nuclear ones.  Dekker’s paper is summarized below, followed by our perspective on it. 

The process of bureaucratization is straightforward; it involves hierarchy (creating additional layers of organizational structure), specialized roles focusing on “safety related” activities, and the application of rules for defining safety requirements and the programs to meet them.  In the safety space, the process has been driven by multiple factors, including legislation and regulation, contracting and the need for a uniform approach to managing large groups of organizations, and increased technological capabilities for collection and analysis of data.

In a nutshell, bureaucracy means greater control over the context and content of work by people who don’t actually have to perform it.  The risk is that as bureaucracy grows, technical expertise and operational experience may be held in less value.

This doesn’t mean bureaucracy is a bad thing.  In many environments, bureaucratization has led to visible benefits, primarily a reduction in harmful incidents.  But it can lead to unintended, negative consequences including:

  • Myopic focus on formal performance measures (often quantitative) and “numbers games” to achieve the metrics and, in some cases, earn financial bonuses,
  • An increasing inability to imagine, much less plan for, truly novel events because of the assumption that everything bad that might happen has already been considered in the PRA or the emergency plan.  (Of course, these analyses/documents are created by siloed specialists who may lack a complete understanding of how the socio-technical system works or what might actually be required in an emergency.  Fukushima anyone?),
  • Constraints on organizational members’ creativity and innovation, and a lack of freedom that can erode problem ownership, and
  • Interest, effort and investment in sustaining, growing and protecting the bureaucracy itself.
Our Perspective

We realize reading about bureaucracy is about as exciting as watching a frog get boiled.  However, Dekker does a good job of explaining how the process of bureaucratization takes root and grows and the benefits that can result.  He also spells out the shortcomings and unintended consequences that can accompany it.

The commercial nuclear world is not immune to this process.  Consider all the actors who have their fingers in the safety pot and realize how few of them are actually responsible for designing, maintaining or operating a plant.  Think about the NRC’s Reactor Oversight Process (ROP) and the licensees’ myopic focus on keeping a green scorecard.  Importantly, the Safety Culture Policy Statement (SCPS), being an “expectation,” resists the bureaucratic imperative to over-specify.  Instead, the SCPS is an adjustable cudgel the NRC uses to tap or bludgeon wayward licensees into compliance.  Foreign interest in regulating nuclear safety culture will almost certainly lead to its increased bureaucratization.  

Bureaucratization is clearly evident in the public nuclear sector (looking at you, Department of Energy) where contractors perform the work and government overseers attempt to steer the contractors toward meeting production goals and safety standards.  As Dekker points out, managing, monitoring and controlling operations across an organizational network of contractors and sub-contractors tends to be so difficult that bureaucratized accountability becomes the accepted means to do so.

We have presented Dekker’s work before, primarily his discussions of a “just culture” (reviewed Aug. 3, 2009), which tries to learn from mishaps rather than simply isolating and perhaps punishing the human actor(s), and “drift into failure” (reviewed Dec. 5, 2012), where a socio-technical system can experience unacceptable performance caused by systemic interactions while functioning normally.  Stakeholders can mistakenly believe the system is completely safe because no errors have occurred while in reality the system can be slipping toward an incident.  Both of these attributes should be considered in your mental model of how your organization operates.

Bottom line: This is an academic paper in a somewhat scholarly journal, in other words, not a quick and easy read.  But it’s worth a look to get a sense of how the tentacles of formality can wrap themselves around an organization.  In the worst case, they can stifle the capabilities the organization needs to successfully react to unexpected events and environmental changes.


*  S.W.A. Dekker, “The bureaucratization of safety,” Safety Science 70 (2014), pp. 348–357.  We saw this paper on Safety Differently, a website that publishes essays on safety.  Most of the site’s content appears related to industries with major industrial safety challenges, e.g., mining.

Thursday, July 27, 2017

Nuclear Safety Culture: Another Incident at Pilgrim: Tailgate Party

Pilgrim
The Cape Cod Times recently reported* on a security violation at the Pilgrim nuclear plant: one employee entering a secure area facilitated “tailgating” by a second employee who had forgotten his badge.  He didn’t want to go to Security to obtain clearance for entry because that would make him late for work.

The NRC determined the pair were deliberately taking a shortcut but were not attempting to do something malicious.  The NRC investigation also revealed that other personnel, including security, had utilized the same shortcut in the past to allow workers to exit the plant.  The result of the investigation was a Level IV violation for the plant.

Of course, the plant’s enemies are on this like a duck on a June bug, calling the incident alarming and further evidence for immediate shutdown of the plant.  Entergy, the plant’s owner, is characterized as indifferent to such activities. 

The article’s high point was reporting that the employee who buzzed in his fellow worker told investigators “he did not know he was not allowed to do that”.

Our Perspective 


The incident itself was a smallish deal, not a big one.  But it does score a twofer because it reflects on both safety culture and security culture.  Whichever category it goes in, the incident is a symptom of a poorly managed plant and a culture that has long tolerated shortcuts.  It is one more drop in the bucket as Pilgrim shuffles** toward the exit.

This case raises many questions: What kind of training, including refresher training, does staff receive about security procedures?  What kind of oversight, reminders, reinforcement and role modeling do they get from their supervisors and higher-level managers?  Why was the second employee reluctant to take the time to follow the correct procedure?  Would he have been disciplined, or even fired, for being late?  We would hope Pilgrim management doesn’t put everyone who forgets his badge in the stocks, or worse.

Bottom line: Feel bad for the people who have to work in the Pilgrim environment, be glad it’s not you or your workplace.


*  C. Legere, “NRC: Pilgrim workers ‘deliberately’ broke rules,” Cape Cod Times (July 24, 2017).  Retrieved July 26, 2017.

**  In this instance, “shuffle” has both its familiar meaning of “dragging one's feet” and a less-used definition of “avoid a responsibility or obligation.”  Google dictionary retrieved July 27, 2017.

Wednesday, July 12, 2017

Nuclear Safety Culture (and Other) Problems in the U.S. Nuclear Weapons Complex

Los Alamos  Source: LANL
The Center for Public Integrity (CPI) has published a five-part report on safety lapses in the U.S. nuclear weapons complex—an array of facilities overseen by the Department of Energy (DOE).*  Overall, the report paints a picture of a challenged and arguably weak safety culture (SC).  Following is a summary of the report and our perspective on it.

Part I traces the history of radioactive criticality incidents (which have resulted in human fatalities) and near-misses at Los Alamos National Laboratory (LANL).  Analysis and production of plutonium pits, essential for maintaining the U.S. nuclear weapons inventory, have been halted for years because of concerns over safety issues.  In addition, almost all members of the site’s criticality analysis team quit over inadequate management support for the team’s efforts.

Part II discusses in more detail the impacts of the LANL shutdown.  Most significant, from our perspective, is a 2013 report that said “Management has not yet fully embraced its commitment to criticality safety.”  The 2013 report “also listed nine weaknesses in the lab’s safety culture that were rooted in a ‘production focus’ to meet work deadlines. Workers say these deadlines are typically linked to financial bonuses.”

Speaking of bonuses, although the plant was not working, the contractors were judged to have exceeded expectations in getting ready to restart.  Accordingly, the contractors “received 74 percent or $10.7 million of the $14.4 million in profits available to them from the NNSA in the category that includes pit production and surveillance.”

Part III covers incidents at other facilities and cultural shortcomings in the weapons complex.  It is the meatiest section of the report.  Most of the unfortunate events were industrial accidents (electric shocks, explosions, burns) but the nuclear hazard is always nearby because of the nature of the work.  Occasionally the nuclear factor is key, e.g., when LANL improperly packed a drum of waste that it shipped to the Waste Isolation Pilot Plant, where it exploded, or when Nevada National Security Site personnel inhaled radioactive particles.

This section captures the key point of the entire report: the DOE contractors make a lot of money ($2B in profit over the last 10 years), the financial rewards for safety are minimal and the financial penalties for accidents and such are minimal (1-3% of profits) and often waived.

Part IV details a 2014 incident in Nevada where over 30 personnel inhaled potentially cancer-causing uranium particles during laboratory experiments over a two-month period.  The researchers were annoyed by radiation alarms so they switched them off (which also turned off a safety ventilation system).  This was a self-inflicted wound that suggests a weak SC.

Part V focuses on a radiation exposure accident at the Idaho National Laboratory.  The accident occurred even though years before, the head of the safety committee had warned DOE managers about the hazards of handling the specific material involved in the accident.  The lab contractor made 92% of its contractually available profit that year.  The contractor has petitioned DOE to reimburse the contractor’s litigation expenses (including payouts to affected employees) associated with the accident.

NNSA’s Response

The National Nuclear Security Administration (NNSA) is a semi-autonomous agency within DOE that oversees U.S. nuclear weapons work.  In a statement** responding to the CPI report, the NNSA Administrator basically says the CPI report is incomplete and misleading with respect to LANL.  Unsurprisingly, he starts with “Safety is paramount . . . . [CPI] attacks the safety culture at . . .  (LANL) without offering all of the facts and the full context.”  However, he does not directly refute the CPI report; instead, he provides the NNSA’s version of history: LANL paused operations because of concerns with the criticality safety program.  Since then, “LANL has increased criticality safety staffing and demonstrated improvements in its performance of operational tasks.”  NNSA has withheld $82 million in fee payments to LANL.  Finally, LANL maintained its ability to fulfill its mission during the pause in operations.  Alternative facts?  You be the judge. 

Our Perspective 


The DOE says it wants safe production but is not willing to wield the hammer (higher financial incentives for safety and more penalties for unsafe performance) to drive that outcome.  In addition, DOE, constrained by Congress (which is bowing to its defense industry contributors), appears to deliberately understaff its own audit and procurement functions so they are unable to surface too many embarrassing problems. 

The contractors are rational.  They understand that production is the primary goal and they accept that bad things will occasionally happen in a hazardous environment.  They know they will make their profits no matter what happens, including facility shutdowns, because they can get paid for fixing problems they helped to create.

The CPI report is not shocking to us and it shouldn’t be to you.  (Click on the DOE label to see our many posts on DOE SC.)  It merely documents what has been, and continues to be, business as usual at nuclear weapons facilities.  If you can tolerate the overwrought writing, Part III is worth a look.           


*  The Center for Public Integrity, “Nuclear Negligence” (June 28, 2017).  Retrieved July 5, 2017.  According to Wikipedia, CPI “is an American nonprofit investigative journalism organization . . .”

The report describes problems at the Idaho National Laboratory and some NNSA facilities.  Overall, NNSA oversees eight sites that are involved with nuclear weapons: Kansas City National Security Campus (non-nuclear component manufacture), Lawrence Livermore National Laboratory (weapon design), Los Alamos National Laboratory (design and testing), Nevada National Security Site (testing), Pantex Plant (weapon assembly and disassembly), Sandia National Laboratories (non-nuclear component design), Savannah River Site (nuclear materials) and Y-12 National Security Complex (uranium components).

**  “Klotz Responds To Center For Public Integrity's Series On Safety Culture At NNSA Sites,” Los Alamos Daily Post (June 20, 2017).  Retrieved July 10, 2017.

Tuesday, June 20, 2017

Learning About Nuclear Safety Culture from the Web, Maybe

The Internet  Source: Wikipedia
We’ve come across some Internet content (one website, one article) that purports to inform the reader about nuclear safety culture (NSC).  This post reviews the content and provides our perspective on its value.

NSC Website

It appears the title of this site is “Nuclear Safety Culture”* and the primary target is journalists who want an introduction to NSC concepts, history and issues.  It is a product of a group of European entities.  It is a professional-looking site that covers four major topics; we’ll summarize them in some detail to show their wide scope and shallow depth. 

Nuclear Safety Culture covers five sub-topics:

  • History traces the shift in attitudes toward, and protection from, ionizing radiation as the possible consequences became better known, but the story ends in the 1950s.
  • Key actions describes the roles of internal and external stakeholders during routine operations and emergency situations.  The focus is on power production although medicine, industrial uses and weapons are also mentioned.
  • Definition of NSC starts with INSAG (esp. INSAG-4), then adds INPO’s directive to emphasize safety over competing goals, and a familiar list of attributes from the Nuclear Safety Journal.  As usual, there is nothing in the attributes about executive compensation or the importance of a systems view.
  • IAEA safety principles are self-explanatory.
  • Key scientific concepts covers the units of radiation for dose, intake and exposure.  Some values are shown for typical activities but only one legal limit, for US airport X-rays, is included.**  There is no information in this sub-topic on how much radiation a person can tolerate or the regulatory limits for industrial exposure.

From Events to Accidents has two sub-topics:

  • From events to accidents describes the 7-level International Nuclear Event Scale (from a minor anomaly to major accident) but the scale itself is not shown.  This is a major omission.
  • Defence in depth discusses this important concept but provides only one example, the levels of physical protection between a fuel rod in a reactor and the environment outside the containment.

Controversies has two sub-topics:

  • Strengths and Weaknesses discusses some of the nuclear industry’s issues and characteristics: industry transparency as a double-edged sword, where increased information on events may be used to criticize a plant owner; general radiation protection standards for the industry; uncertainties surrounding the health effects of low radiation doses; the usual nuclear waste issues; technology evolution through generations of reactors; stress tests for European reactors; supply chain realities, where a problem anywhere is used against the entire industry; the political climate, focusing on Germany and France; and energy economics that have diminished nuclear’s competitiveness.  Overall, this is a hodgepodge of topics and a B- discussion.
  • The human factor provides a brief discussion of the “blame culture” and the need for a systemic view, followed by summaries of the Korean and French document falsification events.

Stories summarizes three events: the Brazilian theft of a radioactive source, Chernobyl and Fukushima.  They are all reported in an overly dramatic style although the basic facts are probably correct.

The authors describe what they call the “safety culture breach” for each event.  The problem is they comingle overarching cultural issues, e.g., TEPCO’s overconfident management, with far more specific failures, e.g., violations of safety and security rules, and consequences of weak NSC, e.g., plant design inadequacies.  It makes one wonder if the author(s) of this section have a clear notion of what NSC is.

It isn’t apparent how helpful this site will be for newbie journalists; it is certainly not a complete “toolkit.”  Some topics are presented in an over-simplified manner and others are missing key figures.  In terms of examples, the site emphasizes major accidents (the ultimate trailing indicators) and ignores the small events, normalization of deviance, organizational drift and other dynamics that make up the bulk of daily life in an organization.  Overall, the toolkit looks a bit like a rush job or unedited committee work, e.g., the section on the major accidents is satisfactory but others are incomplete.  Importantly (or perhaps thankfully) the authors offer no original observations or insights with respect to NSC.  It’s worrisome that what the site creators call NSC is often just the safety practices that evolved as the hazards of radiation became better known. 

NSC Article

There is an article on NSC in the online version of Power magazine.  We are not publishing a link to the article because it isn’t very good; it looks more like a high schooler’s Internet-sourced term paper than a thoughtful reference or essay on NSC.

However, like the stopped clock that shows the correct time twice per day, there can be a worthwhile nugget in such an article.  After summarizing a research paper that correlated plants’ performance indicators with assessments of their NSC attributes (which we reviewed on Oct. 5, 2014), the author says “There are no established thresholds for determining whether a safety culture is ‘healthy’ or ‘unhealthy.’”  That’s correct.  After NSC assessors consolidate their interviews, focus groups, observations, surveys and document reviews, they always identify some improvement opportunities but the usual overall grade is “pass.”***  There’s no point score, meter or gauge.  Perhaps there should be.

Our Perspective

Don’t waste your time with pap.  Go to primary sources; an excellent starting point is the survey of NSC literature performed by a U.S. National Laboratory (which we reviewed on Feb. 10, 2013.)  Click on our References label to get other possibilities and follow folks who actually know something about NSC, like Safetymatters.


*  Nuclear Safety Culture was developed as part of the NUSHARE project under the aegis of the European Nuclear Education Network.  Retrieved June 19, 2017.

**  The airport X-ray limit happens to be the same as the amount of radiation emitted by an ordinary banana.

***  A violation of the Safety Conscious Work Environment (SCWE) regulations is quite different.  There it’s zero tolerance and if there’s a credible complaint about actual retaliation for raising a safety issue, the licensee is in deep doo-doo until they convince the regulator they have made the necessary adjustments in the work environment.

Friday, May 26, 2017

Nuclear Safety Culture Update at Pilgrim and Watts Bar

Pilgrim

Watts Bar
A couple of recent reports address the nuclear safety culture (NSC) problems at Pilgrim and Watts Bar.  This post summarizes the reports and provides our perspective on their content.  Spoiler alert: there is not much new in this news.

Pilgrim

The NRC issued their report* on phase C of their IP 95003 inspection at Pilgrim.  This is the phase where the NRC conducts its own assessment of the plant’s NSC.  The overall finding in the cover letter is: “The NRC determined that programs and processes at PNPS [Pilgrim] adequately support nuclear safety and that PNPS should remain in Column 4.”  However, the letter goes on to detail a host of deficiencies.  The relative good news is that Pilgrim’s NSC shortcomings weren’t sufficiently serious or interesting to merit mention in the cover letter.

But the NRC had plenty to say about NSC in the main report.  Highlights include the finding that NSC is a “fundamental problem” at Pilgrim.  NSC gradually deteriorated over time and “actions to balance competing priorities, manage problems, and prioritize workload resulted in reduced safety margins.”  Staffing reduction initiatives exacerbated plant performance problems.  Personnel were challenged to exhibit standards and expectations in conservative decision-making, work practices, and procedure use and adherence.  Contributing factors to performance shortcomings include lack of effective benchmarking of industry standards and the plant’s planned 2019 permanent shutdown.  The NRC also noted weaknesses in the Executive Review Board, Employee Concerns Program and the Nuclear Safety Culture Monitoring Panel. (pp. 8-10)

Watts Bar

In April the TVA inspector general (IG) issued a report** castigating TVA management for allowing a chilled work environment (CWE) to continue to exist at Watts Bar.  The IG report’s findings included: TVA's analyses and its response to the NRC’s CWE letter were incomplete and inadequate; TVA's planned corrective actions are unlikely to have long-term effectiveness; precursors of the CWE went unrecognized by management; and management has inappropriately influenced the outcome of analyses and investigations pertaining to Watts Bar NSC/SCWE issues.  Staff stress, fear and trust issues also exist.

In response, TVA management pointed out the corrective actions that were taken or are underway since the first draft of the IG report was issued.  Additionally, TVA management “has expressly acknowledged management's role in creating the condition and its responsibility for correcting it."

Our Perspective

This is merely a continuation of a couple of sad stories we’ve been reporting on for a long time.  Click on the Entergy, Pilgrim, TVA or Watts Bar labels to get our earlier reports. 

The finding that Pilgrim did not adequately benchmark against industry standards is appalling.  Entergy operates a fleet of nuclear plants and they don’t know what industry standards are?  Whatever.  Entergy is closing all the plants they purchased outside their service territory, hopefully to increase their attention on their utility-owned plants (where Arkansas Nuclear One remains a work in progress). 

We applaud the TVA IG for shining a light on the agency’s NSC issues.  In response to the IG report, TVA management put out a typical mea culpa accompanied by claims that their current corrective actions will fix the CWE and other NSC problems.  Well, their prior actions were ineffective and these actions will also probably fall short.  It doesn’t really matter.  TVA is too big to fail, both politically and economically, and their nuclear program will likely continue to plod along forever.


*  D.H. Dorman (NRC) to J. Dent (Entergy), “Pilgrim Nuclear Power Station – Supplemental Inspection Report (Inspection Procedure 95003 Phase ‘C’) 05000293/2016011 and Preliminary Greater-than-Green Finding” (May 10, 2017).  ADAMS ML17129A217.

**  TVA Inspector General, “NTD Consulting Group, LLC's Assessment of TVA's Evaluation of the Chilled Work Environment at Watts Bar Nuclear Plant - 2016-16702” (April 19, 2017).  Also see D. Flessner, “TVA inspector general says safety culture problems remain at Watts Bar,” Chattanooga Times Free Press (April 21, 2017).  Retrieved May 25, 2017.

Wednesday, May 10, 2017

A Nordic Compendium on Nuclear Safety Culture

A new research paper* covers the challenges of establishing and improving nuclear safety culture (NSC) in a dynamic, i.e., project, environment.  The authors are Finnish and Swedish and it appears the problems of the Olkiluoto 3 plant inform their research interests.  Their summary and review of current NSC literature is of interest to us. 

They begin with an overall description of how organizational (and cultural) changes can occur in terms of direction, rate and scale.

Direction

Top-down (or planned) change relies on the familiar unfreeze-change-refreeze models of Kurt Lewin and Ed Schein.  Bottom-up (or emergent) change emphasizes self-organization and organizational learning.  Truly free form, unguided change leads to NSC being an emergent property of the organization.  As we know, the top-down approach is seldom, if ever, 100% effective because of frictional losses, unintended consequences or the impact of competing, emergent cultural currents.  In a nod to a systems perspective, the authors note organizational structures and behavior influence (and are influenced by) culture.

Rate

“Organizational change can also be distinguished by the rate of its occurrence, i.e., whether the change occurs abruptly or smoothly [italics added].” (p. 8)  We observe that most nuclear plants try to build on past success, hence they promote “continuous improvement” programs that don’t rattle the organization.  In contrast, a plant with major NSC problems sometimes receives shock treatment, often in the form of a new senior manager who is expected to clean things up.  New management systems and organizational structures can also cause abrupt change.

Scale

The authors identify four levels of change.  Most operating plants exhibit the least disruptive changes, called fine tuning and incremental adjustment.  Modular transformation attempts to change culture at the department level; corporate transformation is self-explanatory. 

The authors sound a cautionary note: “the more radical types of changes might not be easily initiated – or might not even be feasible, considering that safety culture is by nature a slowly and progressively changing phenomenon. The obvious condition where a safety-critical organization requires radical changes to its safety culture is when it is unacceptably unhealthy.” (p. 9)

Culture Change Strategies

The authors list seven specific strategies for improving NSC:

  • Change organizational structures,
  • Modify the behavior of a target group through, e.g., incentives and positive reinforcement,
  • Improve interaction and communication to build a shared culture,
  • Ensure all organizational members are committed to safety and jointly participate in its improvement,
  • Training,
  • Promote the concept and importance of NSC,
  • Recruit and select employees who will support a strong NSC.
This section includes a literature review for examples of the specific strategies.

Project Organizations

The nature of project organizations is discussed in detail including their time pressures, wide use of teams, complex tasks and a context of a temporary organization in a relatively permanent environment.  The authors observe that “in temporary organisations, the threat of prioritizing “production” over safety may occur more naturally than in permanent organizations.” (pp. 16-17)  Projects are not limited to building new plants; as we have seen, large projects (Crystal River containment penetration, SONGS steam generator replacement) can kill operating plants.

The balance of the paper covers the authors’ empirical work.

Our Perspective 


This is a useful paper because it provides a good summary of the host of approaches and methods that have been (and are being) applied in the NSC space.  That said, the authors offer no new insights into NSC practice.

Although the paper’s focus is on projects, basically new plant construction, people responsible for fixing NSC at problem plants, e.g., Watts Bar, should peruse this report for lessons they can apply that might help achieve the step-function NSC improvements such plants need.


*  K.Viitanen, N. Gotcheva and C. Rollenhagen, “Safety Culture Assurance and Improvement Methods in Complex Projects – Intermediate Report from the NKS-R SC AIM” (Feb. 2017).  Thanks to Aili Hunt of the LinkedIn Nuclear Safety Culture group for publicizing this paper.