Thursday, January 5, 2012

2011 End of Year Summary

We thought we would take this opportunity to do a little rummaging around in Google Analytics and report on some statistics for the safetymatters blog.

The first thing that caught our attention was the big increase in page views (see chart below) for the blog this past year.  We are now averaging more than 1,000 per month, and we appreciate every one of the readers who visits the blog.  We hope that the increased readership reflects content that is interesting, thought-provoking and perhaps even a bit provocative.  We are pretty sure people who are interested in nuclear safety culture cannot find comparable content elsewhere.

The following table lists the top ten blog posts.  The overwhelming favorite has been the "Normalization of Deviation" post from March 10, 2010.  We have consistently commented positively on this concept, introduced by Diane Vaughan in her book The Challenger Launch Decision.  Most recently, Red Conner noted in his December 8, 2011 post the potential role of normalization of deviation in contributing to complacency.  This may appear to be a bit of a departure from the general concept of complacency as a primarily passive occurrence.  Red notes that the gradual and sometimes hardly perceptible acceptance of lesser standards or non-conforming results may be more insidious than a failure to challenge the status quo.  We would appreciate hearing from readers on their views of “normalization”: whether they believe it is occurring in their organizations (and if so, how is it detected?) and what steps might be taken to minimize its effect.



A common denominator among a number of the popular posts is safety culture assessment, whether in the form of surveys, performance indicators, or other means to gauge the current state of an organization.  Our sense is that there is a widespread appetite for approaches that measure safety culture in some meaningful way; such interest perhaps also indicates that current methods, heavily dependent on surveys, are not meeting needs.  What is even clearer from our research is the lack of initiative by the industry and regulators to promote or fund research in this critical area.

A final observation:  The Google stats on frequency of page views indicate two of the top three pages were the “Score Decision” pages for the two decision examples we put forward.  Each had 100 or more views.  Unfortunately, only a small percentage of the page views translated into scoring inputs for the decisions.  We’re not sure why inputs were so scarce, since scoring is anonymous and purely a matter of the reader’s judgment.  A larger data set for evaluating the decision scoring process would be very useful, so we encourage anyone who visited but did not score to reconsider.  And of course, anyone who hasn’t yet visited these examples, please do and see how you rate these actual decisions from operating nuclear plants.

Wednesday, December 21, 2011

From SCWE to Safety Culture—Time for the Soapbox

Is a satisfactory Safety Conscious Work Environment (SCWE) the same as an effective safety culture (SC)?  Absolutely not.  However, some of the reports and commentary we’ve seen on troubled facilities appear to mash the terms together.  I can’t prove it, but I suspect facilities that rely heavily on lawyers to rationalize their operations are encouraged to try to pass off SCWE as SC.  In any case, following is a review of the basic components of SC:

Safety Conscious Work Environment

An acceptable SCWE* is one where employees are encouraged and feel free to raise safety-related issues without fear of retaliation by their employer.  Note that it does not necessarily address individual employees’ knowledge of or interest in such issues.

Problem Identification and Resolution (PI&R)

PI&R is usually manifested in a facility’s corrective action program (CAP).  An acceptable CAP has a robust, transparent process for evaluating, prioritizing and resolving specific issues.  The prioritization step includes an appropriate weight for an issue’s safety-related elements.  CAP backlogs are managed to levels that employees and regulators associate with timely resolution of issues.

However, the CAP often only deals with identified issues.  Effective organizations must also anticipate problems and develop plans for addressing them.  Again, safety must have an appropriate priority.

Organizational Decision Making

The best way to evaluate an organization’s culture, including safety culture, is through an in-depth analysis of a representative sample of key decisions.  How did the decision-making process handle competing goals, set priorities, treat devil’s advocates who raised concerns about possible unfavorable outcomes, and assign resources?  Were the most qualified people involved in the decisions, regardless of their position or rank?  Note that this evaluation should not be limited to situations where the decisions led to unfavorable consequences; after all, most decisions lead to acceptable outcomes.  The question here is “How were safety concerns handled in the decision making process, independent of the outcome?”

Management Behavior

What is management’s role in all this?  Facility and corporate managers must “walk the talk” as role models demonstrating the importance of safety in all aspects of organizational life.  They must provide personal leadership that reinforces safety.  They must establish a recognition and reward system that reinforces safety.  Most importantly, they must establish and maintain the explicit and implicit weighting factors that go into all decisions.  All of these actions reinforce the desired underlying assumptions with respect to safety throughout the organization. 

Conclusion

Establishing a sound safety culture is not rocket science but it does require focus and understanding (a “mental model”) of how things work.  SCWE, PI&R, Decision Making and Management Behavior are all necessary components of safety culture.  Not to put too fine a point on it, but safety culture is a lot more than quoting a survey result that says “workers feel free to ask safety-related questions.”


*  SCWE questions have also been raised on the LinkedIn Nuclear Safety and Nuclear Safety Culture discussion forums.  Some of the commentary is simple bloviating but there are enough nuggets of fact or insight to make these forums worth following.

Thursday, December 8, 2011

Nuclear Industry Complacency: Root Causes

NRC Chairman Jaczko, addressing the recent INPO CEO conference, warned about possible increasing complacency in the nuclear industry.*  To support his point, he noted the two plants in column four of the ROP Action Matrix and two plants in column three, the increased number of special inspections in the past year, and the three units in extended shutdowns.  The Chairman then moved on to discuss other industry issues. 

The speech spurred us to ask: Why does the risk of complacency increase over time?  Given our interest in analyzing organizational processes, it should come as no surprise that we believe complacency is more complicated than the lack of safety-related incidents leading to reduced attention to safety.

An increase in complacency means that an organization’s safety culture has somehow changed.  Causes of such change include shifts in the organization’s underlying assumptions and decay of the culture itself.

Underlying Assumptions

We know from the Schein model that underlying assumptions are the bedrock for culture.  One can take those underlying assumptions and construct an (incomplete) mental model of the organization—what it values, how it operates and how it makes decisions.  Over time, as the organization builds an apparently successful safety record, the mental weights that people assign to decision factors can undergo a subtle but persistent shift to favor the visible production and cost goals over the inherently invisible safety factor.  At the same time, opportunities exist for corrosive issues, e.g., normalization of deviance, to attach themselves to the underlying assumptions.  Normalization of deviance can manifest anywhere, from slipping maintenance standards to a greater tolerance for increasing work backlogs.
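
To make the shift concrete, below is a minimal sketch of this dynamic.  It is our construction, not an industry model; the decision factors, weights and drift rate are invented for illustration.  A small, persistent transfer of weight from safety to production eventually flips a recurring decision.

```python
# Illustrative only: the decision factors, weights and drift rate are assumptions.

def decision_score(weights, factors):
    """Weighted sum over decision factors; a positive score favors proceeding."""
    return sum(weights[k] * factors[k] for k in factors)

# A recurring decision: defer a safety-related repair to protect the outage schedule.
factors = {"production": 0.8, "cost": 0.6, "safety": -0.9}  # safety argues against deferral

weights = {"production": 0.3, "cost": 0.2, "safety": 0.5}   # initial, safety-dominant culture
DRIFT = 0.01  # each "successful" year quietly shifts weight from safety to production

for year in range(1, 16):
    score = decision_score(weights, factors)
    print(f"year {year:2d}: score = {score:+.3f} -> {'defer repair' if score > 0 else 'do repair'}")
    # No incident occurs, so the shift is reinforced: safety stays invisible, production does not.
    weights["safety"] -= DRIFT
    weights["production"] += DRIFT
```

In this toy run the organization does the repair every year at first, then around year seven begins deferring it, with no one ever having consciously decided to lower standards.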

Decay

An organization’s safety culture will inevitably decay over time absent effective maintenance.  In part this is caused by the shift in underlying assumptions.  In addition, decay results from saturation effects.  Saturation occurs because beating people over the head with either the same thing, e.g., espoused values, or too many different things, e.g., one safety program or similar intervention after another, has lower and lower marginal effectiveness over time.  That’s one reason new leaders are brought in to “problem” plants: to boost the safety culture by using a new messenger with a different version of the message, reset the decision making factor weights and clear the backlogs.
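
The saturation effect can also be made concrete with a toy model; the parameters below are made up, but the shape of the result is the point.  Background erosion is masked while interventions are fresh, then the culture level slides once repeated versions of the same message lose their punch.

```python
# Toy decay-and-saturation model (our construction; all parameters are assumptions).

culture = 1.0     # normalized safety culture "level"
DECAY = 0.95      # 5% background erosion per period absent maintenance
BOOST = 0.15      # effect of the first intervention
SATURATION = 0.6  # each repeat of the same message delivers 60% of the prior boost

for period in range(1, 11):
    culture *= DECAY                        # background decay
    boost = BOOST * SATURATION ** (period - 1)
    culture = min(1.0, culture + boost)     # same intervention, diminishing returns
    print(f"period {period:2d}: culture = {culture:.3f} (boost {boost:.3f})")
```

Early on the boosts fully offset the decay; by mid-run they no longer can, and the level declines every period thereafter, which is exactly when a new messenger with a different version of the message earns his or her keep.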

None of this is new to regular readers of this blog.  But we wanted to gather our ideas about complacency in one post.  Complacency is not some free-floating “thing”; it is an organizational trait that emerges from multiple dynamics operating below the level of clear visibility or measurement.

*  G.B. Jaczko, Prepared Remarks at the Institute of Nuclear Power Operations CEO Conference, Atlanta, GA (Nov. 10, 2011), p. 2, ADAMS Accession Number ML11318A134.

Monday, December 5, 2011

Regulatory Assessment of Safety Culture—Not Made in U.S.A.

Last February, the International Atomic Energy Agency (IAEA) hosted a four-day meeting of regulators and licensees on safety culture.*  “The general objective of the meeting [was] to establish a common opinion on how regulatory oversight of safety culture can be developed to foster safety culture.”  In fewer words, how can the regulator oversee and assess safety culture?

While no groundbreaking new methods for evaluating a nuclear organization’s safety culture were presented, the mere fact that oversight methods are perceived to need development is encouraging.  In addition, it appears that regulators outside the U.S. are more likely to be expected to engage in safety culture oversight, if not formal regulation.

Representatives from several countries made presentations.  The NRC presentation discussed the then-current status of the effort that led to the NRC safety culture policy statement announced in June.  The presentations covering Belgium, Bulgaria, Indonesia, Romania, Switzerland and Ukraine described different efforts to include safety culture assessment into licensee evaluations.

Perhaps the most interesting material was a report on an attendee survey** administered at the start of the meeting.  The survey covered “national regulatory approaches used in the oversight of safety culture.” (p. 3)  Eighteen member states completed the survey.  Following are a few key findings:

The states were split about 50-50 between having and not having regulatory requirements related to safety culture. (p. 7)  The IAEA is encouraging regulators to get more involved in evaluating safety culture and some countries are responding to that push.

To minimize subjectivity in safety culture oversight, regulators try to use oversight practices that are transparent, understandable, objective, predictable, and both risk-informed and performance-based. (p. 13)  This is not news but it is a good thing; it means regulators are trying to use the same standards for evaluating safety culture as they use for other licensee activities.

Licensee decision-making processes are assessed through observations of work groups, probabilistic risk analysis, and technical inspections. (p. 15)  This seems incomplete or even weak to us.  In-depth analysis of critical decisions is necessary to reveal the underlying assumptions (the hidden, true culture) that shape decision-making.

Challenges include the difficulty in giving an appropriate priority to safety in certain real-time decision making situations and the work pressure in achieving production targets/ keeping to the schedule of outages. (p. 16)  We have been pounding the drum about goal conflict for a long time and this survey finding simply confirms that the issue still exists.

Bottom Line

The meeting was generally consistent with our views.  Regulators and licensees need to focus on cultural artifacts, especially decisions and decision making, in the short run while trying to influence the underlying assumptions in the long run to reduce or eliminate the potential for unexpected negative outcomes.



**  A. Kerhoas, "Synthesis of Questionnaire Survey."

Wednesday, November 23, 2011

Lawyering Up

When concerns are raised about the safety culture of an organization with very significant safety responsibilities, what’s one to do?  How about: bring in the lawyers.  That appears to be the news out of the Vit Plant* in Hanford, WA.  With considerable fanfare, Bechtel unveiled a new website devoted to its management of the vit plant.  The site provides an array of policies, articles, reports, and messages regarding safety and quality.

One of the major pieces of information on the site is a recent assessment of the state of safety culture at the vit plant.**  The conclusion of the assessment is quite positive: “Overall, we view the results from this assessment as quite strong, and similar to prior assessments conduct [sic] by the Project.” (p. 16)  The prior assessments were the 2008 and 2009 Vit Plant Opinion Surveys.

However, our readers may also recall that earlier this year the Defense Nuclear Facilities Safety Board (DNFSB) issued a report concluding that the safety culture at the WTP is “flawed.”  In a previous post we quoted from the DNFSB report as follows:

“The HSS [DOE's Office of Health, Safety and Security] review of the safety culture on the WTP project 'indicates that BNI [Bechtel National Inc.] has established and implemented generally effective, formal processes for identifying, documenting, and resolving nuclear safety, quality, and technical concerns and issues raised by employees and for managing complex technical issues.'  However, the Board finds that these processes are infrequently used, not universally trusted by the WTP project staff, vulnerable to pressures caused by budget or schedule [emphasis added], and are therefore not effective.”

Thus the DNFSB clearly has a much different view of the state of safety culture at the vit plant than does DOE or Bechtel.  We note that the DNFSB report does not appear to be one of the numerous references available at the new website.  Links to the original DOE report and the recent assessment are provided.  There is also a November 17, 2011 message to all employees from Frank Russo, Project Director*** which introduces and summarizes the 2011 Opinion Survey on the project’s nuclear safety and quality culture (NSQC).  Neither the recent assessment nor the opinion survey addresses the issues raised by the DNFSB; it is as if the DNFSB review never happened.

What really caught our attention in the recent assessment is who wrote the report - a law firm.  The assessment was based on in-depth interviews of 121 randomly selected employees using a 19-question protocol (the report states that the protocol is attached; however, it is not part of the web link).  But the law firm did not actually conduct the interviews - “investigators” from the BSII internal audit department did so and took notes that were then provided to the lawyers.  This may give new meaning to the concept of “defense in depth.”

The same law firm also analyzed the results from the 2011 Opinion Survey.  In his message to employees, Russo asserts that the law firm has “substantial experience in interpreting [emphasis added] NSQC assessments.”  He goes on to say that the questions for the survey were developed by the WTP Independent Safety and Quality Culture Assessment (ISQCA) Team.  In our view, this executive-level team has, without question, “substantial experience” in safety culture.  Supposedly the ISQCA team was tasked with assessing the site’s culture - why then did it only develop the questions while a law firm interpreted the answers?  Strikes us as very odd.

We don’t know the true state of safety culture at the vit plant and unfortunately, the work sponsored by vit plant management does little to provide such insight or to fully vet and respond to the serious deficiencies cited in the DNFSB assessment.  If we were employees at the plant we would be anxious to hear directly from the ISQCA team. 

Reading the law firm report provides little comfort.  We have commented many times on the inherent limitations of surveys and interviews that solicit attitudes and perceptions.  When the raw materials are interview notes from a small fraction of the employees, assessed by lawyers who were not present at the interviews, we become even more skeptical.  Several quotes from the report related to the Employee Concerns Program (ECP) illustrate our concern.

“The overwhelming majority of interviewees have never used ECP. Only 6.5% of the interviewees surveyed had ever used the program.  [Note: this means a total of nine interviewees.] There is a major difference between the views of interviewees with no personal experience with ECP and those who have used the program: the majority of the interviewees who have not used the program have a positive impression of the program, while more than half of the interviewees who have used the program have a negative impression of it.” (p. 5, emphasis added)

Our favorite quote out of the report is the following.  “Two interviewees who commented on the [ECP] program appear to have confused it with Human Resources.” (p. 6)  One only wonders if the comments were favorable.

Eventually the report gets around to a conclusion that we probably could not say any better.  “We recognize that an interview population of nine employees who have used the ECP in the past is insufficient to draw any meaningful conclusions about the program.” (p. 17)
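
For readers who like numbers, a quick back-of-the-envelope calculation (ours, not the report’s) shows just how little nine respondents can tell you.  Assuming five of the nine ECP users were negative (“more than half”), a standard Wilson 95% confidence interval for the true proportion spans roughly 27% to 81%.

```python
import math

# Back-of-the-envelope sanity check (our calculation, not from the report):
# Wilson 95% score interval for the proportion of ECP users with a negative impression.

def wilson_interval(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_interval(5, 9)  # assume 5 of the 9 ECP users were negative
print(f"point estimate: {5/9:.0%}, 95% interval: {lo:.0%} to {hi:.0%}")
# -> roughly 27% to 81%: consistent with anything from a minor irritant
#    to a broadly distrusted program, i.e., no meaningful conclusion.
```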

We’re left with the following question: Why go about an assessment of safety culture in such an obtuse manner, one that is superficial in its “interpretation” of very limited data, laden with anecdotal material, and ultimately overreaching in its conclusions?


*  The "Vit Plant" is the common name for the Hanford Waste Treatment Plant (WTP).

**  Pillsbury Winthrop Shaw Pittman, LLP, "Assessment of a Safety Conscious Work Environment at the Hanford Waste Treatment Plant" (undated).  The report contains no information on when the interviews or analysis were performed.  Because a footnote refers to the 2009 Opinion Survey and a report addendum refers to an October, 2010 DOE report, we assume the assessment was performed in early-to-mid 2010.

*** WTP Comm, "Message from Frank: 2011 NSQC Employee Survey Results" (Nov. 17, 2011).  

Friday, November 11, 2011

The Mother of Bad Decisions?

This is not about safety culture, but it’s nuclear-related and, given our recent emphasis on decision-making, we can’t pass over it without commenting.

The steam generators (SGs) were recently replaced at Crystal River 3.  This was a large and complex undertaking but SGs have been successfully replaced at many other plants.  The Crystal River project was more complicated because it required cutting an opening in the containment but this, too, has been successfully accomplished at other plants.

The other SG replacements were all managed by two prime contractors, Bechtel and the Steam Generator Team (SGT).  However, to save a few bucks, $15 million actually, Crystal River decided to manage the project themselves.  (For perspective, the target cost for the prime contractor, exclusive of incentive fee, was $73 million.)  (Franke, Exh. JF-32, p. 8)*
 
Cutting the opening resulted in delamination of the containment: basically, the outer 10 inches of concrete separated from the 42-inch thick structure in an area near the opening.  Repairing the plant and replacement power costs are estimated at more than $2.5 billion.**  It’s not clear when the plant will be running again, if ever.
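
A rough expected-value check puts the decision in perspective.  The arithmetic below is ours, based on the figures cited above; nobody could have known the true failure probability in advance, which is rather the point.

```python
# Our arithmetic on the publicly cited figures; the probability is unknowable in hindsight.

savings = 15e6    # avoided prime-contractor cost of self-managing the project
downside = 2.5e9  # estimated repair plus replacement power costs

breakeven_p = savings / downside
print(f"breakeven failure probability: {breakeven_p:.2%}")
# -> 0.60%: self-managing only "paid" if the chance of a project-wrecking
#    failure was below about 6 in 1,000.
```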

Progress Energy Florida (PEF), the plant owner, says insurance will cover most of the costs.  We’ll see.  But PEF also wants Florida ratepayers to pay.  PEF claims they “managed and executed the SGR [steam generator replacement] project in a reasonable and prudent manner. . . .”  (Franke, p. 3)

The delamination resulted from “unprecedented and unpredictable circumstances beyond PEF's control and in spite of PEF's prudent management. . . .” (Franke, p. 2)

PEF’s “root cause investigation determined that there were seven factors that contributed to the delamination. . . . These factors combined to cause the delamination during the containment opening activities in a complex interaction that was unprecedented and unpredictable.” [emphasis added]  (Franke, p. 27)***

This is an open docket, i.e., the Florida PSC has not yet determined how much, if anything, the ratepayers will have to pay.  Will the PSC believe that a Black Swan settled at the Crystal River plant?  Or is the word “hubris” more likely to come to mind?


* “Testimony & Exhibits of Jon Franke,” Fla. Public Service Commission Docket No. 100437-EI (Oct. 10, 2011).

**  I. Penn, “Cleaning up a DIY repair on Crystal River nuclear plant could cost $2.5 billion,” St. Petersburg Times via tampabay.com website (Oct. 9, 2011).  This article provides a good summary of the SG replacement project.

***  For the detail-oriented, “. . . the technical root cause of the CR3 wall delamination was the combination of: 1) tendon stresses; 2) radial stresses; 3) industry design engineering analysis inadequacies for stress concentration factors; 4) concrete strength properties; 5) concrete aggregate properties; and 6) the de-tensioning sequence and scope. . . . another factor, the process of removing the concrete itself, likely contributed to the extent of the delamination. . . .” From “Testimony & Exhibits of Garry Miller,” Fla. Public Service Commission Docket No. 100437-EI (Oct. 10, 2011), p. 5.

Wednesday, November 9, 2011

Ultimate Bonuses

Just when you think there is a lack of humor in the exposition of dry but critical issues, such as risk management, our old friend Nassim Nicholas Taleb comes to the rescue.*  His op-ed piece in the New York Times** earlier this week has a subdued title, “End Bonuses for Bankers,” but includes some real eye-openers.  For example, Taleb cites (with hardly concealed admiration) the ancient Code of Hammurabi, which protected homeowners by calling for the death of the builder if a house collapsed and killed its owner.  Wait, I thought we were talking about bonuses, not capital punishment.

What Taleb is concerned about is that bonus systems in entities that pose systemic risks almost universally encourage behaviors that may not be consistent with the public good much less the long term health of the business entity.  In short he believes that bonuses provide an incentive to take risks.***  He states, “The asymmetric nature of the bonus (an incentive for success without a corresponding disincentive for failure) causes hidden risks to accumulate in the financial system and become a catalyst for disaster.”  Now just substitute “nuclear operations” for “the financial system”. 
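
Taleb’s asymmetry is easy to illustrate with a toy calculation (the numbers below are invented for illustration): the manager’s expected bonus is higher for the risky strategy even when the firm’s expected value is negative.

```python
# Toy illustration of the bonus asymmetry (all numbers invented).
# The manager captures the upside through a bonus but bears none of the downside.

BONUS_RATE = 0.10  # manager's share of reported gains; no clawback in bad years

safe  = {"gain": 50e6,  "loss": 0.0, "p_loss": 0.0}
risky = {"gain": 200e6, "loss": 1e9, "p_loss": 0.20}

def manager_ev(s):
    return (1 - s["p_loss"]) * BONUS_RATE * s["gain"]

def firm_ev(s):
    return (1 - s["p_loss"]) * s["gain"] - s["p_loss"] * s["loss"]

for name, s in [("safe", safe), ("risky", risky)]:
    print(f"{name:5s}: manager EV = ${manager_ev(s)/1e6:5.1f}M, "
          f"firm EV = ${firm_ev(s)/1e6:6.1f}M")
# safe : manager EV = $ 5.0M, firm EV = $  50.0M
# risky: manager EV = $16.0M, firm EV = $ -40.0M
```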

Central to Taleb’s thesis is his belief that management has a large informational advantage over outside regulators and will always know more about risks being taken within their operation.  It affords management the opportunity to both take on additional risk (say to meet an incentive plan goal) and to camouflage the latent risk from regulators.

In our prior posts [here, here and here] on management incentives within the nuclear industry, we also pointed to the asymmetry of bonus metrics - the focus on operating availability and costs, the lack of metrics for safety performance, and the lack of downside incentive for failure to meet safety goals.  The concern was amplified due to the increasing magnitude of nuclear executive bonuses, both in real terms and as a percentage of total compensation. 

So what to do?  Taleb’s answer for financial institutions too big to fail is “bonuses and bailouts should never mix”; in other words, “end bonuses for bankers”.  Our answer is, “bonuses and nuclear safety culture should never mix”; “end bonuses for nuclear executives”.  Instead, gross up the compensation of nuclear executives to include the nominal level of expected bonuses.  Then let them manage nuclear operations using their best judgment to assure safety, unencumbered by conflicting incentives.


*  Taleb is best known for The Black Swan, a book focusing on the need to develop strategies, esp. financial strategies, that are robust in the face of rare and hard-to-predict events.

**  N. Taleb, “End Bonuses for Bankers,” New York Times website (Nov. 7, 2011).

*** It is widely held that the 2008 financial crisis was exacerbated, if not caused, by executives making more risky decisions than shareholders would have thought appropriate. Alan Greenspan commented: “I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders” (Testimony to Congress, quoted in A. Clark and J. Treanor, “Greenspan - I was wrong about the economy. Sort of,” The Guardian, Oct. 23, 2008). The cause is widely thought to be the use of bonuses for performance combined with limited liability.  See also J.M. Malcomson, “Do Managers with Limited Liability Take More Risky Decisions? An Information Acquisition Model”, Journal of Economics & Management Strategy, Vol. 20, Issue 1 (Spring 2011), pp. 83–120.

Friday, November 4, 2011

A Factory for Producing Decisions

The subject of this post is the compelling insights of Daniel Kahneman into behavioral economics and how we think and make decisions.  Kahneman is one of the most influential thinkers of our time and a Nobel laureate.  Two links are provided for our readers who would like additional information.  One is a McKinsey Quarterly video interview* done several years ago; it runs about 17 minutes.  The second is a current review in The Atlantic** of Kahneman’s just-released book, Thinking, Fast and Slow.

Kahneman begins the McKinsey interview by suggesting that we think of organizations as “factories for producing decisions” and therefore, think of decisions as a product.  This seems to make a lot of sense when applied to nuclear operating organizations - they are the veritable “River Rouge” of decision factories.  What may be unusual for nuclear organizations is the large percentage of decisions that directly or indirectly include safety dimensions, dimensions that can be uncertain and/or significantly judgmental, and which often conflict with other business goals.  So nuclear organizations have to deliver two products: competitively priced megawatts and decisions that preserve adequate safety.

To Kahneman, treating decisions as a product logically raises the issue of quality control as a means to ensure decision quality.  At one level, quality control might focus on mistakes and ensuring that decisions avoid their recurrence.  But Kahneman sees the quality function going further, into the psychology of the decision process, to ensure, e.g., that the best information is available to decision makers, that the talents of the group surrounding the ultimate decision maker are used effectively, and that the decision-making environment is unbiased.

He notes that there is an enormous amount of resistance within organizations to improving decision processes.  People naturally feel threatened if their decisions are questioned or second-guessed.  So it may be very difficult, or even impossible, to improve the quality of decisions if the leadership feels too threatened.  But are there ways to avoid this?  Kahneman suggests the “premortem” (think of it as the analog of a post mortem).  When a decision is being formulated (not yet made), convene a group meeting with the following premise: It is a year from now, we have implemented the decision under consideration, and it has been a complete disaster.  Have each individual write down “what happened?”

The objective of the premortem is to legitimize dissent and minimize the innate “bias toward optimism” in decision analysis.  It is based on the observation that as organizations converge toward a decision, dissent becomes progressively more difficult and costly and people who warn or dissent can be viewed as disloyal.  The premortem essentially sets up a competitive situation to see who can come up with the flaw in the plan.  In essence everyone takes on the role of dissenter.  Kahneman’s belief is that the process will yield some new insights - that may not change the decision but will lead to adjustments to make the decision more robust. 

Kahneman’s ideas about decisions resonate with our thinking that the most useful focus for nuclear safety culture is the quality of organizational decisions.  They also contrast with the response at a nuclear plant that recently ran afoul of the NRC (Browns Ferry) and is now tagged with a degraded cornerstone and increased inspections.  As usual in the nuclear industry, TVA has called in an outside contractor to perform a safety culture survey, to “... find out if people feel empowered to raise safety concerns….”***  It may be interesting to see how people feel, but we believe it would be far more powerful and useful to analyze a significant sample of recent organizational decisions to determine whether they reflect an appropriate level of concern for safety.  Feelings (perceptions) are not a substitute for what is actually occurring in the decision process.

We have been working to develop ways to grade whether decisions support strong safety culture, including offering opportunities on this blog for readers to “score” actual plant decisions.  In addition we have highlighted the work of Constance Perin including her book, Shouldering Risks, which reveals the value of dissecting decision mechanics.  Perin’s observations about group and individual status and credibility and their implications for dissent and information sharing directly parallel Kahneman’s focus on the need to legitimize dissent.  We hope some of this thinking ultimately overcomes the current bias in nuclear organizations to reflexively turn to surveys and the inevitable retraining in safety culture principles.


*  "Daniel Kahneman on behavioral economics," McKinsey Quarterly video interview (May 2008).

** M. Popova, "The Anti-Gladwell: Kahneman's New Way to Think About Thinking," The Atlantic website (Nov. 1, 2011).

*** A. Smith, "Nuke plant inspections proceeding as planned," Athens [Ala.] News Courier website (Nov. 2, 2011).