Wednesday, November 23, 2011

Lawyering Up

When concerns are raised about the safety culture of an organization with very significant safety responsibilities, what’s one to do?  How about bringing in the lawyers?  That appears to be the news out of the Vit Plant* in Hanford, WA.  With considerable fanfare, Bechtel unveiled a new website devoted to its management of the vit plant.  The site provides an array of policies, articles, reports, and messages regarding safety and quality.

One of the major pieces of information on the site is a recent assessment of the state of safety culture at the vit plant.**  The conclusion of the assessment is quite positive: “Overall, we view the results from this assessment as quite strong, and similar to prior assessments conduct [sic] by the Project.” (p. 16)  The prior assessments were the 2008 and 2009 Vit Plant Opinion Surveys.

However, our readers may also recall that earlier this year the Defense Nuclear Facilities Safety Board (DNFSB) issued a report finding that the safety culture at the WTP is “flawed.”  In a previous post we quoted from the DNFSB report as follows:

“The HSS [DOE's Office of Health, Safety and Security] review of the safety culture on the WTP project 'indicates that BNI [Bechtel National Inc.] has established and implemented generally effective, formal processes for identifying, documenting, and resolving nuclear safety, quality, and technical concerns and issues raised by employees and for managing complex technical issues.'  However, the Board finds that these processes are infrequently used, not universally trusted by the WTP project staff, vulnerable to pressures caused by budget or schedule [emphasis added], and are therefore not effective.”

Thus the DNFSB clearly has a much different view of the state of safety culture at the vit plant than does DOE or Bechtel.  We note that the DNFSB report does not appear among the numerous references available at the new website.  Links to the original DOE report and the recent assessment are provided.  There is also a November 17, 2011 message to all employees from Frank Russo, Project Director,*** which introduces and summarizes the 2011 Opinion Survey on the project’s nuclear safety and quality culture (NSQC).  Neither the recent assessment nor the opinion survey addresses the issues raised by the DNFSB; it is as if the DNFSB review never happened.

What really caught our attention in the recent assessment is who wrote the report - a law firm.  The assessment was based on in-depth interviews of 121 randomly selected employees using a 19-question protocol (the report states that the protocol is attached; however, it is not included in the web version).  But the law firm did not actually conduct the interviews - “investigators” from the BSII internal audit department did so and took notes that were then provided to the lawyers.  This may give new meaning to the concept of “defense in depth”.

The same law firm also analyzed the results from the 2011 Opinion Survey.  In the message to employees, Russo asserts that the law firm has “substantial experience in interpreting [emphasis added] NSQC assessments”.  He goes on to say that the questions for the survey were developed by the WTP Independent Safety and Quality Culture Assessment (ISQCA) Team.  In our view, this executive-level team has without question “substantial experience” in safety culture.  Supposedly the ISQCA team was tasked with assessing the site’s culture - why then did it only develop the questions while a law firm interpreted the answers?  Strikes us as very odd.

We don’t know the true state of safety culture at the vit plant and, unfortunately, the work sponsored by vit plant management does little to provide such insight or to fully vet and respond to the serious deficiencies cited in the DNFSB assessment.  If we were employees at the plant, we would be eager to hear directly from the ISQCA team.

Reading the law firm’s report provides little comfort.  We have commented many times on the inherent limitations of surveys and interviews for soliciting attitudes and perceptions.  When the raw material is interview notes covering a small fraction of the employees, assessed by lawyers who were not present at the interviews, we become even more skeptical.  Several quotes from the report related to the Employee Concerns Program (ECP) illustrate our concern.

“The overwhelming majority of interviewees have never used ECP. Only 6.5% of the interviewees surveyed had ever used the program.  [Note: this means a total of nine interviewees.] There is a major difference between the views of interviewees with no personal experience with ECP and those who have used the program: the majority of the interviewees who have not used the program have a positive impression of the program, while more than half of the interviewees who have used the program have a negative impression of it.” (p. 5, emphasis added)

Our favorite quote from the report is the following:  “Two interviewees who commented on the [ECP] program appear to have confused it with Human Resources.” (p. 6)  One can only wonder whether the comments were favorable.

Eventually the report gets around to a conclusion that we probably could not say any better.  “We recognize that an interview population of nine employees who have used the ECP in the past is insufficient to draw any meaningful conclusions about the program.” (p. 17)
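
To put that caveat in numerical perspective, here is a back-of-the-envelope sketch (ours, not the report’s).  It uses a standard Wilson score interval for a binomial proportion; the assumed split of five negative impressions among the nine ECP users is our invention, chosen only to be consistent with “more than half.”

from math import sqrt

def wilson_interval(k, n, z=1.96):
    # 95% Wilson score confidence interval for a binomial proportion k/n
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical split: 5 of the 9 ECP users report a negative impression
low, high = wilson_interval(5, 9)
print(f"95% CI for the 'negative impression' rate: {low:.0%} to {high:.0%}")
# -> roughly 27% to 81%

An interval that wide is consistent with anything from a modest minority to an overwhelming majority, which is exactly why nine interviews cannot support conclusions about the program.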

We’re left with the following question: Why go about an assessment of safety culture in such an obtuse manner, one that is superficial in its “interpretation” of very limited data, laden with anecdotal material, and ultimately overreaching in its conclusions?


*  The "Vit Plant" is the common name for the Hanford Waste Treatment Plant (WTP).

**  Pillsbury Winthrop Shaw Pittman, LLP, "Assessment of a Safety Conscious Work Environment at the Hanford Waste Treatment Plant" (undated).  The report contains no information on when the interviews or analysis were performed.  Because a footnote refers to the 2009 Opinion Survey and a report addendum refers to an October 2010 DOE report, we assume the assessment was performed in early-to-mid 2010.

*** WTP Comm, "Message from Frank: 2011 NSQC Employee Survey Results" (Nov. 17, 2011).  

Friday, November 11, 2011

The Mother of Bad Decisions?

This is not about safety culture, but it’s nuclear related and, given our recent emphasis on decision-making, we can’t pass over it without commenting.

The steam generators (SGs) were recently replaced at Crystal River 3.  This was a large and complex undertaking, but SGs have been successfully replaced at many other plants.  The Crystal River project was more complicated because it required cutting an opening in the containment, but this, too, has been successfully accomplished at other plants.

The other SG replacements were all managed by one of two prime contractors, Bechtel or the Steam Generator Team (SGT).  However, to save a few bucks - $15 million, actually - Crystal River decided to manage the project itself.  (For perspective, the target cost for the prime contractor, exclusive of incentive fee, was $73 million.)  (Franke, Exh. JF-32, p. 8)*
 
Cutting the opening resulted in delamination of the containment: the outer 10 inches of concrete separated from the 42-inch-thick structure in an area near the opening.  Repair and replacement power costs are estimated at more than $2.5 billion.**  It’s not clear when the plant will be running again, if ever.
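
For scale, a minimal arithmetic sketch using only the figures cited above:

expected_saving = 15e6      # projected saving from self-managing the SGR project
estimated_loss = 2.5e9      # estimated repair plus replacement power costs
print(f"Saving as a share of the estimated loss: {expected_saving / estimated_loss:.1%}")
# -> 0.6%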

Progress Energy Florida (PEF), the plant owner, says insurance will cover most of the costs.  We’ll see.  But PEF also wants Florida ratepayers to pay.  PEF claims they “managed and executed the SGR [steam generator replacement] project in a reasonable and prudent manner. . . .”  (Franke, p. 3)

The delamination resulted from “unprecedented and unpredictable circumstances beyond PEF's control and in spite of PEF's prudent management. . . .” (Franke, p. 2)

PEF’s “root cause investigation determined that there were seven factors that contributed to the delamination. . . . These factors combined to cause the delamination during the containment opening activities in a complex interaction that was unprecedented and unpredictable.” [emphasis added]  (Franke, p. 27)***

This is an open docket, i.e., the Florida PSC has not yet determined how much, if anything, the ratepayers will have to pay.  Will the PSC believe that a Black Swan settled at the Crystal River plant?  Or is the word “hubris” more likely to come to mind?


* “Testimony & Exhibits of Jon Franke,” Fla. Public Service Commission Docket No. 100437-EI (Oct. 10, 2011).

**  I. Penn, “Cleaning up a DIY repair on Crystal River nuclear plant could cost $2.5 billion,” St. Petersburg Times via tampabay.com website (Oct. 9, 2011).  This article provides a good summary of the SG replacement project.

***  For the detail-oriented, “. . . the technical root cause of the CR3 wall delamination was the combination of: 1) tendon stresses; 2) radial stresses; 3) industry design engineering analysis inadequacies for stress concentration factors; 4) concrete strength properties; 5) concrete aggregate properties; and 6) the de-tensioning sequence and scope. . . . another factor, the process of removing the concrete itself, likely contributed to the extent of the delamination. . . .” From “Testimony & Exhibits of Garry Miller,” Fla. Public Service Commission Docket No. 100437-EI (Oct. 10, 2011), p. 5.

Wednesday, November 9, 2011

Ultimate Bonuses

Just when you think there is a lack of humor in the exposition of dry but critical issues such as risk management, our old friend Nassim Nicholas Taleb comes to the rescue.*  His op-ed piece in the New York Times** earlier this week has a subdued title, “End Bonuses for Bankers”, but includes some real eye-openers.  For example, Taleb cites (with barely concealed admiration) the ancient Code of Hammurabi, which protected homeowners by calling for the death of the builder if the home collapsed and killed the owner.  Wait, I thought we were talking about bonuses, not capital punishment.

What concerns Taleb is that bonus systems in entities that pose systemic risks almost universally encourage behaviors that may not be consistent with the public good, much less the long-term health of the business entity.  In short, he believes that bonuses provide an incentive to take risks.***  He states, “The asymmetric nature of the bonus (an incentive for success without a corresponding disincentive for failure) causes hidden risks to accumulate in the financial system and become a catalyst for disaster.”  Now just substitute “nuclear operations” for “the financial system”.
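
A toy model, with entirely made-up numbers, illustrates the asymmetry Taleb describes - when gains are rewarded but losses carry no personal penalty, the riskier choice can pay the manager more even though it is worse for the firm:

def expected_value(outcomes, probs):
    return sum(p * x for x, p in zip(outcomes, probs))

def expected_bonus(outcomes, probs, bonus_rate=0.1):
    # Bonus paid only on gains; losses are not clawed back
    return sum(p * max(0.0, x) * bonus_rate for x, p in zip(outcomes, probs))

safe  = ([10.0], [1.0])                 # steady outcome, no tail risk
risky = ([30.0, -30.0], [0.5, 0.5])     # lower expected value, fat downside

print("Firm expected value:    safe", expected_value(*safe), " risky", expected_value(*risky))
print("Manager expected bonus: safe", expected_bonus(*safe), " risky", expected_bonus(*risky))
# The firm prefers the safe project; the asymmetric bonus pays more for the risky one.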

Central to Taleb’s thesis is his belief that management has a large informational advantage over outside regulators and will always know more about the risks being taken within its operation.  This affords management the opportunity both to take on additional risk (say, to meet an incentive plan goal) and to camouflage the latent risk from regulators.

In our prior posts [here, here and here] on management incentives within the nuclear industry, we also pointed to the asymmetry of bonus metrics - the focus on operating availability and costs, the lack of metrics for safety performance, and the lack of any downside for failing to meet safety goals.  The concern was amplified by the increasing magnitude of nuclear executive bonuses, both in real terms and as a percentage of total compensation.

So what to do?  Taleb’s answer for financial institutions too big to fail is “bonuses and bailouts should never mix”; in other words, “end bonuses for bankers”.  Our answer is, “bonuses and nuclear safety culture should never mix”; “end bonuses for nuclear executives”.  Instead, gross up the compensation of nuclear executives to include the nominal level of expected bonuses.  Then let them manage nuclear operations using their best judgment to assure safety, unencumbered by conflicting incentives.


*  Taleb is best known for The Black Swan, a book focusing on the need to develop strategies, especially financial strategies, that are robust in the face of rare and hard-to-predict events.

**  N. Taleb, “End Bonuses for Bankers,” New York Times website (Nov. 7, 2011).

*** It is widely held that the 2008 financial crisis was exacerbated, if not caused, by executives making more risky decisions than shareholders would have thought appropriate. Alan Greenspan commented: “I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders” (Testimony to Congress, quoted in A. Clark and J. Treanor, “Greenspan - I was wrong about the economy. Sort of,” The Guardian, Oct. 23, 2008). The cause is widely thought to be the use of bonuses for performance combined with limited liability.  See also J.M. Malcomson, “Do Managers with Limited Liability Take More Risky Decisions? An Information Acquisition Model”, Journal of Economics & Management Strategy, Vol. 20, Issue 1 (Spring 2011), pp. 83–120.

Friday, November 4, 2011

A Factory for Producing Decisions

The subject of this post is Daniel Kahneman’s compelling insights into behavioral economics and how we think and make decisions.  Kahneman, a Nobel laureate, is one of the most influential thinkers of our time.  Two links are provided for readers who would like additional information.  One is a McKinsey Quarterly video interview* done several years ago; it runs about 17 minutes.  The second is a current review in The Atlantic** of Kahneman’s just-released book, Thinking, Fast and Slow.

Kahneman begins the McKinsey interview by suggesting that we think of organizations as “factories for producing decisions” and, therefore, think of decisions as a product.  This seems to make a lot of sense when applied to nuclear operating organizations - they are the veritable “River Rouge” of decision factories.  What may be unusual for nuclear organizations is the large percentage of decisions that directly or indirectly include safety dimensions - dimensions that can be uncertain or highly judgmental and that often conflict with other business goals.  So nuclear organizations have to deliver two products: competitively priced megawatts and decisions that preserve adequate safety.

To Kahneman, treating decisions as a product logically raises the issue of quality control as a means to ensure their quality.  At one level, quality control might focus on mistakes and ensuring that decisions avoid the recurrence of mistakes.  But Kahneman sees the quality function going further, into the psychology of the decision process, to ensure, e.g., that the best information is available to decision makers, that the talents of the group surrounding the ultimate decision maker are used effectively, and that the decision-making environment is unbiased.

He notes that there is an enormous amount of resistance within organizations to improving decision processes.  People naturally feel threatened if their decisions are questioned or second-guessed, so it may be very difficult or even impossible to improve the quality of decisions if the leadership feels too threatened.  Are there ways around this?  Kahneman suggests the “premortem” (think of it as the analog to a postmortem).  When a decision is being formulated (not yet made), convene a group meeting with the following premise: it is a year from now, we have implemented the decision under consideration, and it has been a complete disaster.  Have each individual write down “what happened?”

The objective of the premortem is to legitimize dissent and minimize the innate “bias toward optimism” in decision analysis.  It is based on the observation that as organizations converge toward a decision, dissent becomes progressively more difficult and costly, and people who warn or dissent can be viewed as disloyal.  The premortem essentially sets up a competitive situation to see who can come up with the flaw in the plan; in essence, everyone takes on the role of dissenter.  Kahneman’s belief is that the process will yield new insights that may not change the decision but will lead to adjustments that make it more robust.

Kahneman’s ideas about decisions resonate with our thinking that the most useful focus for nuclear safety culture is the quality of organizational decisions.  They also contrast with a recent instance of a nuclear plant that has run afoul of the NRC (Browns Ferry) and is now tagged with a degraded cornerstone and increased inspections.  As usual in the nuclear industry, TVA has called on an outside contractor to come in and perform a safety culture survey, to “... find out if people feel empowered to raise safety concerns….”***  It may be interesting to see how people feel, but we believe it would be far more powerful and useful to analyze a significant sample of recent organizational decisions to determine whether they reflect an appropriate level of concern for safety.  Feelings (perceptions) are not a substitute for what is actually occurring in the decision process.

We have been working to develop ways to grade whether decisions support a strong safety culture, including offering opportunities on this blog for readers to “score” actual plant decisions.  In addition, we have highlighted the work of Constance Perin, including her book, Shouldering Risks, which reveals the value of dissecting decision mechanics.  Perin’s observations about group and individual status and credibility, and their implications for dissent and information sharing, directly parallel Kahneman’s focus on the need to legitimize dissent.  We hope some of this thinking ultimately overcomes the current bias in nuclear organizations to reflexively turn to surveys and the inevitable retraining in safety culture principles.
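
To make the notion of “scoring” decisions a bit more concrete, here is a purely hypothetical sketch; the criteria and weights are our invention for illustration and do not represent an actual grading instrument:

# Hypothetical criteria and weights, invented for illustration only
CRITERIA = {
    "safety_impact_analyzed":     0.4,   # was the safety significance explicitly weighed?
    "dissent_solicited":          0.3,   # were contrary views sought and documented?
    "free_of_schedule_pressure":  0.3,   # was the decision insulated from schedule/cost pressure?
}

def score_decision(ratings):
    # Weighted score of a single decision; each rating runs from 0.0 (poor) to 1.0 (strong)
    return sum(weight * ratings.get(name, 0.0) for name, weight in CRITERIA.items())

example = {"safety_impact_analyzed": 0.8, "dissent_solicited": 0.5, "free_of_schedule_pressure": 0.3}
print(f"Decision score: {score_decision(example):.2f} out of 1.00")

Applied across a sample of recent decisions, even a rough rubric like this would say more about the actual decision process than another round of perception surveys.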


*  "Daniel Kahneman on behavioral economics," McKinsey Quarterly video interview (May 2008).

** M. Popova, "The Anti-Gladwell: Kahneman's New Way to Think About Thinking," The Atlantic website (Nov. 1, 2011).

*** A. Smith, "Nuke plant inspections proceeding as planned," Athens [Ala.] News Courier website (Nov. 2, 2011).