Saturday, June 29, 2013

Timely Safety Culture Research

In this post we highlight the doctoral thesis paper of Antti Piirto, “Safe Operation of Nuclear Power Plants – Is Safety Culture an Adequate Management Method?”*  One reason for our interest is the author’s significant background in nuclear operations.**  Thus his paper has academic weight but is informed by direct management experience.

It would be impossible to credibly summarize all of the material and insights from this paper as it covers a wide swath of safety management, safety culture, and associated research.  The pdf runs 164 pages.  In this post we provide an overview of the material with pointers to some aspects that seem most interesting to us.


The paper is developed from Piirto’s view that “Today there is universal acceptance of the significant impact that management and organisational factors have over the safety significance of complex industrial installations such as nuclear power plants. Many events with significant economic and public impact had causes that have been traced to management deficiencies.” (p. i)  It provides a comprehensive and useful overview of the development of safety management and safety culture thinking and methods, noting that all too often efforts to enhance safety are reactive.

“For many years it has been considered that managing a nuclear power plant was mostly a matter of high technical competence and basic managerial skills.” (p. 3)  And we would add, in many quarters there is a belief that safety management and culture simply flow from management “leadership”.  While leadership is an important ingredient in any management system, its inherent fuzziness leaves a significant gap in efforts to systematize methods and tools to enhance performance outcomes.  Again citing Piirto, safety culture is “especially vague to those carrying out practical safety work. Those involved...require explanations concerning how safety culture will alter their work.” (p. 4)

Piirto also cites the prevalence in the nuclear industry of “unilateral thinking” and the lack of exposure to external criticism of current nuclear management approaches, accompanied by “homogeneous managerial rhetoric”. (p. 4)

“Safety management at nuclear power plants needs to become more transparent in order to enable us to ensure that issues are managed correctly.” (p. 6)  “Documented safety thinking provides the organisation with a common starting point for future development.” (p. 8)  Transparency and the documentation (preservation) of safety thinking resonates with us.  When forensic efforts have been made to dissect safety thinking (e.g., see Perin’s book Shouldering Risks) it is apparent how illuminating and educational such information can be.

Culture as Control Mechanism

Piirto describes organizational culture as “...a socially constructed, unseen, and unobservable force behind organisational activities.” (p. 13)  “It functions as an organisational control mechanism, informally approving or prohibiting behaviours.” (p. 14)

We would say that in terms of a control mechanism, culture’s effect should be clarified as being one of perhaps many mechanisms that ultimately combine to determine actual behavior.  In our conceptual model safety culture specifically can be thought of as a resistance to other non-safety pressures affecting people and their actions.  (See our post dated June 29, 2012.)  Piirto calls culture a “powerful lever” for guiding behavior. (p. 15)  The stronger the resident safety culture, the more leverage it has to keep other pressures in check.  However, it is almost inevitable that some amount of non-safety pressure will compromise the control leverage of safety culture and perhaps lead to undesired outcomes.

Some of Piirto’s most useful insights can be found on p. 14 where he explains that culture at its essence is “a concept rather than a thing” - and a concept created in people’s minds.  We like the term “mental model” as well.  He cautions that culture is not just a set of structural elements or constructs - “It also is a dynamic process – a social construction that is undergoing continual reconstruction.”  Perhaps another way of saying this is that culture cannot be understood apart from its application within an organization.  We think this is a significant weakness of culture surveys, which tend to ask questions in the abstract, e.g., “Is safety a high priority?”, versus exploring precisely how safety priorities are exercised in specific decisions and actions of the organization.

Piirto reviews various anthropologic and sociologic theories of culture including debate about whether culture is a dependent or independent variable (p.18), the origins of safety culture, and culture surveys. (pp. 23-24)

Some other interesting content can be found starting at Section 2.2.7 (p. 29) where Piirto reviews approaches to the assessment of safety culture, which really amounts to asking what the practical reality associated with a culture is.  He notes that “the correlation between general preferences and specific behaviour is rather modest” and that “The Situational Approach suggests that the emphasis should be put on collecting data on actual practices, real dilemmas and decisions (what is also called “theories in use”) rather than on social norms.” (p. 29)

Knowledge Management and Training

Starting on p. 39 is a very useful discussion of Knowledge Management including its inherently dynamic nature.  Knowledge Management is seen as being at the heart of decision making and in assessing options for action.

In terms of theories of how people behave, there are two types, “...the espoused theory, or how people say they act, and the theory-in-use, or how people actually act. The espoused theory is easier to understand. It describes what people think and believe and how they say they act. It is often on a conscious level and can be easily changed by new ideas and information. However, it is difficult to be aware of the theory-in-use, and it is difficult to change...” (p. 46)

At this juncture we would like to have seen a closer connection between the discussions of Knowledge Management and safety management.  True, ensuring that individuals have the benefit of preserving, sustaining and increasing knowledge is important, but how exactly does that get reflected in safety management performance?  Piirto does draw an analogy to systematic approaches to training, proposing that a similar approach would benefit safety management by documenting how safety is related to practical work.  “This would turn safety culture into a concrete tool. Documented safety thinking provides the organisation with a common starting point for future development.” (p. 61)

One way to document safety thinking is through event investigation.  Piirto observes, “Event investigation is generally an efficient starting point for revealing the complex nature of safety management. The context of events reveals the complex interaction between people and technology in an organisational and cultural context. Event investigations should not only focus on events with high consequences; in most complex events a thorough investigation will reveal basic causes of great interest, particularly at the safety management level. Scientific studies of event investigation techniques and general descriptions of experience feedback processes have had a tendency to regard event investigations as too separated from a broader safety management context.”  (p. 113)

In the last sections of the paper Piirto summarizes the results of several research projects involving training and assessment of training effectiveness, knowledge management and organizational learning.  Generally these involve the development and training of shift personnel.

Take Away

Ultimately I’m not sure that the paper provides a simple answer to the question posed in its title: Is safety culture an adequate management method?  Purists would probably observe that safety culture is not a management method; on the other hand I think it is hard to ignore the reliance being placed by regulatory bodies on safety culture to help assure safety performance.  And much of this reliance is grounded in an “espoused theory” of behavior rather than a systematic, structured and documented understanding of actual behaviors and associated safety thinking.  Such “theory in use” findings would appear to be critical in connecting expectations for values and beliefs to actual outcomes.  Perhaps the best lesson offered in the paper is that there needs to be a much better overall theory of safety management that links cultural, knowledge management and training elements.

*  A. Piirto,  “Safe Operation of Nuclear Power Plants – Is Safety Culture an Adequate Management Method?” thesis for the degree of Doctor of Science in Technology (Tampere, Finland: Tampere Univ. of Technology, 2012).

**  Piirto has a total of 36 years in different management and supervision tasks in a nuclear power plant organization, including twelve years as the Manager of Operation for the Olkiluoto nuclear power plant.

Wednesday, June 26, 2013

Dynamic Interactive Training

The words dynamic and interactive always catch our attention as they are intrinsic to our world view of nuclear safety culture learning.  Carlo Rusconi’s presentation* at the recent IAEA International Experts’ Meeting on Human and Organizational Factors in Nuclear Safety in the Light of the Accident at the Fukushima Daiichi Nuclear Power Plant in Vienna in May 2013 is the source of our interest.

While much of the training described in the presentation appeared to be oriented to the worker level and the identification of workplace type hazards and risks, it clearly has implications for supervisory and management levels as well.

In the first part of the training students are asked to identify and characterize safety risks associated with workplace images.  For each risk they assign an index based on perceived likelihood and severity.  We like the parallel to our proposed approach for scoring decisions according to safety significance and uncertainty.**
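The likelihood-severity index the students assign could be sketched roughly as follows.  This is our own minimal construction for illustration; Rusconi's presentation does not specify the actual scales or weighting used in the course.

```python
# A minimal sketch of a likelihood-severity risk index.  The three-point
# scales and the product form are our assumptions; the presentation does
# not give Rusconi's actual scheme.

LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
SEVERITY = {"minor": 1, "serious": 2, "critical": 3}

def risk_index(likelihood: str, severity: str) -> int:
    """Combine perceived likelihood and severity into a single ordinal index."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

# Note that very different hazards can land on the same index...
print(risk_index("likely", "minor"))     # 3
print(risk_index("rare", "critical"))    # 3
# ...while a hazard that is both likely and critical clearly dominates.
print(risk_index("likely", "critical"))  # 9
```

The product form is the common risk-matrix convention, but as the equal scores above suggest, it can mask the distinction between frequent nuisances and rare catastrophes - a distinction that matters later in this discussion.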

“...the second part of the course is focused on developing skills to look in depth at events that highlight the need to have a deeper and wider vision of safety, grasping the explicit and implicit connections among technological, social, human and organizational features. In a nutshell: a systemic vision.” (slide 13, emphasis added)  As part of the training students are exposed to the concepts of complexity, feedback and internal dynamics of a socio-technical system.  As the author notes, “The assessment of culture within an organization requires in-depth knowledge of its internal dynamics”.  (slide 15)

This part of the training is described as a “simulation” as it provides the opportunity for students to simulate the performance of an investigation into the causes of an actual event.  Students are organized into three groups of five persons to gain the benefit of collective analysis within each group, followed by sharing of results across groups.  We see this as particularly valuable as it helps build common mental models and facilitates integration across individuals.  Finally, the training session takes the students’ results and compares them to the outcomes from a panel of experts.  Again we see a distinct parallel to our concept of having senior management within the nuclear organization pre-analyze safety issues to establish reference values for safety significance, uncertainty and preferred decisions.  This provides the basis to compare trainee outcomes for the same issues and ultimately to foster alignment within the organization.

Thank you Dr. Rusconi.

*  C. Rusconi, “Interactive training: A methodology for improving Safety Culture,” IAEA International Experts’ Meeting on Human and Organizational Factors in Nuclear Safety in the Light of the Accident at the Fukushima Daiichi Nuclear Power Plant, Vienna May 21-24, 2013.

**  See our blog posts dated April 9 and June 6, 2013.  We also remind readers of Taleb’s dictate to decision makers to focus on consequences versus probability in our post dated June 18, 2013.

Tuesday, June 25, 2013

Regulatory Creep

The NRC's assessment of safety culture (SC) is an example of regulatory creep.  It began with the requirement that licensees determine whether specific safety-related performance problems or cross-cutting issues were caused, in whole or in part, by SC deficiencies.  Then the 2011 SC Policy Statement attempted to put a benign face on NRC intrusiveness because a policy statement is not a regulation.  However, licensees are “expected” to comply with the policy statement's goals and guidance; the NRC “expectations” become de facto regulations.

We have griped about this many times.*  But why does regulatory creep occur?  Is it inevitable?  We'll start with some background then look at some causes.

In the U.S., Congress passes and the President approves major legislative acts.  These are top-level policy statements characterized by lofty goals and guiding principles.  Establishing the detailed rules (which have the force of law) for implementing these policies falls to government bureaucrats in regulatory agencies.  There are upwards of 50 such agencies in the federal government, some part of executive branch departments (headed by a Cabinet level officer), others functioning independently, i.e., reporting to Congress with the President appointing, subject to Congressional approval, their governing boards (commissioners).  The NRC is one of the independent federal regulatory agencies.

Regulatory rules are proposed and approved following a specified, public process.  But once they are in place, multiple forces can lead to the promulgation of new rules or an expanded interpretation or application of existing rules (creep).  The forces for change can arise either inside or outside the agency.  Internal forces include the perceived need to address new issues, real or imagined; a fear of losing control as the regulated entities adapt and evolve; or a generalized drive to expand regulatory authority.  Even bureaucrats can have a need for more power or a larger budget.

External forces include interest groups (and their lobbyists), members of Congress who serve on oversight committees, highly motivated members of the public or the agency's own commissioners.  We classify commissioners as external because they are not really part of an agency; they are political appointees of the President, who has a policy agenda.  In addition, a commissioner may owe a debt or allegiance to a Congressional sponsor who promoted the commissioner's appointment.

Given all the internal and external forces, it appears that new rules and regulatory creep are inevitable absent the complete capture of the agency by its nominally regulated entities.  Creep means a shifting boundary of what is required, what is allowed, what is tolerated and what will be punished—without a formal rule making.  The impact of creep on the regulated entities is clear: increased uncertainty and cost.  They may not care for increased regulatory intrusiveness but they know the penalty may be high if they fail to comply.  When regulated entities perceive creep, they must make a business decision: comply or fight.  They often choose to comply simply because if they fight and lose, they risk even more punitive formal regulation and higher costs.  If they fight and win, they risk alienating career bureaucrats who will then wait for an opportunity to exact retribution.  A classic lose-lose situation.

Our perspective

Years ago I took a poli-sci seminar where the professor said public policy forces could be boiled down to: Who's mad?  How mad?  And who's glad?  How glad?  I sometimes refer to that simple mental model when I watch the ongoing Kabuki between the regulator, its regulated entities and many, many political actors.  Regulatory creep is one of the outcomes of such dynamics.

*  For related posts, click the "Regulation of Safety Culture" label.

Regulatory creep is not confined to the NRC.  The motivation for this post was an item forwarded by a reader on reported Consumer Product Safety Commission (CPSC) activity.  Commenting on a recent settlement, a CPSC Commissioner “expressed concern that . . . the CPSC had insisted on a comprehensive compliance program absent evidence of widespread noncompliance and that ‘the compliance program language in [the] settlement is another step toward just such a de facto rule.’”  C.G. Thompson, “Mandated Compliance Programs as the New Normal?” American Conference Institute blog.  Retrieved June 6, 2013.

Tuesday, June 18, 2013

The Incredible Shrinking Nuclear Industry

News came last week that the San Onofre units will permanently shut down, joining Crystal River 3 (CR3) and Kewaunee as the latest early retirees and completing a nuclear bad-news trifecta.  This is distressing on many fronts, not the least of which is the loss of jobs for thousands of highly qualified nuclear personnel, and perhaps the suggestion of a larger trend.  Almost as distressing is NEI's characterization of San Onofre as a unique situation (as were CR3 and Kewaunee, by the way) and its placing primary blame on the NRC.*  Really?  The more useful question to ponder is what decisions led up to the need for plant closures and whether there is a common denominator.

We can think of one: decisions that failed to adequately account for the “tail” of the risk distribution where outcomes, albeit of low probability, carry high consequences.  On this score checking in with Nick Taleb is always instructive.  He observes “This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty.”**  Consider the pivotal decision in each case:
  • For Kewaunee the decision to purchase the plant with a power purchase agreement (PPA) that extended only for eight years;
  • For CR3, the decision to undertake cutting the containment with in-house expertise;
  • For SONGS, the decision to purchase and install new design steam generators from a vendor working beyond its historical experience envelope.
Whether the decision makers understood this, or even imagined that their decisions included the potential to lose the plants, the results speak for themselves.  These people were in Black Swan and fat tail territory and didn’t realize it.  Let’s look at a few details.
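Taleb's distinction can be made concrete with a toy comparison.  All names and numbers below are hypothetical illustrations (loosely inspired by the CR3 figures discussed later in this post), not the actual analyses the utilities performed.

```python
# Toy contrast between a probability-weighted screen and a Taleb-style
# consequence-focused screen.  The functions, probabilities and dollar
# figures are hypothetical assumptions for illustration only.

def expected_value(p_failure: float, failure_cost: float, savings: float) -> float:
    """Conventional screen: hinges on a failure probability you can't really know."""
    return savings - p_failure * failure_cost

def survivable(failure_cost: float, max_tolerable_loss: float) -> bool:
    """Consequence screen: can the enterprise absorb the worst case at all?"""
    return failure_cost <= max_tolerable_loss

# With a comfortably small probability estimate, self-managing a project
# to save $15M looks fine...
print(expected_value(p_failure=0.001, failure_cost=1_500_000_000,
                     savings=15_000_000))  # 13500000.0

# ...but the consequence screen asks a different question entirely.
print(survivable(failure_cost=1_500_000_000,
                 max_tolerable_loss=500_000_000))  # False
```

The point of the sketch: the first screen collapses as soon as the probability estimate is wrong, while the second never needed the probability in the first place.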


Kewaunee

Many commentators at this point are writing off the Kewaunee retirement based on the miracle of low gas prices.  Dominion cites gas prices and the inability to acquire additional nuclear units in the upper Midwest to achieve economies of scale.  But there is a far greater misstep in the story.  When Dominion purchased Kewaunee from Wisconsin Public Service in 2005, a PPA was included as part of the transaction.  This is an expected and necessary part of such a deal as it establishes set prices for the sale of the plant’s output for a period of time.  A key consideration in structuring deals such as this is not only the specific pricing terms for the asset and the PPA, but the duration of the PPA.  In the case of Kewaunee the PPA ran for only eight years, through December 2013.  After eight years Dominion would have to negotiate another PPA with the local utilities or others, or sell into the market.  The question is: when buying an asset with a useful life of 28 years (with grant of the 20-year license extension), why would Dominion be OK with just an 8-year PPA?  Perhaps Dominion assumed that market prices would be higher in 8 years and wanted to capitalize on those higher prices.  Opponents to the transaction believed this to be the case.***  The prevailing expectation at the time was that demand would continue along with appropriate pricing necessary to accommodate current and planned generating units.  But the economic downturn capped demand and left a surplus of baseload capacity.  Local utilities faced with the option of negotiating a PPA for Kewaunee - or thinning the field and protecting their own assets - did what was in their interest.

The reality is that Dominion rolled the dice on future power prices.  Interestingly, in the same time frame, 2007, the Point Beach units were purchased by NextEra Energy Resources (formerly FPL Energy).  In this transaction PPAs were negotiated through the end of the extended license terms of the units, 2030 and 2033, providing the basis for a continuing and productive future.

Crystal River 3

In 2009 Progress Energy undertook a project to replace the steam generators in CR3.  As with some other nuclear plants this necessitated cutting into the containment to allow removal of the old generators and placement of the new. 

Apparently just two companies, Bechtel and SGT, had managed all the previous 34 steam generator replacement projects at U.S. nuclear power plants. Of those, at least 13 had involved cutting into the containment building. All 34 projects were successful.

For the management portion of the job, Progress got bids from both Bechtel and SGT.  The lowest was from SGT, but Progress opted to self-manage the project to save an estimated $15 million.  During the containment cutting process, delamination of concrete occurred in several places.  Subsequently an outside engineering firm hired to do the failure analysis stated that cutting the steel tensioning bands in the sequence done by Progress Energy, along with removal of the concrete, had caused the containment building to crack.  Progress Energy disagreed, stating the cracks “could not have been predicted”.  (See Taleb’s view on uncertainty above.)

“Last year, the PSC endorsed a settlement agreement that let Progress Energy refund $288 million to customers in exchange for ending a public investigation of how the utility broke the nuclear plant.”****

When it came time to assess how to fix the damage, Progress Energy took a far more conservative and comprehensive approach.  They engaged multiple outside consultants and evaluated numerous possible repair options.  After Duke Energy acquired Progress, Duke engaged an independent, third-party review of the engineering and construction plan developed by Progress.  The independent review suggested that the cost was likely to be almost $1.5 billion. However, in the worst-case scenario, it could cost almost $3.5 billion and take eight years to complete.   “...the [independent consultant] report concluded that the current repair plan ‘appears to be technically feasible, but significant risks and technical issues still need to be resolved, including the ultimate scope of any repair work.’"*****  Ultimately consideration of the potentially huge cost and schedule consequences caused Duke to pull the plug.  Taleb would approve.

San Onofre

Southern California Edison undertook a project to replace its steam generators almost 10 years ago.  It decided to contract with Mitsubishi Heavy Industries (MHI) to design and construct the generators.  This would be new territory for Mitsubishi in terms of the size of the generators and design complexity.  Following installation and operation for a period of time, tube leakage occurred due to excessive vibrations.  The NRC determined that the problems in the steam generators were associated with errors in MHI's computer modeling, which led to underestimation of thermal hydraulic conditions in the generators.

“Success in developing a new and larger steam generator design requires a full understanding of the risks inherent in this process and putting in place measures to manage these risks….Based upon these observations, I am concerned that there is the potential that design flaws could be inadvertently introduced into the steam generator design that will lead to unacceptable consequences (e.g., tube wear and eventually tube plugging). This would be a disastrous outcome for both of us and a result each of our companies desire to avoid. In evaluating this concern, it would appear that one way to avoid this outcome is to ensure that relevant experience in designing larger sized steam generators be utilized. It is my understanding the Mitsubishi Heavy Industries is considering the use of Westinghouse in several areas related to scaling up of your current steam generator design (as noted above). I applaud your effort in this regard and endorse your attempt to draw upon the expertise of other individuals and company's to improve the likelihood of a successful outcome for this project.”#

Unfortunately these concerns raised by SCE came after letting the contract to Mitsubishi.  SCE placed (all of) its hopes on improving the likelihood of a successful outcome at the same time stating that a design flaw would be “disastrous”.  They were right about the disaster part.

Take Away

These are cautionary tales on a significant scale.  Delving into how such high-risk (technical and financial) decisions were made and turned out so badly could provide useful lessons learned.  That doesn’t appear likely, given the interests of the parties and the inconsistency of such inquiries with the industry predicate of operational excellence.

With regard to our subject of interest, safety culture, the dynamics of safety decisions are subject to similar issues and bear directly on safety outcomes.  Recall that in our recent posts on implementing safety culture policy, we proposed a scoring system for decisions that includes the safety significance and uncertainty associated with the issue under consideration.  The analog to Taleb’s “central idea of uncertainty” is intentional and necessary.  Taleb argues you can’t know the probability of consequences.  We don’t disagree but as a “known unknown” we think it is useful for decision makers to recognize how uncertain the significance (consequences) may be and calibrate their decision accordingly.
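The scoring idea could be sketched as follows.  This is a hypothetical illustration only; the scales, the additive combination and the review threshold are our assumptions here, not the specific scheme from our earlier posts.

```python
# Sketch of scoring decisions by safety significance and uncertainty, then
# flagging high scorers for extra scrutiny (e.g., independent review).
# Scales, the additive score and the threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Decision:
    description: str
    significance: int  # 1 (low) .. 5 (high) safety significance
    uncertainty: int   # 1 (well understood) .. 5 (highly uncertain)

def needs_independent_review(d: Decision, threshold: int = 7) -> bool:
    """Flag decisions whose combined score warrants independent review."""
    return d.significance + d.uncertainty >= threshold

sg_replacement = Decision("Scaled-up steam generator design from a new vendor", 5, 4)
routine_swap = Decision("Like-for-like valve replacement", 2, 1)

print(needs_independent_review(sg_replacement))  # True
print(needs_independent_review(routine_swap))    # False
```

Treating uncertainty as an explicit input, rather than burying it in a probability estimate, is what keeps the scoring consistent with Taleb's "central idea of uncertainty" discussed above.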

*  “Of course, it’s regrettable...Crystal River is closing, the reasons are easy to grasp, and they are unique to the plant. Even San Onofre, which has also been closed for technical reasons (steam generator problems there), is quite different in specifics and probable outcome. So – unfortunate, yes; a dire pox upon the industry, not so much.”  NEI Nuclear Notes (Feb. 7, 2013).  Retrieved June 17, 2013.  For the NEI/SCE perspective on regulatory foot-dragging and uncertainty, see W. Freebairn et al, "SoCal Ed to retire San Onofre nuclear units, blames NRC delays," Platts (June 7, 2013).  Retrieved June 17, 2013.  And "NEI's Peterson discusses politics surrounding NRC confirmation, San Onofre closure," Environment & Energy Publishing OnPoint (June 17, 2013).  Retrieved June 17, 2013.

**  N. Taleb, The Black Swan (New York: Random House, 2007), p. 211.  See also our post on Taleb dated Nov. 9, 2011.

***  The Customers First coalition that opposed the sale of the plant in 2004 argued: “Until 2013, a complex purchased-power agreement subject to federal jurisdiction will replace PSCW review. After 2013, the plant’s output will be sold at prices that are likely to substantially exceed cost.”  Customers First!, "Statement of Position: Proposed Sale of the Kewaunee Nuclear Power Plant April 2004" (April, 2004).  Retrieved June 17, 2013.

****  R. Trigaux, "Who's to blame for the early demise of Crystal River nuclear power plant?" Tampa Bay Times (Feb. 5, 2013).  Retrieved Jun 17, 2013.  We posted on CR3's blunder and unfolding financial mess on Nov. 11, 2011.

*****  "Costly estimates for Crystal River repairs," World Nuclear News (Oct. 2, 2012).  Retrieved June 17, 2013.

#  D.E. Nunn (SCE) to A. Sawa (Mitsubishi), "Replacement Steam Generators San Onofre Nuclear Generating Station, Units 2 & 3" (Nov. 30, 2004).  Copy retrieved June 17, 2013 from U.S. Senate Committee on Environment & Public Works, attachment to Sen. Boxer's May 28, 2013 press release.

Friday, June 14, 2013

Meanwhile, Back at the Vit Plant

Previous posts* have chronicled the safety culture (SC) issues raised at the Waste Treatment and Immobilization Plant (WTP aka the Vit plant) at the Department of Energy's (DOE's) Hanford site.  Both the DOE Office of River Protection (ORP) and the WTP contractor (Bechtel) have been under the gun to strengthen their SC.  On May 30, 2013 DOE submitted a progress report** to the Defense Nuclear Facilities Safety Board covering both DOE and Bechtel activities.


Based on an assessment by an internal SC Integrated Project Team (IPT), ORP reported its progress on nine near-term SC improvement actions contained in the ORP SC Improvement Plan.  For each action, the IPT assessed degree of implementation (full, partial or none) and effectiveness (full, partial, or indeterminate).  The following table summarizes the actions and current status.

ORP has a lot of activities going on but only two are fully implemented and none is yet claimed to be fully effective.  In ORP's own words, “ORP made a substantial start toward improving its safety culture, but much remains to be done to demonstrate effective change. . . . Four of the nine actions were judged to be partially effective, and the other five were judged to be of indeterminate effectiveness at the time of evaluation due to the recent completion of some of the actions, and because of the difficulty in measuring safety culture change over a one-year time period.” (Smith, p. 1)

The top-level ORP actions look substantive but digging into the implementation details reveals many familiar tactics for addressing SC problems: lots of training (some yet to be implemented), new or updated processes and procedures, (incomplete) distribution of INPO booklets, and the creation of a new behavioral expectations poster (which is largely ignored).

SC elements have been added to senior management and supervisor performance plans.  That appears to mean these folks are supposed to periodically discuss SC with their people.  There's no indication whether such behavior will be included in performance review or compensation considerations.

ORP did attempt to address concerns with the Differing Professional Opinion (DPO) process.  DPO and Employee Concerns Program (ECP) training was conducted but some employees reported reservations about both programs.

A new issues management system has been well received by employees but needs greater promotion by senior managers to increase employees' willingness to raise issues and ask questions.  The revised ECP also needs increased senior management support.

The team pointed out that ORP does not have a SC management statement or policy.


Bechtel

There is much less detail available here.  The report says Bechtel's plan “contains 50 actions broken into six strategic improvement areas:

A. Realignment and Maintenance of Design and Safety Basis
B. Management Processes of the WTP NSQC
C. Timeliness of Issues Identification
D. Resolution, Roles, Responsibilities, Authorities, and Accountabilities
E. Management and Supervisory Behaviors
F.  Construction Site-Unique Issues

“The scheduled completion date for the last actions is December 2013. Twenty-seven actions were complete as of March 31, 2013, with an additional 12 planned to be complete by June 30, 2013.” (p. 19)

“ORP has completed surveillances on 19 of the 27 completed actions identifying 7 opportunities for improvement.  Because changing an organization's culture takes time, the current oversight efforts are focused on verifying actions have been completed.” (ibid.)  In other words, there has been no evaluation of the effectiveness of Bechtel's actions.

Our perspective

The ORP program is a traditional approach aimed at incremental organizational performance improvement.  There is little or no mention of what we'd call strategic concerns, e.g., recognizing and addressing schedule/budget/safety goal conflicts; decision making in a complex, dynamic environment with many external pressures; riding herd on Bechtel; or creating a sense of urgency with respect to SC.

The most surprising thing to us was how candid the assessment was (for one produced by an employee team) in describing the program's impact to date.  For example, as the IPT performed its assessment, it tried to determine if employees were aware of the SC actions or their effects.  The results were mixed: some employees see changes but many don't, or they sense a general change but are unaware of specifics, e.g., new or changed procedures.  In general, organizational emphasis on SC declined over the year and was not very visible to the average employee.

The team's most poignant item was a direct appeal for personal involvement by the ORP manager in the SC program.  That tells you everything you need to know about SC's priority at ORP.

*  Click on Vit Plant under Labels to see previous posts.

**  M. Moury (DOE) to P.S. Winokur (DNFSB), DOE completes Action 1-9 of the Department's Implementation Plan for DNFSB Recommendation 2011-1, Safety Culture at the Waste Treatment and Immobilization Plant (May 30, 2013).  A status summary memo from ORP's K.W. Smith and the IPT report are attached to the Moury letter.  Our thanks to Bill Mullins for bringing these documents to our attention.

Wednesday, June 12, 2013

McKinsey Quarterly Report on Decision Making Styles

A brief article* in the April McKinsey Quarterly describes a piece of early-stage academic research into different individuals' decision making styles at work.

This is not rigorous social science.  The 5,000 survey participants were self-selected readers of the McKinsey Quarterly and Harvard Business Review.  Survey responses showed a range of decision making preferences, from largely intuitive to exhaustive deliberation.  Further analysis identified five different types of decision-makers.  Each type has exposure to certain decision making risks based on the decision-maker's preference for, say, moving ahead quickly vs. lengthy analysis.  In other words, each type exhibits certain biases.

A practical application of this typology is to see which type best describes two very important people: you and your boss.  Self assessment is always valuable to identify current strengths and improvement opportunities.  Boss assessment may reveal why your boss sees things differently from you, and suggest ways you can support and complement your boss to help you both become more successful at work.

*  D. Lovallo and O. Sibony, “Early-stage research on decision-making styles,” McKinsey Quarterly (April 2013).  Retrieved June 11, 2013.  A pop-out button is on the right side of the text, about half-way down the article; pushing the button opens a slide show of the different decision making types.  A pdf of the article can be downloaded if one registers (free) at the site.

Thursday, June 6, 2013

Implementing Safety Culture Policy Part 2

This post continues our discussion of the implementation of safety culture policy in day-to-day nuclear management decision making, started in our post dated April 9, 2013.  In that post we introduced several parameters for quantitatively scoring decisions: decision quality, safety significance and significance uncertainty.  We have since relabeled the decision quality parameter; going forward we will call it “decision balance”.

To illustrate the application of the scoring method we used a set of twenty decisions based on issues taken from actual U.S. nuclear operating experience, typically issues that were reported in LERs.  As a baseline, we scored each issue for safety significance and uncertainty.  Each issue identified 3 to 4 decision options for addressing the problem, and each option was annotated with the potential impacts of the decision on budgets, generation (e.g., potential outage time) and the corrective action program.  We scored each decision option for its decision balance (how well the decision option balances safety priority) and then identified the preferred decision option for each issue.  This constitutes what we refer to as the “preferred decision set”.  A pdf file of one example issue with decision choices and scoring inputs is available here.

Our assumption is that the preferred decision set would be established/approved by senior management based on their interpretation of the issues and their expectations for how organizational decisions should reflect safety culture.  The set of issues would then be used in a training environment for appropriate personnel.  For purposes of this example, we incorporated the preferred decision set into our NuclearSafetySim* simulator to illustrate the possible training experience.  The sim provides an overall operational context, tracking performance for cost, plant generation and the corrective action program (CAP), and incorporating performance goals and policies.

Chart 1
In the sim application a trainee would be tasked with assessing an issue every three months over a 60 month operational period.  The trainee would do this while attempting to manage performance results to achieve specified goals.  For each issue the trainee would review the issue facts, assign values for significance and uncertainty, and select a decision option.  Chart 1 compares the actual decisions (those by the trainee) to those in the preferred set for our prototype session.   Note that approximately 40% of the time the actual decision matched the preferred decision (orange data points).  For the remainder of the issues the trainee’s selected decisions differed.  Determining and understanding why the differences occurred is one way to gain insight into how culture manifests in management actions.
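The comparison behind Chart 1 can be sketched in a few lines of code.  This is a hypothetical illustration only: the option labels and the trainee's picks below are invented, and the real issues and options live in the preferred scoresheet, not here.

```python
# Hypothetical sketch: tally how often a trainee's decision choices match the
# preferred decision set across the twenty issues (labels invented).
preferred = ["B", "A", "C", "A", "D", "B", "C", "A", "B", "C",
             "A", "B", "D", "C", "A", "B", "C", "D", "A", "B"]
actual    = ["B", "C", "C", "A", "B", "B", "A", "A", "C", "C",
             "A", "D", "D", "B", "B", "C", "A", "B", "C", "A"]

matches = [p == a for p, a in zip(preferred, actual)]
match_rate = sum(matches) / len(matches)
# Issues where the trainee diverged are the natural starting point for a debrief.
mismatched_issues = [i + 1 for i, m in enumerate(matches) if not m]

print(f"match rate: {match_rate:.0%}")   # prints "match rate: 40%" for this invented session
print("issues to discuss:", mismatched_issues)
```

The point of the exercise is not the match rate itself but the list of mismatches, each of which invites a discussion of why the trainee weighed the issue differently.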

As we indicated in the April 9 post, each decision is evaluated for its safety significance and uncertainty in accordance with quantified scales.  These serve as key inputs to determining the appropriate balance to be achieved in the decision.  In prior work in this area, reported in our posts dated July 15, 2011 and October 14, 2011, we solicited readers to score two issues for safety significance.  The reported scores ranged from 2 to 10 (most scores between 4 and 6) for one issue and from 5 to 10 (most scores between 6 and 8) for the other.  This reflects the reality that perceptions of safety significance are subject to individual differences.  In the current exercise, similar variations in scoring were expected and led to differences between the trainee's scores and the preferred decision set.  The variation may be due to the inherently subjective nature of assessing these attributes as well as other factors such as experience, expertise, biases, and interpretations of the issue.  So this could be one source of difference between the trainee's decision selections and the preferred set, as the decision process attempts to match action to significance.

Another source could be in the decision options themselves.  The decision choice by a trainee could have focused on what the trainee felt was the “best” (i.e., most efficacious) decision versus an explicit consideration of safety priority commensurate with safety significance.  Additionally, decision choices may have been influenced by their potential impacts, particularly under conditions where performance was not on track to meet goals.

Chart 2
Taking this analysis a bit further, we looked at how decision balance varied over the course of the simulation.  As discussed in our April 9 post, we use decision balance to create a quantitative measure of how well the goal of safety culture is being incorporated in a specific decision - the extent to which the decision accords safety a priority commensurate with its safety significance.  In the present exercise, each decision option for each issue has been assigned a balance value as part of the preferred scoresheet.**  Chart 2 shows a timeline of decision balances - one for the preferred decision set and the other for the actual decisions made by the trainee.  A smoothing function has been applied to the discrete balance values to provide a continuous track.
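The smoothing step can be sketched as follows.  The post does not specify the actual smoothing function used in Chart 2, so a simple centered moving average is assumed here, and the balance scores are invented for illustration.

```python
def moving_average(values, window=3):
    """Centered moving average; the window shrinks near the ends of the series."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        segment = values[lo:hi]
        smoothed.append(sum(segment) / len(segment))
    return smoothed

# One balance score per quarterly decision over the 60-month sim (invented data).
actual_balance = [7, 7, 6, 7, 8, 8, 7, 7, 5, 4, 5, 6, 4, 5, 6, 5, 4, 6, 5, 6]
smoothed = moving_average(actual_balance)
```

Any reasonable smoother would serve the same purpose: turning twenty discrete scores into a track whose trend, such as the weakening balances in the second half of the sim, is easy to see.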

The plots illustrate how decision balance may vary over time, with specific decisions reflecting greater or lesser emphasis on safety.  During the first half of the sim the decision balances are in fairly close agreement, reflecting in part that in 5 of 8 cases the actual decisions matched the preferred decisions.  However, in the second half of the sim significant differences emerge, primarily in the direction of weaker balances associated with the trainee decisions.  Again, understanding why these differences emerge could provide insight into how safety culture is actually being practiced within the organization.  Chart 3 adds further context.

Chart 3
The yellow line is a plot of “goal pressure,” which is simply the sum of the differences between actual performance in the sim and the goals for cost, generation and the CAP program.  Higher values of pressure are associated with performance lagging the goals.  Inspection of the plot indicates that goal pressure was mostly modest in the first half of the sim before an initial spike up and further increases over time.  The blue line, the trainee's decision balance, does not show any response to the initial spike, but later in the sim the high goal pressure could be seen as a possible contributor to decisions trending toward lower balances.  A final note: over the course of the entire sim, the average values of preferred and actual balance are fairly close for this player, perhaps suggesting reasonable overall alignment in safety priorities notwithstanding decision-to-decision variations.
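A minimal sketch of the goal pressure metric described above follows.  The metric names, the relative-shortfall normalization, and the numbers are all assumptions for illustration; the post gives only the general definition (a sum of differences between actual performance and goals).

```python
def goal_pressure(actuals, goals):
    """Sum of relative shortfalls against each goal; meeting or beating a
    goal contributes nothing, so only lagging performance adds pressure."""
    pressure = 0.0
    for key, goal in goals.items():
        shortfall = (goal - actuals[key]) / goal
        pressure += max(0.0, shortfall)
    return pressure

# All three metrics are framed so that higher is better (cost_index = 100
# means on budget); the framing and numbers are invented for illustration.
goals   = {"generation_pct": 92.0, "cap_closure_pct": 85.0, "cost_index": 100.0}
actuals = {"generation_pct": 88.0, "cap_closure_pct": 80.0, "cost_index": 100.0}

print(f"goal pressure = {goal_pressure(actuals, goals):.3f}")  # prints "goal pressure = 0.102"
```

Plotted alongside decision balance, a quantity like this makes it easy to ask whether weak balances cluster in periods when performance is lagging the goals.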

A variety of training benefits can flow from the decision simulation.  Comparisons of actual to preferred decisions provide a baseline indication of how well expected safety balances are being achieved in realistic decisions.  Consideration of contributing factors such as goal pressure may illustrate challenges for decision makers.  Comparisons of results among and across groups of trainees could provide further insights.  In all cases the results would provide material for discussion, team building and alignment on safety culture.

In our post dated November 4, 2011 we quoted the work of Kahneman, that organizations are “factories for producing decisions”.  In nuclear safety, the decision factory is the mechanism to actualize safety culture into specific priorities and actions.  A critical element of achieving strong safety culture is to be able to identify differences between espoused values for safety (i.e., the traits typically associated with safety culture) and de facto values as revealed in actual decisions. We believe this can be achieved by capturing decision data explicitly, including the judgments on significance and uncertainty, and the operational context of the decisions.

The next step is synthesizing the decision and situational parameters to develop a useful systems-based measure of safety culture - a quantity that could be tracked in a simulation environment to illustrate safety culture response and provide feedback, and/or during nuclear operations to provide a real-time pulse of the organization's culture.

* For more information on using system dynamics to model safety culture, please visit our companion website.

** It is possible for some decision options to have the same value of balance even though they incorporate different responses to the issue and different operational impacts.