
Thursday, January 9, 2014

Safety Culture Training Labs

This post highlights a paper* Carlo Rusconi presented at the American Nuclear Society meeting last November.  He proposes “training labs” to develop improved safety culture (SC) through team-building exercises, e.g., role play, and table-top simulations.  Team building increases (a) participants' awareness of group dynamics, e.g., feedback loops, and how a group develops shared beliefs, and (b) sensitivity to the viewpoints of others, viewpoints that may differ greatly based on individual experience and expectations.  The simulations pose evolving scenarios that participants must analyze and develop a team approach for addressing.  A key rationale for this type of training is that “team interactions, if properly developed and trained, have the capacity to counter-balance individual errors.” (p. 2155)

Rusconi's recognition of goal conflict in organizations, the weakness of traditional methods (e.g., PRA) for anticipating human reactions to emergent issues, the need to recognize different perspectives on the same problem and the value of simulation in training are all familiar themes here at Safetymatters.

Our Perspective

Rusconi's work also reminds us how seldom new approaches for addressing SC concepts, issues, training and management appear in the nuclear industry.  Per Rusconi, “One of the most common causes of incidents and accidents in the industrial sector is the presence of hidden or clear conflicts in the organization. These conflicts can be horizontal, in departments or in working teams, or vertical, between managers and workers.” (p. 2156)  However, we see scant evidence of the willingness of the nuclear industry to acknowledge and address the influence of goal conflicts.

Rusconi focuses on training to help recognize and overcome conflicts.  This is good, but one needs to clearly identify how training would accomplish this and what its limitations are.  For example, if promotion is impacted by raising safety issues or advocating conservative responses, is training going to be an effective remedy?  The truth is there are some conflicts which are implicit (but very real) and hard to mitigate.  Such conflicts can arise from corporate goals, resource allocation policies and performance-based executive compensation schemes.  Some of these conflicts originate high in the organization and are not really amenable to training per se.

Both Rusconi's approach and our NuclearSafetySim tool attempt to stimulate discussion of conflicts and develop rules for resolving them.  Creating a measurable framework tied to the actual decisions made by the organization is critical to dealing with conflicts.  Part of this is creating measures for how well decisions embody SC, as done in NuclearSafetySim.

Perhaps this means the only real answer for high risk industries is to have agreement on standards for safety decisions.  This doesn't mean some highly regimented PRA-type approach.  It is more of a peer type process incorporating scales for safety significance, decision quality, etc.  This should be the focus of the site safety review committees and third-party review teams.  And the process should look at samples of all decisions not just those that result in a problem and wind up in the corrective action program (CAP).

Nuclear managers would probably be very reluctant to embrace this much transparency.  A benign view is they are simply too comfortable believing that the "right" people will do the "right" thing.  A less charitable view is their lack of interest in recognizing goal conflicts and other systemic issues is a way to effectively deny such issues exist.

Instead of interest in bigger-picture “Why?” questions we see continued introspective efforts to refine existing methods, e.g., cause analysis.  At its best, cause analysis and any resultant interventions can prevent the same problem from recurring.  At its worst, cause analysis looks for a bad component to redesign or a “bad apple” to blame, train, oversee and/or discipline.

We hate to start the new year wearing our cranky pants but Dr. Rusconi, ourselves and a cadre of other SC analysts are all advocating some of the same things.  Where is any industry support, dialogue, or interaction?  Are these ideas not robust?  Are there better alternatives?  It is difficult to understand the lack of engagement on big-picture questions by the industry and the regulator.

*  C. Rusconi, “Training labs: a way for improving Safety Culture,” Transactions of the American Nuclear Society, Vol. 109, Washington, D.C., Nov. 10–14, 2013, pp. 2155-57.  This paper continues Dr. Rusconi's earlier work, which we posted about on June 26, 2013.

Friday, September 27, 2013

Four Years of Safetymatters

Over the four-plus years we have been publishing this blog, regular readers will have noticed some recurring themes in our posts.  The purpose of this post is to summarize our perspective on these key themes.  We have attempted to build a body of work that is useful and insightful for you.

Systems View

We have consistently considered safety culture (SC) in the nuclear industry to be one component of a complicated socio-technical system.  A systems view provides a powerful mental model for analyzing and understanding organizational behavior. 

Our design and explicative efforts began with system dynamics as described by authors such as Peter Senge, focusing on characteristics such as feedback loops and time delays that can affect system behavior and lead to unexpected, non-linear changes in system performance.  Later, we expanded our discussion to incorporate the ways systems adapt and evolve over time in response to internal and external pressures.  Because they evolve, socio-technical organizations are learning organizations but continuous improvement is not guaranteed; in fact, evolution in response to pressure can lead to poorer performance.

The systems view, system dynamics and their application through computer simulation techniques are incorporated in the NuclearSafetySim management training tool.

Decision Making

A critical, defining activity of any organization is decision making.  Decision making determines what will (or will not) be done, by whom, and with what priority and resources.  Decision making is  directed and constrained by factors including laws, regulations, policies, goals, procedures and resource availability.  In addition, decision making is imbued with and reflective of the organization's values, mental models and aspirations, i.e., its culture, including safety culture.

Decision making is intimately related to an organization's financial compensation and incentive program.  We've commented on these programs in nuclear and non-nuclear organizations and identified the performance goals for which executives received the largest rewards; often, these were not safety goals.

Decision making is part of the behavior exhibited by senior managers.  We expect leaders to model desired behavior and are disappointed when they don't.  We have provided examples of good and bad decisions and leader behavior. 

Safety Culture Assessment

We have cited NRC Commissioner Apostolakis' observation that “we really care about what people do and maybe not why they do it . . .”  We sympathize with that view.  If organizations are making correct decisions and getting acceptable performance, the “why” is not immediately important.  However, in the longer run, trying to identify the why is essential, both to preserve organizational effectiveness and to provide a management (and mental) model that can be transported elsewhere in a fleet or industry.

What is not useful, and possibly even a disservice, is a feckless organizational SC “analysis” that focuses on a laundry list of attributes or limits remedial actions to retraining, closer oversight and selective punishment.  Such approaches ignore systemic factors and cannot provide long-term successful solutions.

We have always been skeptical of the value of SC surveys.  Over time, we saw that others shared our view.  Currently, broad-scope, in-depth interviews and focus groups are recognized as preferred ways to attempt to gauge an organization's SC and we generally support such approaches.

On a related topic, we were skeptical of the NRC's SC initiatives, which culminated in the SC Policy Statement.  As we have seen, this “policy” has led to back door de facto regulation of SC.

References and Examples

We've identified a library of references related to SC.  We review the work of leading organizational thinkers, social scientists and management writers, attempt to accurately summarize their work and add value by relating it to our views on SC.  We've reported on the contributions of Dekker, Dörner, Hollnagel, Kahneman, Perin, Perrow, Reason, Schein, Taleb, Vaughan, Weick and others.

We've also posted on the travails of organizations that dug themselves into holes that brought their SC into question.  Some of these were relatively small potatoes, e.g., Vermont Yankee and EdF, but others were actual disasters, e.g., Massey Energy and BP.  We've also covered DOE, especially the Hanford Waste Treatment and Immobilization Plant (aka the Vit plant).


We believe the nuclear industry is generally well-managed by well-intentioned personnel but can be affected by the natural organizational ailments of complacency, normalization of deviation, drift, hubris, incompetence and occasional criminality.  Our perspective has evolved as we have learned more about organizations in general and SC in particular.  Channeling John Maynard Keynes, we adapt our models when we become aware of new facts or better ways of looking at the data.  We hope you continue to follow Safetymatters.  

Tuesday, September 17, 2013

Even Macy’s Does It

We have long been proponents of looking for innovative ways to improve safety management training for nuclear professionals.  We've taken on the task of developing a prototype management simulator, NuclearSafetySim, and made it available to our readers to experience for themselves (see our July 30, 2013 post).  In the past we have also noted other industries and organizations that have embraced simulation as an effective management training tool.

An August article in the Wall Street Journal* cites several examples of new approaches to manager training.  Most notable in our view is Macy’s use of simulations to have managers gain decision making experience.  As the article states:

“The simulation programs aim to teach managers how their daily decisions can affect the business as a whole.”

We won’t revisit all the arguments that we’ve made for taking a systems view of safety management, focusing on decisions as the essence of safety culture and using simulation to allow personnel to actualize safety values and priorities.  All of these could only enrich, challenge and stimulate training activities. 

A Clockwork Magenta

On the other hand, what is the value of training approaches that reiterate INPO slide shows, regulatory policy statements and good practices in seemingly endless iterations?  It brings to mind Alex, the incorrigible sociopath in A Clockwork Orange with an unusual passion for classical music.**  He is the subject of “reclamation treatment”: head clamped in a brace and eyes pinned wide open, he is forced to watch repetitive screenings of anti-social behavior set to the music of Beethoven's Fifth.  We are led to believe this results in a “cure,” but does it, and at what cost?

Nuclear managers may not be treated exactly like Alex but there are some similarities.  After plant problems occur and are diagnosed, managers are also declared “cured” after each forced feeding of traits, values, and the need for increased procedure adherence and oversight.  Results still not satisfactory?  Repeat.

*  R. Feintzeig, "Building Middle-Manager Morale," Wall Street Journal (Aug. 7, 2013).  Retrieved Sept. 24, 2013.

**  M. Amis, "The Shock of the New:‘A Clockwork Orange’ at 50,"  New York Times Sunday Book Review (Aug. 31, 2013).  Retrieved Sept. 24, 2013.

Tuesday, July 30, 2013

Introducing NuclearSafetySim

We have referred to NuclearSafetySim and the use of simulation tools on a regular basis in this blog.  NuclearSafetySim is our initiative to develop a new approach to safety management training for nuclear professionals.  It utilizes a simulator to provide a realistic nuclear operations environment within which players are challenged by emergent issues - where they must make decisions balancing safety implications and other priorities - over a five-year period.  Each player earns an overall score and is provided with analyses and data on his/her decision making and performance against goals.  It is clearly a different approach to safety culture training, one that attempts to operationalize the values and traits espoused by various industry bodies.  In that regard it is exactly what nuclear professionals must do on a day-to-day basis.

At this time we are making NuclearSafetySim available to our readers through a web-based demo version.  To get started you need to access the NuclearSafetySim website.  Click on the Introduction tab at the top of the Home page.  Here you will find a link to a narrated slide show that provides important background on the approach used in the simulation.  It runs about 15 minutes.  Then click on the Simulation tab.  Here you will find another video which is a demo of NuclearSafetySim.  While this runs about 45 minutes (apologies) it does provide a comprehensive tutorial on the sim and how to interact with it.  We urge you to view it.  At the bottom of the Simulation page is a link to the NuclearSafetySim tool.  Clicking on the link brings you directly to the Home screen and you’re ready to play.

As you will see on the website and in the sim itself, there are reminders and links to facilitate providing feedback on NuclearSafetySim and/or requesting additional information.  This is important to us and we hope our readers will take the time to provide thoughtful input, including constructive criticism.  We welcome all comments. 

Wednesday, September 22, 2010

Games Theory

In the September 15, 2010 New York Times there is an interesting article* about the increasing recognition within school environments that game-based learning has great potential.  We cite this article as further food for thought about our initiatives to bring simulation-based games to training for nuclear safety management.

The benefit of using games as learning spaces rests on the insight that games are systems, and systems thinking is really the curriculum, bringing a nuanced and rich way of looking at real-world situations.

“Games are just one form of learning from experience. They give learners well-designed experiences that they cannot have in the real world (like being an electron or solving the crisis in the Middle East). The future for learning games, in my view, is in games that prepare people to solve problems in the world.” **

“A game . . . is really just a ‘designed experience,’ in which a participant is motivated to achieve a goal while operating inside a prescribed system of boundaries and rules.” ***  The analogy in nuclear safety management is to have the game participants manage a nuclear operation - with defined budgets and performance goals - in a manner that achieves certain safety culture attributes even as achievement of those attributes comes into conflict with other business needs.  The game context brings an experiential dimension that is far more participatory and immersive than traditional training environments.  In the NuclearSafetySim simulation, the players’ actions and decisions also feed back into the system, impacting other factors such as organizational trust and the willingness of personnel to identify deviations.  Experiencing the loss of trust in the simulation is likely to be a much more powerful lesson than simply the admonition to “walk the talk” burned into a PowerPoint slide.

* Sara Corbett, "Learning by Playing: Video Games in the Classroom," New York Times (Sep 15, 2010).

** J.P. Gee, "Part I: Answers to Questions About Video Games and Learning," New York Times (Sep 20, 2010).

*** "Learning by Playing," p. 3 of retrieved article.

Friday, June 18, 2010

Assessing Safety Culture

In our June 15th post, we reported on Wahlström and Rollenhagen’s* concern that trying to measure safety culture could do more harm than good. However, the authors go on to assert that safety culture can and should be assessed. They identify different methods that can be used to perform such assessments, including peer reviews and self assessments. They conclude “Ideally safety culture assessments should be carried out as an interaction between an assessment team and a host organization and it should be aimed at the creation of an awareness of potential safety threats . . . .” (§ 7) We certainly agree with that observation.

We are particularly interested in their comments on safety (performance) indicators, another tool for assessing safety culture. We agree that “. . . most indicators are lagging in the sense that they summarize past safety performance” (§ 6.2) and thus may not be indicative of future performance. In an effort to improve performance indicators, the authors suggest “One approach towards leading safety indicators may be to start with a set of necessary conditions from which one can obtain a reasonable model of how safety is constructed. The necessary conditions would then suggest a set of variables that may be assessed as precursors for safety. An assessment could then be obtained using an ordinal scale and several variables could be combined to set an alarm level.” (ibid.)
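The authors' suggestion (ordinal-scale precursor variables combined to set an alarm level) can be illustrated with a toy calculation. The precursor names, weights, and threshold below are our own hypothetical placeholders, not values from the paper:

```python
PRECURSORS = {  # ordinal ratings, 1 (poor) to 5 (good); names are hypothetical
    "management_walkdowns": 4,
    "issue_backlog_trend": 2,
    "training_currency": 3,
    "reporting_rate": 2,
}
WEIGHTS = {k: 1.0 for k in PRECURSORS}  # equal weighting, a starting assumption
ALARM_LEVEL = 2.5                       # composite below this triggers review

def composite_score(ratings, weights):
    """Weighted average of the ordinal precursor ratings."""
    return sum(ratings[k] * weights[k] for k in ratings) / sum(weights.values())

score = composite_score(PRECURSORS, WEIGHTS)
print(f"composite = {score:.2f}, alarm = {score < ALARM_LEVEL}")
# -> composite = 2.75, alarm = False
```

In practice the precursor set would come from the "necessary conditions" model the authors describe, and the weights would need calibration against operating experience rather than being set equal by default.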

We believe the performance indicator problem should be approached somewhat differently. Safety culture, safety management and safety performance do not exist in a vacuum. We advocate using the principles of system dynamics to construct an organizational performance model that shows safety as both input to and output from other, sometimes competing organizational goals, resource constraints and management actions. This is a more robust approach because it can not only show that safety culture is getting stronger or slipping, but why, i.e., what other organizational factors are causing safety culture change to occur. If the culture is slipping, then analysis of system information can suggest where the most cost-effective interventions can be made. For more information on using system dynamics to model safety culture, please visit our companion website.

* B. Wahlström and C. Rollenhagen, “Assessments of safety culture – to measure or not?” paper presented at the 14th European Congress of Work and Organizational Psychology, May 13-16, 2009, Santiago de Compostela, Spain. The authors are also connected with the LearnSafe project, which we have discussed in earlier posts (click the LearnSafe label to see them).

Monday, March 22, 2010

Safety Culture Dynamics (part 1)

Over the last several years there have been a number of nuclear organizations that have encountered safety culture and climate issues at their plants. Often new leadership is brought to the plant in hopes of stimulating the needed changes in culture. Almost always there is increased training and reiteration of safety values and a safety culture survey to gain a sense of the organizational temperature. It is a little difficult to gauge precisely how effective these measures are - surveys are snapshots in time and direct indicators of safety culture are lacking. In some cases, safety culture appears to respond in the short term to these changes but then loses momentum and backslides further out in time.

How does one explain these types of evolutions in culture? Conventional wisdom has been that culture is leadership driven and when safety culture is deficient, new management can “turn around” the situation. We have argued that the dynamics of safety culture are more complex and are subject to a confluence of factors that compete for the priorities and decisions of the organization. We use simulation models of safety culture to suggest how these various factors can interact and respond to various initiatives. We made an attempt at a simple illustration of the situation at a plant that responds as described above. CLICK ON THIS LINK to see the simulated safety culture dynamic response.

The simulation shows changes in some key variables over time. In this case the time period is 5 years. For approximately the first year the simulation illustrates the status quo prior to the change in leadership. Safety culture was in gradual decline despite nominal attention to actions to reinforce a safety mindset in the organization.

At approximately the one year mark, leadership is changed and actions are taken to significantly increase the safety priority of the organization. This is reflected in a spike in reinforcement that typically includes training, communications and strong management emphasis on the elements of safety culture. Note that following a lag, safety culture starts to improve in response to these changes. As time progresses, the reinforcement curve peaks and starts to decay due to something we refer to as “saturation”. Essentially the new leadership’s message is starting to have less and less impact even though it is being constantly reiterated. For a time safety culture continues to improve but then turns around due to the decreasing effectiveness of reinforcement. Eventually safety culture regresses to a level where many of the same problems start to recur.

Is this a diagnosis of what is happening at any particular site? No, it is merely suggestive of some of the dynamics that are at work in safety culture. In this particular simulation, other actions that may be needed to build a strong, enduring safety culture were not implemented, in order to isolate the failure of one-dimensional actions to provide long-term solutions. One of the indicators of this narrow approach can be seen in the line on the simulation representing the trust level within the organization. It hardly changes or responds to the other dynamics. Why? In our view trust tends to be driven by the overall, big picture of forces at work and the extent to which they consistently demonstrate safety priority. Reinforcement (in our model) reflects primarily a training and messaging action by management. Other more potent forces include whether management “walks the talk”, whether resources are allocated consistent with safety priorities, whether short-term needs are allowed to dominate longer-term priorities, whether problems are identified and corrected in a manner to prevent recurrence, etc. In this particular simulation example, these other signals are not entirely consistent with the reinforcement messages, with the net result that trust hardly changes.
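The reinforcement-saturation loop described above can be sketched as a toy stock-and-flow model. This is not the actual NuclearSafetySim model (which also carries trust, resources and other stocks); every coefficient below is an assumption chosen only to reproduce the qualitative shape: gradual decline, improvement after the year-one leadership change, then regression as saturation erodes the message's effect.

```python
import numpy as np

dt, years = 0.01, 5.0
t = np.arange(0.0, years, dt)

culture = np.empty_like(t)     # stock, scaled 0..1
culture[0] = 0.60
saturation = np.empty_like(t)  # stock: accumulated exposure to the message
saturation[0] = 0.0

for i in range(1, len(t)):
    # leadership change at year 1: a strong, sustained reinforcement message
    reinforcement = 1.0 if t[i - 1] >= 1.0 else 0.2
    # saturation builds with exposure and erodes only slowly
    saturation[i] = saturation[i - 1] + (0.8 * reinforcement - 0.1 * saturation[i - 1]) * dt
    # effective reinforcement is discounted as the audience saturates
    effective = reinforcement * max(0.0, 1.0 - saturation[i - 1])
    # culture gains from effective reinforcement, against a constant decay pressure
    culture[i] = min(1.0, max(0.0, culture[i - 1] + (0.5 * effective - 0.12) * dt))

peak_year = t[culture.argmax()]
print(f"start {culture[0]:.2f}, peak {culture.max():.2f} near year {peak_year:.1f}, "
      f"end {culture[-1]:.2f}")
```

Running the sketch shows culture dipping slightly, rising after the leadership change, peaking around year two as saturation overtakes the message, then ending below its starting level, the same shape as the simulation discussed above.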

More information regarding safety culture simulation is available at the website. Under the Models tab, Model 3 provides a short tutorial on the concept of saturation and its effect on safety culture reinforcement.

Thursday, September 10, 2009

Schrodinger’s Bat

This post follows on the issue of whether safety culture is a concept unto itself or a state that is defined by many constituent actions.  Some of our own thinking about safety culture in developing the nuclearsafetysim website and tools led us to prefer a focus on safety management as opposed to safety culture.  Safety management includes the key “levers” of organizational performance (e.g., resource allocation, problem identification and resolution, building of trust, etc.) and the integrated effect of manipulating these levers results in a safety culture “value” in the simulation.  Thus all the dynamics flow from actions and decisions to a safety culture resultant, not the reverse.

Dare I put forth a sports analogy?  In baseball there is a defined “strike zone.”  In theory the umpire uses the strike zone to make calls of balls and strikes.  But the zone is really open to interpretation in the dynamic, three-dimensional world of pitching and umpiring.  The reality is that the strike zone becomes the space delineated by the aggregate set of ball and strike calls made by an umpire.  It relies on the skill of the umpire, his understanding of the strike zone and his commitment to making accurate calls.  The linked article provides some interesting data on the strike zone and the psychology of umpires' decisions.

Link to "Schrodinger’s Bat" July 26, 2007

Thursday, August 13, 2009

Primer on System Dynamics

System Dynamics is a concept for seeing the world in terms of inputs and outputs, where internal feedback loops and time delays can affect system behavior and lead to complex, non-linear changes in system performance.

The System Dynamics worldview was originally developed by Prof. Jay Forrester at MIT. Later work by other thinkers, e.g., Peter Senge, author of The Fifth Discipline, expanded the original concepts and made them available to a broader audience. An overview of System Dynamics can be found on Wikipedia.
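A minimal numerical sketch of these core ideas: a stock governed by a negative feedback loop, where adding a perception time delay turns smooth convergence into overshoot and oscillation. The parameters are illustrative, not drawn from any particular model.

```python
import numpy as np

def simulate(delay_steps, steps=400, dt=0.05):
    """A stock adjusted toward a target by negative feedback acting on a
    (possibly delayed) perception of the stock's current level."""
    target, gain = 100.0, 0.6
    stock = np.empty(steps)
    stock[0] = 20.0
    for i in range(1, steps):
        perceived = stock[max(0, i - 1 - delay_steps)]  # perception delay
        stock[i] = stock[i - 1] + gain * (target - perceived) * dt
    return stock

smooth = simulate(delay_steps=0)    # converges monotonically to the target
wobbly = simulate(delay_steps=40)   # a 2-time-unit delay causes overshoot
print(f"no delay: ends at {smooth[-1]:.1f}; with delay: peaks at {wobbly.max():.1f}")
```

The only difference between the two runs is the delay, yet the delayed system overshoots its target and oscillates. This is the kind of non-linear, counter-intuitive behavior that makes a systems view valuable for thinking about organizational performance.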

Our NuclearSafetySim program uses System Dynamics to model managerial behavior in an environment where maintaining the nuclear safety culture is a critical element. NuclearSafetySim is built using isee Systems iThink software. isee Systems has educational materials available on their website that explain some basic concepts.

There are other vendors in the System Dynamics software space, including Ventana Systems and their Vensim program. They also provide some reference materials, available here.

Thursday, August 6, 2009

Signs of a Reactive Organization (MIT #6)

One of the most important insights to be gained from a systems perspective of safety management concerns the effectiveness of various responses to changes in system conditions.  Recall that in our post #3 on the MIT paper, we talked about single- versus double-loop learning.  Single-loop responses are short-term, local responses to perceived problems, while double-loop learning refers to gaining an understanding of the underlying reasons for the problems and finding long-term solutions.  As you might guess, single-loop responses tend to be reactive.  “An oscillating incident rate is the hallmark of a reactive organization, where successive crises lead to short term fixes that persist only until the next crisis.” [pg 22]  We can use our NuclearSafetySim model to illustrate differing approaches to managing problems.

The figure below illustrates how the number of problems/issues (we use the generic term "challenges" in NuclearSafetySim) might vary with time when the response is reactive.  The blue line indicates the total number of issues, the pink line the number of new issues being identified and the green line the resolution rate for issues, e.g., through a corrective action program.  Note that the blue line initially increases and then oscillates while the pink line is relatively constant.  The oscillation derives from the management response, reflected in the green line: there is an initial delay in responding to an increased number of issues; resolution rates are then greatly increased to address higher backlogs, then reduced (due to budgetary pressures and other priorities) when backlogs start to fall, precipitating another cycle of increasing issues.

Compare the oscillatory response above to the next figure where an increase in issues results immediately in higher resolution rates that are maintained over a period sufficient to return the system to a lower level of backlogs.  In parallel, budgets are increased to address the underlying causes of issues, driving down the occurrence rate of new issues and ultimately bringing backlog down to a long-term sustainable level.
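The contrast between the two response patterns can be sketched in a few lines of Python. The thresholds, rates, and six-month response lag below are illustrative assumptions, not NuclearSafetySim parameters:

```python
import numpy as np

def run(reactive, months=120):
    """Issue backlog under a reactive vs. a sustained management response."""
    backlog = np.empty(months)
    backlog[0] = 50.0
    new_rate = 10.0  # new issues identified per month
    for i in range(1, months):
        if reactive:
            # crisis-driven: resolution effort responds, after a 6-month
            # reporting/approval lag, only when the backlog looks high, and
            # is cut back (budget pressure) as soon as it looks low again
            perceived = backlog[max(0, i - 1 - 6)]
            resolve = 16.0 if perceived > 80.0 else 6.0
        else:
            # sustained resolution effort, plus investment in underlying
            # causes that gradually lowers the new-issue rate
            resolve = 14.0
            new_rate = max(4.0, new_rate - 0.1)
        backlog[i] = max(0.0, backlog[i - 1] + new_rate - resolve)
    return backlog

reactive, managed = run(True), run(False)
print(f"reactive backlog swings between {reactive.min():.0f} and {reactive.max():.0f}; "
      f"managed backlog ends at {managed[-1]:.0f}")
```

The reactive run reproduces the sawtooth oscillation of the first figure, driven entirely by the response lag and the premature cutback; the managed run brings the backlog down steadily and keeps it down because the new-issue rate itself is reduced.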

The last figure shows some of the ramifications of system management on safety culture and employee trust.  The significant increase in issues backlog initially leads to a degradation of employee trust (the pink line) and an erosion in safety culture (blue line).  However the nature and effectiveness of the management response in bringing down backlogs and reducing new issues reverses the trust trend line and rebuilds safety culture over time.  Note the red line, representing plant performance, is relatively unchanged over the same period indicating that performance issues may exist under the cover of a consistently operating plant.

Friday, July 24, 2009

Safety Culture Insights from Simulation (MIT #1)

Starting with this post we are reviewing an interesting paper from the Sloan School of Management at MIT - “Preventing Accidents and Building A Culture of Safety: Insights from a Simulation Model.”  While the paper approaches organizational safety performance on a generic basis - it is not specific to nuclear facilities - it offers many useful insights that are highly applicable to nuclear organizations.  MIT as an institution is known for its work in system dynamics and simulation modeling.  These disciplines have been used to analyze a variety of safety and accident environments, including NASA and nuclear operations.  Also, as you may have noticed, our development of nuclear safety management simulation models is based on system dynamics.

Future posts will highlight several of the key insights from this paper and their applicability to issues of nuclear safety management.

Link to paper.