Monday, March 22, 2010

Safety Culture Dynamics (part 1)

Over the last several years a number of nuclear organizations have encountered safety culture and climate issues at their plants. Often new leadership is brought to the plant in hopes of stimulating the needed changes in culture. Almost always there is increased training, reiteration of safety values, and a safety culture survey to take the organization's temperature. It is difficult to gauge precisely how effective these measures are: surveys are snapshots in time, and direct indicators of safety culture are lacking. In some cases safety culture appears to respond to these changes in the short term but then loses momentum and backslides over the longer term.

How does one explain these types of evolutions in culture? Conventional wisdom has been that culture is leadership driven and that when safety culture is deficient, new management can “turn around” the situation. We have argued that the dynamics of safety culture are more complex and are subject to a confluence of factors that compete for the priorities and decisions of the organization. We use simulation models of safety culture to suggest how these factors can interact and respond to various initiatives. We have put together a simple illustration of the situation at a plant that responds as described above. CLICK ON THIS LINK to see the simulated safety culture dynamic response.

The simulation shows changes in some key variables over time. In this case the time period is 5 years. For approximately the first year the simulation illustrates the status quo prior to the change in leadership. Safety culture was in gradual decline despite nominal attention to actions to reinforce a safety mindset in the organization.

At approximately the one year mark, leadership is changed and actions are taken to significantly increase the safety priority of the organization. This is reflected in a spike in reinforcement that typically includes training, communications and strong management emphasis on the elements of safety culture. Note that following a lag, safety culture starts to improve in response to these changes. As time progresses, the reinforcement curve peaks and starts to decay due to something we refer to as “saturation”. Essentially the new leadership’s message is starting to have less and less impact even though it is being constantly reiterated. For a time safety culture continues to improve but then turns around due to the decreasing effectiveness of reinforcement. Eventually safety culture regresses to a level where many of the same problems start to recur.

Is this a diagnosis of what is happening at any particular site? No, it is merely suggestive of some of the dynamics that are at work in safety culture. In this particular simulation, other actions that may be needed to build a strong, enduring safety culture were deliberately not implemented, in order to isolate the failure of one-dimensional actions to provide long term solutions. One of the indicators of this narrow approach can be seen in the line on the simulation representing the trust level within the organization. It hardly changes or responds to the other dynamics. Why? In our view trust tends to be driven by the overall, big picture of forces at work and the extent to which they consistently demonstrate safety priority. Reinforcement (in our model) reflects primarily a training and messaging action by management. Other more potent forces include whether management “walks the talk”, whether resources are allocated consistent with safety priorities, whether short term needs are allowed to dominate longer term priorities, whether problems are identified and corrected in a manner that prevents recurrence, etc. In this particular simulation example, these other signals are not entirely consistent with the reinforcement messages, with the net result that trust hardly changes.
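For readers who want to experiment with these ideas, here is a minimal sketch (in Python) of the kind of dynamic described above. It is not the NuclearSafetySim model itself; the variable names, rate constants and starting values are illustrative assumptions chosen only to reproduce the general shape of the response: a reinforcement spike at the one-year mark, a saturation effect that erodes its impact, a lagged rise and eventual regression in safety culture, and a trust level that barely moves because the other management signals remain inconsistent.

# A minimal, illustrative sketch of the dynamic described above (not the
# NuclearSafetySim model). All names and rate constants are assumptions.
import matplotlib.pyplot as plt

DT = 0.01                  # time step, in years
STEPS = int(5.0 / DT)      # simulate five years

culture = 0.60             # safety culture index (0 to 1), arbitrary start
trust = 0.50               # organizational trust (0 to 1)
saturation = 0.0           # accumulated exposure to the reinforcement message
history = {"t": [], "reinforcement": [], "culture": [], "trust": []}

for i in range(STEPS):
    t = i * DT
    # Nominal reinforcement before the leadership change; a step increase at
    # the one-year mark represents new training, communications and emphasis.
    reinforcement = 0.3 if t < 1.0 else 1.0
    # Saturation grows as the same message is repeated, eroding its impact.
    saturation += DT * 0.5 * reinforcement
    effective = reinforcement / (1.0 + saturation)
    # Safety culture adjusts toward a level set by effective reinforcement,
    # with a lag (slow adjustment) and a small background decay.
    culture += DT * (0.8 * ((0.5 + 0.4 * effective) - culture) - 0.02)
    # Trust responds only to consistency between words and actions; here
    # that consistency is held near zero, so trust stays essentially flat.
    trust += DT * 0.1 * 0.02
    history["t"].append(t)
    history["reinforcement"].append(effective)
    history["culture"].append(culture)
    history["trust"].append(trust)

for name in ("reinforcement", "culture", "trust"):
    plt.plot(history["t"], history[name], label=name)
plt.xlabel("years")
plt.legend()
plt.show()

Plotting the three curves reproduces the qualitative behavior described above: effective reinforcement peaks shortly after the leadership change and then decays, safety culture improves with a lag and later regresses, and trust remains nearly constant.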

More information regarding safety culture simulation is available at the nuclearsafetysim.com website. Under the Models tab, Model 3 provides a short tutorial on the concept of saturation and its effect on safety culture reinforcement.

Friday, March 19, 2010

Highly Reliable Performance Blog

The Highly Reliable Performance Blog is published by the DOE Office of Corporate Safety Analysis. The blog's focus is on Human Performance Improvement (HPI). Earl Carnes, a former colleague of Bob Cudlin's, is a blog editor.

Dr. Bill Corcoran

From time to time we will mention other safety culture professionals whose work you may find interesting. William R. Corcoran, Ph.D., P.E. has long been active in the safety field; we even shared a common employer many years ago. He publishes "The Firebird Forum," a newsletter focusing on root cause analysis. For more information on Bill and his newsletter, please visit his profile here.

“We have a great safety culture = deep trouble” or what squirrels can teach us...

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this third segment, Dr. Reason discusses impediments to safety culture. He observes that when management announces that we have a great safety culture, it should be taken as a symptom of an organization that is vulnerable. The proper posture, according to Dr. Reason, is the “chronic unease” that he sees embodied in squirrels and other species that perceive constant vulnerability, even when there is no apparent immediate threat. The inverse of chronic unease is, of course, complacency. The “c” word has been invoked more frequently of late by the NRC (see our November 12, 2009 post), which could be viewed as threat enough.

Thursday, March 18, 2010

Honest Errors vs Unacceptable Errors

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this second segment, Dr. Reason discusses how to build a better safety culture based on a “just” culture. He also cites the need to distinguish between honest errors (he estimates 90% of errors fall in this category) and unacceptable errors.

With regard to the importance of a “just culture” you may want to refer back to our post of August 3, 2009, where we highlight a book of that title by Sidney Dekker. In that post we emphasize the need to balance accountability and learning in the investigation of the causes of errors. Both advocates of a just culture, Reason and Dekker, are from European countries, and their work may not be as well known in the U.S. nuclear industry, but it appears to contain valuable lessons for us.

Wednesday, March 17, 2010

Dr. James Reason on Error Management

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance.  In this first segment, Dr. Reason discusses his theory of how errors occur (person based and system based) including the existence of "error traps" within an organizational system.  Error traps are evident when different people make the same error, indicating some defect in the management system, such as something as simple as bad or ambiguous procedures.  

I believe error traps may also exist due to more intangible conditions, such as conflicting priorities or requirements on staff that may create a bias toward compromising safety priorities. The conditions act as a trap or decision "box" where compromising safety is viewed either as "okay" or as the only viable response, and even well intentioned people can be subverted. In contrast, the competing priorities may only appear to be boxes, simply allowing a lax person to compromise safety. In that case the bias toward compromising safety actually originates in people who are predisposed to making the error, making it not a system-based error trap but a personal performance error. How should the errors in reporting at Vermont Yankee be characterized?

Tuesday, March 16, 2010

Safety Culture Briefing of NRC Commissioners March 30, 2010

A briefing of the Nuclear Regulatory Commissioners on safety culture is scheduled for March 30, 2010. It will be webcast at 9:30 am Eastern time.

Additional information is available here and here.