Wednesday, March 24, 2010

Vermont Yankee (part 3)

There was an interesting article in the March 22, 2010 Hartford Courant regarding Paul Blanch, the former Northeast Utilities engineer who was at the center of the safety issues at Millstone in the 1990s. Specifically, he was in the news due to his recent testimony against the extension of the operating license for Vermont Yankee. But what caught my eye were some of his broader observations regarding safety and the nuclear industry. Regarding the industry, Blanch said, "Safety is not their No. 1 concern. Making money is their No. 1 concern." He went on to say he has no faith in the NRC or in the utilities' commitment to safety.

These comments deserve attention not because one may agree or disagree with them, but because they represent a perception of the industry, and of the NRC for that matter, that can and does gain traction. One problem is that every organization says safety is its highest priority, but then certain events suggest otherwise. As an example, let's look at another company and industry recently in the news:

From the BP website:

Safe and reliable operations are BP’s number one priority....

This is from a company that was recently fined over $3 million by OSHA for safety violations at its Ohio refinery (see our March 12, 2010 post) and had previously been fined almost $90 million for the explosion at its Texas refinery.

Supporting this commitment is the following description of safety management at BP:

“...members of the executive team undertook site visits, in which safety was a focus, to reinforce the importance of their commitment to safe and reliable operations. The executives also regularly included safety and operations issues in video broadcasts and communications to employees, townhall meetings and messages to senior leaders.”

It is hardly unreasonable that someone could perceive that BP’s highest priority was not safety. Unfortunately, almost identical words can be found in the statements and pronouncements of many nuclear utilities. (By the way, BP management’s narrow emphasis on “reinforcement” might be considered in the context of our post dated March 22, 2010 on Safety Culture Dynamics.)

As Dr. Reason has noted so simply, no organization is just in the business of being safe. What might be far more beneficial is a franker acknowledgment of the tension between safety and production (and cost and schedule) and of how nuclear organizations address it. That awareness would be a more credible posture for public perception, for regulators and for the organization itself. It would also highlight an insight many in the nuclear industry share: that safety and reliable production are tightly coupled and, over the long term, must coexist. The irony is that, as I recall, 20 years ago Entergy was the leader in publicizing (and achieving) its goals of being upper quartile in safety, production and cost.

Monday, March 22, 2010

Safety Culture Dynamics (part 1)

Over the last several years a number of nuclear organizations have encountered safety culture and climate issues at their plants. Often new leadership is brought to the plant in hopes of stimulating the needed changes in culture. Almost always there is increased training, reiteration of safety values and a safety culture survey to take the organization's temperature. It is difficult to gauge precisely how effective these measures are: surveys are snapshots in time, and direct indicators of safety culture are lacking. In some cases safety culture appears to respond to these changes in the short term but then loses momentum and backslides further out in time.

How does one explain these evolutions in culture? Conventional wisdom holds that culture is leadership driven and that, when safety culture is deficient, new management can “turn around” the situation. We have argued that the dynamics of safety culture are more complex and are subject to a confluence of factors that compete for the priorities and decisions of the organization. We use simulation models of safety culture to suggest how these various factors can interact and respond to various initiatives. We have attempted a simple illustration of the situation at a plant that responds as described above. CLICK ON THIS LINK to see the simulated safety culture dynamic response.

The simulation shows changes in some key variables over time; in this case the time period is five years. For approximately the first year the simulation depicts the status quo prior to the change in leadership: safety culture in gradual decline despite nominal attention to reinforcing a safety mindset in the organization.

At approximately the one year mark, leadership changes and actions are taken to significantly increase the safety priority of the organization. This is reflected in a spike in reinforcement, which typically includes training, communications and strong management emphasis on the elements of safety culture. Note that, following a lag, safety culture starts to improve in response to these changes. As time progresses, the reinforcement curve peaks and starts to decay due to something we refer to as “saturation”: essentially, the new leadership’s message has less and less impact even though it is constantly reiterated. For a time safety culture continues to improve, but then it turns around due to the decreasing effectiveness of reinforcement. Eventually safety culture regresses to a level where many of the same problems start to recur.
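
For readers who want a feel for this dynamic, the toy model below reproduces its qualitative shape. It is only a sketch, not the actual NuclearSafetySim model: the variable names, coefficients and the simple divide-by-saturation form are our illustrative assumptions.

```python
# Toy system-dynamics sketch of reinforcement saturation. All names,
# coefficients and functional forms are illustrative assumptions, not
# the NuclearSafetySim model.

DT = 0.01         # time step, in years
YEARS = 5.0

culture = 0.7     # safety culture level (0..1), declining at the start
saturation = 0.0  # accumulated exposure to the leadership message

for step in range(round(YEARS / DT)):
    t = step * DT

    # Management reinforcement effort: nominal before the leadership
    # change at the one year mark, then a strong, sustained push.
    effort = 0.2 if t < 1.0 else 1.0

    # Saturation grows with cumulative exposure to the same message,
    # so the message's effect erodes even while effort stays high.
    saturation += DT * 0.5 * effort
    reinforcement = effort / (1.0 + saturation)

    # Safety culture responds with a lag: it is pulled up by effective
    # reinforcement and decays toward a baseline of 0.3 without it.
    culture += DT * (0.6 * reinforcement - 0.8 * (culture - 0.3))

    if step % 50 == 0:  # report twice a year
        print(f"t = {t:3.1f} yr  reinforcement = {reinforcement:.2f}  culture = {culture:.2f}")
```

Run it and you will see reinforcement spike at the one year mark and then decay, while culture lags behind in both directions, peaking and then regressing, just as described above.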

Is this a diagnosis of what is happening at any particular site? No, it is merely suggestive of some of the dynamics at work in safety culture. In this particular simulation, other actions that may be needed to build a strong, enduring safety culture were deliberately omitted, in order to isolate the failure of one-dimensional actions to provide long term solutions. One indicator of this narrow approach can be seen in the line on the simulation representing the trust level within the organization: it hardly changes or responds to the other dynamics. Why? In our view trust tends to be driven by the overall, big picture of forces at work and the extent to which they consistently demonstrate safety priority. Reinforcement (in our model) reflects primarily training and messaging actions by management. Other more potent forces include whether management “walks the talk”, whether resources are allocated consistent with safety priorities, whether short term needs are allowed to dominate longer term priorities, whether problems are identified and corrected in a manner that prevents recurrence, etc. In this particular simulation example, these other signals are not entirely consistent with the reinforcement messages, with the net result that trust hardly changes. A small sketch of this “weakest signal” view of trust follows below.
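
For illustration, here is one hypothetical way to capture that view of trust in code. The signal names, values and the min() rule are our assumptions, not part of the actual model.

```python
# Hypothetical sketch: trust responds to the weakest of several
# management signals, not to messaging alone. Names and values are
# illustrative assumptions.

signals = {
    "messaging": 0.9,            # training and communications (strong)
    "walks_the_talk": 0.4,       # visible management behavior
    "resource_allocation": 0.4,  # budgets consistent with stated priority
    "corrective_action": 0.5,    # problems fixed so they do not recur
}

# Trust moves slowly toward the weakest signal: a strong message cannot
# offset weak behavior elsewhere, so trust barely responds.
trust = 0.5
weakest = min(signals.values())
for year in range(1, 6):
    trust += 0.3 * (weakest - trust)
    print(f"year {year}: trust = {trust:.2f}")
```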

More information regarding safety culture simulation is available at the nuclearsafetysim.com website. Under the Models tab, Model 3 provides a short tutorial on the concept of saturation and its effect on safety culture reinforcement.

Friday, March 19, 2010

Highly Reliable Performance Blog

The Highly Reliable Performance Blog is published by the DOE Office of Corporate Safety Analysis. The blog's focus is on Human Performance Improvement (HPI). Earl Carnes, a former colleague of Bob Cudlin's, is a blog editor.

Dr. Bill Corcoran

From time to time we will mention other safety culture professionals whose work you may find interesting. William R. Corcoran, Ph.D., P.E. has long been active in the safety field; we even shared a common employer many years ago. He publishes "The Firebird Forum," a newsletter focusing on root cause analysis. For more information on Bill and his newsletter, please visit his profile here.

“We have a great safety culture = deep trouble” or what squirrels can teach us...

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this third segment, Dr. Reason discusses impediments to safety culture. He observes that when management announces “we have a great safety culture,” it should be taken as a symptom of an organization that is vulnerable. The proper posture, according to Dr. Reason, is the “chronic unease” he sees embodied in squirrels and other species that sense constant vulnerability even when there is no apparent immediate threat. The inverse of chronic unease is, of course, complacency. The “c” word has been invoked more frequently of late by the NRC (see our November 12, 2009 post), which could be viewed as threat enough.

Thursday, March 18, 2010

Honest Errors vs Unacceptable Errors

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this second segment, Dr. Reason discusses how to build a better safety culture based on a “just” culture. He also cites the need to distinguish between honest errors (he estimates 90% of errors fall in this category) and unacceptable errors.

With regard to the importance of a “just culture,” you may want to refer back to our post of August 3, 2009, where we highlight a book of that title by Sidney Dekker. In that post we emphasize the need to balance accountability and learning in the investigation of the causes of errors. Reason and Dekker, both advocates of a just culture, are from European countries, and their work may not be as well known in the U.S. nuclear industry, but it appears to contain valuable lessons for us.

Wednesday, March 17, 2010

Dr. James Reason on Error Management

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this first segment, Dr. Reason discusses his theory of how errors occur (person based and system based), including the existence of "error traps" within an organizational system. Error traps are evident when different people make the same error, indicating some defect in the management system, sometimes something as simple as bad or ambiguous procedures.

I believe error traps may also exist due to more intangible conditions, such as conflicting priorities or requirements placed on staff, that create a bias toward compromising safety priorities. These conditions act as a trap or decision "box" in which compromising safety is viewed either as "okay" or as the only viable response, and even well intentioned people can be subverted. In contrast, the competing priorities may only appear to be boxes, merely giving a lax person an opening to compromise safety. In that case the bias toward compromising safety actually originates in people predisposed to making the error, making it not a system-based error trap but a personal performance error. How should the errors in reporting at Vermont Yankee be characterized?