Friday, March 19, 2010

“We have a great safety culture = deep trouble” or what squirrels can teach us...

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this third segment, Dr. Reason discusses impediments to safety culture. He observes that when management announces “we have a great safety culture,” the announcement should be taken as a symptom of a vulnerable organization. The proper posture, according to Dr. Reason, is the “chronic unease” he sees embodied in squirrels and other species that assume constant vulnerability, even when there is no apparent immediate threat. The inverse of chronic unease is, of course, complacency. The “c” word has been invoked more frequently of late by the NRC (see our November 12, 2009 post), which could be viewed as threat enough.

Thursday, March 18, 2010

Honest Errors vs Unacceptable Errors

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this second segment, Dr. Reason discusses how to build a better safety culture based on a “just” culture. He also cites the need to distinguish between honest errors (he estimates 90% of errors fall in this category) and unacceptable errors.

With regard to the importance of a “just culture,” you may want to refer back to our post of August 3, 2009, where we highlight a book of that title by Sidney Dekker. In that post we emphasize the need to balance accountability and learning in the investigation of the causes of errors. Both advocates of a just culture, Reason and Dekker, are from Europe; their work may not be as well known in the U.S. nuclear industry, but it appears to contain valuable lessons for us.

Wednesday, March 17, 2010

Dr. James Reason on Error Management

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance.  In this first segment, Dr. Reason discusses his theory of how errors occur (person-based and system-based), including the existence of "error traps" within an organizational system.  Error traps are evident when different people make the same error, indicating some defect in the management system, perhaps something as simple as bad or ambiguous procedures.

I believe error traps may also arise from more intangible conditions, such as conflicting priorities or requirements on staff that create a bias toward compromising safety priorities.  These conditions act as a trap or decision "box" in which compromising safety is viewed either as "okay" or as the only viable response, and even well-intentioned people can be subverted.  In contrast, the competing priorities may merely appear to be boxes, giving a lax person license to compromise safety.  In that case the bias toward compromising safety originates in people predisposed to the error, which would make it a personal performance error rather than a system-based error trap.  How should the errors in reporting at Vermont Yankee be characterized?
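Dr. Reason's diagnostic, that the same error recurring across different people signals a system defect rather than an individual failing, lends itself to a simple screen of error report data. The following is a minimal sketch in Python; the field names, people, and threshold are hypothetical illustrations, not drawn from any actual reporting system.

```python
from collections import defaultdict

def find_error_traps(reports, min_people=2):
    """Group error reports by error type and flag any type committed
    by at least `min_people` different individuals -- recurrence
    across people points at the system, not the person."""
    people_by_error = defaultdict(set)
    for report in reports:
        people_by_error[report["error_type"]].add(report["person"])
    return {error: sorted(people)
            for error, people in people_by_error.items()
            if len(people) >= min_people}

# Hypothetical reports: three people trip on the same ambiguous
# procedure step; one person makes a one-off error.
reports = [
    {"error_type": "misread step 4 of lineup procedure", "person": "Avery"},
    {"error_type": "misread step 4 of lineup procedure", "person": "Blake"},
    {"error_type": "misread step 4 of lineup procedure", "person": "Casey"},
    {"error_type": "omitted log entry", "person": "Avery"},
]
print(find_error_traps(reports))
# {'misread step 4 of lineup procedure': ['Avery', 'Blake', 'Casey']}
```

Of course, such a screen can only flag candidates; distinguishing a genuine system trap from a cluster of personal errors still requires the kind of judgment discussed above.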

Tuesday, March 16, 2010

Safety Culture Briefing of NRC Commissioners March 30, 2010

A briefing of the Nuclear Regulatory Commissioners on safety culture is scheduled for March 30, 2010.  It will be webcast at 9:30 am Eastern time.

Additional information is available here and here.

Monday, March 15, 2010

Vermont Yankee (part 2) - What Would Reason Say?

The "Reason" in the title refers to Dr. James Reason, Professor Emeritus, Department of Psychology, University of Manchester.

“It is clear from in-depth accident analyses that some of the most powerful pushes towards local traps [characteristics of the workplace that lead people to compromise safety priorities] come from an unsatisfactory resolution of the inevitable conflict that exists (at least in the short-term) between the goals of safety and production. The cultural accommodation between the pursuit of these goals must achieve a delicate balance. On the one hand, we have to face the fact that no organization is just in the business of being safe.  Every company must obey both the 'ALARP' principle (keep the risks as low as reasonably practicable) and the 'ASSIB' principle (and still stay in business). On the other hand, it is now increasingly clear that few organizations can survive a catastrophic organizational accident (Reason 1997).”

"Achieving a Safe Culture: Theory and Practice." (1998), p. 301.
 

Dr. Reason has been a leading and influential thinker in the area of safety and risk management in the workplace and the creation of safety culture in high risk industries.  Get to know Dr. Reason through his own words in future blog posts featuring some of his key insights.

Friday, March 12, 2010

More Drips on the BP Front

In an article dated March 9, 2010 (“BP Faces Fine Over Safety at Ohio Refinery”), the Wall Street Journal reports on more heavy fines levied against oil giant BP for safety issues at its refineries. OSHA has fined BP $3 million for violations at its refinery in Toledo, Ohio. This follows record monetary penalties for its Texas refinery last year.

What is significant about this particular enforcement action? Principally the context of the penalties - the Obama administration is taking a tougher regulatory line, and its impact may extend more broadly, say to the nuclear industry. The White House clearly "wants some of the regulatory bodies to be stronger than they have been in the past," according to the article. It is hard to predict what this portends for nuclear operations, but in an environment where safety lapses are piling up (Toyota et al.), the NRC may feel impelled to take aggressive actions. The initial steps being taken with regard to Vermont Yankee would be consistent with such a posture.


The other noteworthy item was the observation that BP’s refining business “is already under pressure from plummeting profit margins and weak demand for petroleum products...” Sometimes the presence of significant economic pressures is the elephant in the room that is not talked about explicitly. Companies assert that safety is the highest priority, yet safety problems occur that fundamentally challenge that assertion. Why? Are business pressures trumping the safety priority? Do we need to be more open about the reality of competing priorities that a business must address at the same time it meets high safety standards? Stay tuned.

Wednesday, March 10, 2010

"Normalization of a Deviation"

These are the words of John Carlin, Vice President at the Ginna Nuclear Plant, referring to a situation in the past where chronic water leakages from the reactor refueling pit were tolerated by the plant’s former owners. 

The quote is from a piece by Energy & Environment Publishing’s Peter Behr titled “Aging Reactors Put Nuclear Power Plant ‘Safety Culture’ in the Spotlight,” which appeared in its ClimateWire online publication and was also published in The New York Times.  The focus is on a series of incidents with safety culture implications that have occurred at the Nine Mile Point and Ginna plants, now owned and operated by Constellation Energy.

The recitation of events and the responses of managers and regulators are very familiar.  The drip, drip, drip is not the sound of water leaking but the uninspired give and take of the safety culture dialogue that occurs each time there is an incident or series of incidents that suggest safety culture is not working as it should.

Managers admit they need to adopt a questioning attitude, improve the rigor of their decision making, and ensure they have the right “mindset”; corporate promises “a campaign to make sure its employees across the company buy into the need for an exacting attention to safety.”  Regulators remind the licensee, "The nuclear industry remains ... just one incident away from retrenchment..." but must be wondering why these events are occurring when NRC performance indicators for the plants and INPO rankings do not indicate problems.  Pledges to improve safety culture are put forth earnestly and (I believe) in good faith.

The drip, drip, drip of safety culture failures may not be cause for outright alarm or questioning of the fundamental safety of nuclear operations, but it does highlight what seems to be a condition of safety culture stasis - a standoff of sorts where significant progress has been made but problems continue to arise and the same palliatives are applied, and where, perhaps more significantly, the evolution of thinking about safety culture has plateaued.  Peaking too early is a problem in politics and sports, and so it appears in nuclear safety culture.

This is why the remark by John Carlin was so refreshing.  For those not familiar with the context of his words, “normalization of deviance” is a concept developed by Diane Vaughan in her exceptional study of the space shuttle Challenger accident.  Readers of this blog will recall that we are fans of her book, The Challenger Launch Decision, where the mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  Scariest of all, an organization's standards can decay and no one even notices.  How this occurs and what can be done about it are concepts that should be central to current considerations of safety culture.

For further thoughts from our blog on this subject, refer to our posts dated October 6, 2009 and November 12, 2009.  In the latter, we discuss the nature of complacency and its insidious impact on the very process that is designed to avoid it in the first place.