Wednesday, March 31, 2010

Can Blogging Be Good for Safety Culture?

I came across a very interesting idea in some of my recent web browsing - an idea that I like for several reasons. First, it centers on using a blog, or blogging generally, to enhance safety culture in large, high risk organizations. It is hard for someone writing a safety culture blog not to find the idea intriguing. Second, the idea emanated from an engineer at NASA, Dale Huls, at the Johnson Space Center in Houston. NASA's safety culture has been directly and significantly challenged multiple times, including the Challenger and Columbia shuttle accidents. Third, the idea was presented in a paper written five years ago, when blogging was still a shadow of what it has become - now, of course, it occupies its own world, the blogosphere.

The thesis of Dale’s paper is “...to explore an innovative approach to culture change at NASA that goes beyond reorganizations, management training, and a renewed emphasis on safety.” (p.1) Whatever you may conclude about blogging as an approach, I do think it is time to look beyond the standard recipe of “fixes” that Dale enumerates and which the nuclear industry also follows almost as black letter law.

One of the benefits that Dale sees is that “Blogs could be a key component to overcoming NASA’s ‘silent safety culture.’ As a communications tool, blogs are used to establish trust...” (p. 1) “...and to create and promote a workplace climate in which dissent can be constructively addressed and resolved.” (p. 2) It seems to me that almost any mechanism that promotes safety dialogue could be beneficial. Blogs encourage participation and communication. Even if many visitors to the blog only read posts and do not post themselves, they are part of the discussion. To the extent that managers and even senior executives participate, it can provide a direct, unfiltered path to connect with people in the organization. All of this promotes trust, understanding, and openness. While these are things that management can preach, bringing about the reality can be more difficult.

“Note that blogs are not expected to replace formal lines of communication, but rather enhance those communication lines with an informal process that encourages participation without peer pressure or fear of retribution.” (p.2) In any event, much of a useful safety dialogue is broader than raising a particular safety issue or concern.

So you might be wondering, did NASA implement blogging to facilitate its safety culture change? Dale wrote me in an email, “While NASA did not formally take up a specific use of blogging for safety matters, it seems that NASA is beginning to embrace the blogging culture. Several prominent NASA members utilize blogging to address NASA culture, e.g., Wayne Hale, manager of the Space Shuttle Program.”

What should the nuclear industry take away from this? It might start with a question or two. Are there informal communication media such as blogs active at individual nuclear plants and how are they viewed by employees? Are they supported in any way by management, or the organization, or the industry? Are there any nuclear industry blogs that fulfill a comparable role? There is the Nuclear Safety Culture Group on LinkedIn that has seen sporadic discussion and commenting on a few issues. It currently has 257 members. This would be a good topic for some input from those who know of other forums.

Monday, March 29, 2010

Well Done by NRC Staffer

To support the discussion items on this blog we spend time ferreting out interesting pieces of information that bear on the issue of nuclear safety culture and promote further thought within the nuclear community. This week brought us to the NRC website and its Key Topics area.

As probably most of you are aware, the NRC hosted a workshop in February of this year for further discussions of safety culture definitions. In general we believe that the amount of time and attention currently being given to definitional issues has reached the point of diminishing returns. When we examine safety culture performance issues that arise around the industry, it is not apparent that confusion over the definition of safety culture is a serious causal issue, i.e., that someone was thinking of the INPO definition of safety culture instead of the INSAG one or the Schein perspective. Perhaps it is a necessary step in the process, but to us the interesting, and paramount, questions are: what causes disconnects between safety beliefs and actions taken, and what can be done about them?


Thus, it was heartening and refreshing to see a presentation that addressed the key issue of culture and actions head-on. Most definitions of safety culture are heavy on descriptions of commitment, values, beliefs and attributes and light on the actual behaviors and decisions people make every day. However, the definition that caught our attention was:


“The values, attitudes, motivations and knowledge that affect the extent to which safety is emphasized over competing goals in decisions and behavior.”

(Dr. Valerie Barnes, USNRC, “What is Safety Culture,” PowerPoint presentation, NRC workshop on safety culture, February 2010, p. 13)

This definition acknowledges the existence of competing goals and the need to address the bottom line manifestation of culture: decisions and actual behavior. We would prefer “actions” to “behavior” because behavior is often used or meant in the context of a process or state of mind. Actions, as with decisions, signify to us the conscious and intentional acts of individuals. The definition also focuses on results in another way - “the extent to which safety is emphasized . . . in decisions. . . .” [emphasis added] What counts is not just the act of emphasizing, i.e., stressing or highlighting, safety but the extent to which safety impacts decisions made, or actions taken.


For similar reasons we think Dr. Barnes' definition is superior to the definition that was the outcome of the workshop:


“Nuclear safety culture is the core values and behaviors resulting from a collective commitment by leaders and individuals to emphasize safety over competing goals to ensure protection of people and the environment.”


(Workshop Summary, March 12, 2010, ADAMS Accession Number ML100700065, p. 2)


As we previously argued in a 2008 white paper:


“. . . it is hard to avoid the trap that beliefs may be definitive but decisions and actions often are much more nuanced. . . .


"First, safety management requires balancing safety and other legitimate business goals, in an environment where there are few bright lines defining what is adequately safe, and where there are significant incentives and penalties associated with both types of goals. As a practical matter, ‘Safety culture is fragile.....a balance of people, problems and pressures.’


"Second, safety culture in practice is “situational”, and is continually being re-interpreted based on people’s actual behaviors and decisions in the safety management process. Safety culture beliefs can be reinforced or challenged through the perception of each action (or inaction), yielding an impact on culture that can be immediate or incubate gradually over time.”


(Robert Cudlin, "Practicing Nuclear Safety Management," March 2008, p. 3)


We hope the Barnes definition gets further attention and helps inform this aspect of safety culture policy.

Friday, March 26, 2010

Because They Don’t Understand...?

This post's title is part of a quote from the book Switch: How to Change Things When Change is Hard that we introduced in our March 7, 2010 post. The full quote is:

“It can sometimes be challenging . . . to distinguish why people don’t support your change. Is it because they don’t understand or because they’re not enthused? . . . The answer isn’t always obvious, even to experts.” [p. 107]

So it appears that when people don’t comply with prescribed standards or regimens, the problem may not be knowledge or understanding; it may be something tied to emotion. Bringing about change in something as deeply embedded as culture is not simply a matter of clicking on the new, desired program. The authors provide a number of interesting examples of situations resistant to change and how resistance has been overcome by using emotion to galvanize action. There are teenagers with cancer who play video games that help them visualize beating the cancer, and an accounting manager who changes his priorities after visiting his not-for-profit vendor organizations and experiencing for himself their limited resources and the dire consequences of late reimbursements.


The most common situation for generating emotion sufficient to support change is a crisis, often an organizational crisis that is existential. But crisis is associated with “negative emotions” that may yield specific but not necessarily long lasting actions. Positive emotions, on the other hand, can make people more open to new thoughts and values, and foster a mindset that wants to adopt what is essentially a new identity. One of the more effective ways to generate the needed positive emotion is through experiencing the conditions associated with the needed changes (e.g., via a video game, or immersion in the environment of a stakeholder).


In nuclear safety management, how often, after events deemed indicative of safety culture weakness, are personnel provided with additional training on expectations and elements of safety culture? Does this appear to be a knowledge-based approach? If so, is the problem that staff don’t understand what is expected? Or is positive emotion the missing ingredient - the addition of which might help personnel want to identify with and inhabit the cultural values?

Wednesday, March 24, 2010

Vermont Yankee (part 3)

There was an interesting article in the March 22, 2010 Hartford Courant regarding Paul Blanch, the former Northeast Utilities engineer who was in the middle of safety issues at Millstone in the 1990s. Specifically, he was in the news due to his recent testimony against the extension of the operating license for Vermont Yankee. But what caught my eye were some of his broader observations regarding safety and the nuclear industry. Regarding the industry, Blanch said, "Safety is not their No. 1 concern. Making money is their No. 1 concern." He goes on to say he has no faith in the NRC or in utilities’ commitment to safety.

These comments merit attention not because one may agree or disagree with them, but because they represent a perception of the industry, and of the NRC for that matter, that can and does get attention. One problem is that everyone says safety is their highest priority but then certain events suggest otherwise - as an example, let’s look at another company and industry recently in the news:


From the BP website:


Safe and reliable operations are BP’s number one priority....


This is from a company that was recently fined over $3 million by OSHA for safety violations at its Ohio refinery (see our March 12, 2010 post) and had previously been fined almost $90 million for the explosion at its Texas refinery.


Supporting this commitment is the following description of safety management at BP:

“...members of the executive team undertook site visits, in which safety was a focus, to reinforce the importance of their commitment to safe and reliable operations. The executives also regularly included safety and operations issues in video broadcasts and communications to employees, townhall meetings and messages to senior leaders.”

It is hardly unreasonable that someone could perceive that BP’s highest priority was not safety. Unfortunately, almost identical words can also be found in the statements and pronouncements of many nuclear utilities. (By the way, the narrow emphasis by BP management on “reinforcement” might be considered in the context of our March 22, 2010 post on Safety Culture Dynamics.)


As Dr. Reason has noted so simply, no organization is just in the business of being safe. What might be much more beneficial is some better acknowledgment of the tension between safety and production (and cost and schedule) and how nuclear organizations are able to address it. This awareness is a more credible posture for public perception, for regulators and for the organization itself. It would also highlight the insight that many in the nuclear industry share - that safety and reliable production are actually tightly coupled, and that over the long term they must coexist. The irony is that, as I recall, 20 years ago Entergy was the leader in publicizing (and achieving) its goals to be upper quartile in safety, production and cost.

Monday, March 22, 2010

Safety Culture Dynamics (part 1)

Over the last several years there have been a number of nuclear organizations that have encountered safety culture and climate issues at their plants. Often new leadership is brought to the plant in hopes of stimulating the needed changes in culture. Almost always there is increased training and reiteration of safety values and a safety culture survey to gain a sense of the organizational temperature. It is a little difficult to gauge precisely how effective these measures are - surveys are snapshots in time and direct indicators of safety culture are lacking. In some cases, safety culture appears to respond in the short term to these changes but then loses momentum and backslides further out in time.

How does one explain these types of evolutions in culture? Conventional wisdom has been that culture is leadership driven and that, when safety culture is deficient, new management can “turn around” the situation. We have argued that the dynamics of safety culture are more complex and are subject to a confluence of factors that compete for the priorities and decisions of the organization. We use simulation models of safety culture to suggest how these factors can interact and respond to different initiatives. We have attempted a simple illustration of the situation at a plant that responds as described above. CLICK ON THIS LINK to see the simulated safety culture dynamic response.

The simulation shows changes in some key variables over time. In this case the time period is 5 years. For approximately the first year the simulation illustrates the status quo prior to the change in leadership. Safety culture was in gradual decline despite nominal attention to actions to reinforce a safety mindset in the organization.

At approximately the one year mark, leadership is changed and actions are taken to significantly increase the safety priority of the organization. This is reflected in a spike in reinforcement that typically includes training, communications and strong management emphasis on the elements of safety culture. Note that following a lag, safety culture starts to improve in response to these changes. As time progresses, the reinforcement curve peaks and starts to decay due to something we refer to as “saturation”. Essentially the new leadership’s message is starting to have less and less impact even though it is being constantly reiterated. For a time safety culture continues to improve but then turns around due to the decreasing effectiveness of reinforcement. Eventually safety culture regresses to a level where many of the same problems start to recur.

Is this a diagnosis of what is happening at any particular site? No, it is merely suggestive of some of the dynamics that are at work in safety culture. In this particular simulation, other actions that may be needed to build a strong, enduring safety culture were deliberately not implemented, in order to isolate the failure of one-dimensional actions to provide long term solutions. One of the indicators of this narrow approach can be seen in the line on the simulation representing the trust level within the organization. It hardly changes or responds to the other dynamics. Why? In our view trust tends to be driven by the overall, big picture of forces at work and the extent to which they consistently demonstrate safety priority. Reinforcement (in our model) reflects primarily a training and messaging action by management. Other more potent forces include whether management “walks the talk,” whether resources are allocated consistent with safety priorities, whether short term needs are allowed to dominate longer term priorities, whether problems are identified and corrected in a manner to prevent recurrence, etc. In this particular simulation example, these other signals are not entirely consistent with the reinforcement messages, with the net result that trust hardly changes.
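For readers who want to see the saturation dynamic in concrete form, here is a minimal sketch in Python. To be clear, this is not the NuclearSafetySim model itself; the functional forms (exponential decay of message effectiveness, linear erosion of culture) and all parameter values are illustrative assumptions chosen only to reproduce the qualitative shape described above.

    # Illustrative sketch of the saturation dynamic -- NOT the NuclearSafetySim
    # model; functional forms and parameter values are assumptions.
    import math

    DT = 0.01          # time step (years)
    YEARS = 5.0

    culture = 0.60     # safety culture "level" on an arbitrary 0-1 scale
    trust = 0.50       # organizational trust, 0-1 scale
    saturation = 0.0   # cumulative exposure to the reinforcement message

    t = 0.0
    trace = []
    while t <= YEARS:
        # Status quo for the first year; new leadership then steps up reinforcement.
        effort = 0.1 if t < 1.0 else 1.0

        # Saturation grows with cumulative exposure, so the same message
        # delivers less and less impact the longer it is repeated.
        saturation += 0.5 * effort * DT
        effectiveness = effort * math.exp(-saturation)

        # Trust responds to the consistency of the other signals (walking the
        # talk, resources, fixing problems), which are held flat here -- so
        # trust barely moves despite the messaging spike.
        trust += 0.0 * DT

        # Culture erodes on its own and is pulled up, with a lag, by whatever
        # reinforcement effectiveness remains.
        culture += (-0.05 * culture + 0.15 * effectiveness) * DT
        culture = min(max(culture, 0.0), 1.0)

        trace.append((t, culture, effectiveness, trust))
        t += DT

    # Print one line every half year: culture declines, rises after the
    # leadership change, peaks, then regresses as saturation sets in.
    for t, c, e, tr in trace[::50]:
        print(f"t={t:4.1f}  culture={c:.3f}  effectiveness={e:.3f}  trust={tr:.3f}")

Running this sketch shows culture drifting down in year one, climbing after the leadership change, peaking between years three and four, and then regressing as saturation erodes the effectiveness of reinforcement - while trust stays essentially flat throughout, for the reasons discussed above.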

More information regarding safety culture simulation is available at the nuclearsafetysim.com website. Under the Models tab, Model 3 provides a short tutorial on the concept of saturation and its effect on safety culture reinforcement.

Friday, March 19, 2010

Highly Reliable Performance Blog

The Highly Reliable Performance Blog is published by the DOE Office of Corporate Safety Analysis. The blog's focus is on Human Performance Improvement (HPI). Earl Carnes, a former colleague of Bob Cudlin's, is a blog editor.

Dr. Bill Corcoran

From time to time we will mention other safety culture professionals whose work you may find interesting. William R. Corcoran, Ph.D., P.E. has long been active in the safety field; we even shared a common employer many years ago. He publishes "The Firebird Forum," a newsletter focusing on root cause analysis. For more information on Bill and his newsletter, please visit his profile here.

“We have a great safety culture = deep trouble” or what squirrels can teach us...

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this third segment, Dr. Reason discusses impediments to safety culture. He observes that when management announces “we have a great safety culture,” it should be taken as a symptom of an organization that is vulnerable. The proper posture, according to Dr. Reason, is the “chronic unease” that he sees embodied in squirrels and other species that sense constant vulnerability, even when there is no apparent immediate threat. The inverse of chronic unease is, of course, complacency. The “c” word has been invoked more frequently of late by the NRC (see our November 12, 2009 post), which could be viewed as threat enough.

Thursday, March 18, 2010

Honest Errors vs Unacceptable Errors

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance. In this second segment, Dr. Reason discusses how to build a better safety culture based on a “just” culture. He also cites the need to distinguish between honest errors (he estimates 90% of errors fall in this category) and unacceptable errors.

With regard to the importance of a “just culture” you may want to refer back to our post of August 3, 2009, where we highlight a book of that title by Sidney Dekker. In that post we emphasize the need to balance accountability and learning in the investigation of the causes of errors. Both advocates of a just culture, Reason and Dekker, are from European countries, and their work may not be as well known in the U.S. nuclear industry, but it appears to contain valuable lessons for us.

Wednesday, March 17, 2010

Dr. James Reason on Error Management

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance.  In this first segment, Dr. Reason discusses his theory of how errors occur (person based and system based) including the existence of "error traps" within an organizational system.  Error traps are evident when different people make the same error, indicating some defect in the management system, such as something as simple as bad or ambiguous procedures.  

I believe error traps may also exist due to more intangible conditions, such as conflicting priorities or requirements on staff that may create a bias toward compromising safety priorities.  The conditions act as a trap or decision "box" where compromising safety is viewed as either "okay" or the only viable response, and even well intentioned people can be subverted.  In contrast, the competing priorities may only appear to be such a box, merely giving a lax person cover to compromise safety.  In that case the bias originates in people who are predisposed to making the error, making it a personal performance error rather than a system-based error trap.  How should the errors in reporting at Vermont Yankee be characterized?

Tuesday, March 16, 2010

Safety Culture Briefing of NRC Commissioners March 30, 2010

A briefing of the Nuclear Regulatory Commissioners on safety culture is scheduled for March 30, 2010.  It will be webcast at 9:30 am Eastern time.

Additional information is available here and here.

Monday, March 15, 2010

Vermont Yankee (part 2) - What Would Reason Say?

The "Reason" in the title refers to Dr. James Reason, Professor Emeritus, Department of Psychology, University of Manchester.

“It is clear from in-depth accident analyses that some of the most powerful pushes towards local traps [characteristics of the workplace that lead people to compromise safety priorities] come from an unsatisfactory resolution of the inevitable conflict that exists (at least in the short-term) between the goals of safety and production. The cultural accommodation between the pursuit of these goals must achieve a delicate balance. On the one hand, we have to face the fact that no organization is just in the business of being safe.  Every company must obey both the 'ALARP' principle (keep the risks as low as reasonably practicable) and the 'ASSIB' principle (and still stay in business). On the other hand, it is now increasingly clear that few organizations can survive a catastrophic organizational accident (Reason 1997).”

"Achieving a Safe Culture: Theory and Practice." (1998), p. 301.
 

Dr. Reason has been a leading and influential thinker in the area of safety and risk management in the workplace and the creation of safety culture in high risk industries.  Get to know Dr. Reason through his own words in future blog posts featuring some of his key insights.

Friday, March 12, 2010

More Drips on the BP Front

In an article dated March 9, 2010 (“BP Faces Fine Over Safety at Ohio Refinery”) the Wall Street Journal reports on more heavy fines of oil giant BP for safety issues at its refineries. OSHA has levied fines of $3 million for violations at the BP refinery in Toledo, Ohio. This follows record monetary penalties for its Texas refinery last year.

What is significant about this particular enforcement action? Principally the context of the penalties - the Obama administration is taking a tougher regulatory line, and its impact may extend more broadly, say to the nuclear industry. The White House clearly "wants some of the regulatory bodies to be stronger than they have been in the past," according to the article. It is hard to predict what this portends for nuclear operations, but in an environment where safety lapses are piling up (Toyota et al.) the NRC may feel impelled to take aggressive actions. The initial steps being taken with regard to Vermont Yankee would be consistent with such a posture.


The other noteworthy item was the observation that BP’s refining business “is already under pressure from plummeting profit margins and weak demand for petroleum products...” Sometimes the presence of significant economic pressures is the elephant in the room that is not talked about explicitly. Companies assert that safety is the highest priority yet safety problems occur that fundamentally challenge that assertion. Why? Are business pressures trumping the safety priority? Do we need to be more open about the reality of competing priorities that a business must address at the same time it meets high safety standards? Stay tuned.

Wednesday, March 10, 2010

"Normalization of a Deviation"

These are the words of John Carlin, Vice President at the Ginna Nuclear Plant, referring to a situation in the past where chronic water leakages from the reactor refueling pit were tolerated by the plant’s former owners. 

The quote is from a piece reported by Energy & Environment Publishing’s Peter Behr in its ClimateWire online publication titled, “Aging Reactors Put Nuclear Power Plant ‘Safety Culture’ in the Spotlight” and also published in The New York Times.  The focus is on a series of incidents with safety culture implications that have occurred at the Nine Mile Point and Ginna plants now owned and operated by Constellation Energy.

The recitation of events and the responses of managers and regulators are very familiar.  The drip, drip, drip is not the sound of water leaking but the uninspired give and take of the safety culture dialogue that occurs each time there is an incident or series of incidents that suggest safety culture is not working as it should.

Managers admit they need to adopt a questioning attitude, improve the rigor of decision making, and ensure they have the right “mindset”; corporate promises “a campaign to make sure its employees across the company buy into the need for an exacting attention to safety.”  Regulators remind the licensee, "The nuclear industry remains ... just one incident away from retrenchment..." but must be wondering why these events are occurring when NRC performance indicators for the plants and INPO rankings do not indicate problems.  Pledges to improve safety culture are put forth earnestly and (I believe) in good faith.

The drip, drip, drip of safety culture failures may not be cause for outright alarm or questioning of the fundamental safety of nuclear operations, but it does highlight what seems to be a condition of safety culture stasis - a standoff of sorts where significant progress has been made but problems continue to arise, and the same palliatives are applied.  Perhaps more significantly, where continued evolution of thinking regarding safety culture has plateaued.  Peaking too early is a problem in politics and sports, and so it appears in nuclear safety culture.

This is why the remark by John Carlin was so refreshing.  For those not familiar with the context of his words, “normalization of deviance” is a concept developed by Diane Vaughan in her exceptional study of the space shuttle Challenger accident.  Readers of this blog will recall that we are fans of her book, The Challenger Launch Decision, where the mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  Most scary, an organization's standards can decay and no one even notices.  How this occurs and what can be done about it are concepts that should be central to current considerations of safety culture.

For further thoughts from our blog on this subject, refer to our posts dated October 6, 2009 and November 12, 2009.  In the latter, we discuss the nature of complacency and its insidious impact on the very process that is designed to avoid it in the first place.

Sunday, March 7, 2010

"What Looks Like a People Problem Often is a Situation Problem"

The title quote is taken from a new book on the market, Switch: How to Change Things When Change is Hard [p. 3]. It forms one of the basic tenets of the authors’ approach to change dynamics: change is not just about people and their willingness or ability to change. Often, the situation (environment, context, climate, etc.) exerts powerful forces on people, making it much more difficult (or easier) for them to change. Perhaps I am favorably inclined to their thesis in that I have argued that much of the challenge associated with sustaining strong safety culture may be rooted in the competing priorities and pressures on nuclear professionals. Safety in the abstract is easy. And it is easy to state that safety is the highest priority. But nuclear safety is not an unambiguous state; it may appear in any of a thousand shades of gray that require considerable insight and judgment to assess.

By the way, what is the situation of nuclear professionals that may influence their ability to achieve safety culture norms? Budgets, staffing, and plant material condition are obvious aspects of the situation that must be considered. Efficient and effective processes - for example the corrective action program (CAP) or safety concerns program - also support (or undermine) the ability of individuals to act on their safety priorities.

Add to that the consequences of many safety decisions in terms of dollars and/or plant operations, or a pending license action, and safety takes on fuzzy characteristics.

Wednesday, March 3, 2010

Vermont Yankee (part 1)

This week saw a very significant development illustrating the potential consequences of safety culture issues for a specific plant, a large nuclear enterprise, and the industry. As has been widely reported (see link below), the Vermont Senate voted against the extension of the Vermont Yankee nuclear plant’s operating license. In part it appears this action stemmed from the recent leakages of tritium at the plant site but, perhaps more significantly, from how the matter was handled by the plant owner, Entergy. In response to allegations that Entergy may have supplied contradictory or misleading information, Entergy engaged the law firm of Morgan, Lewis & Bockius LLP to undertake an independent review of the matter. Entergy subsequently took administrative actions against 11 employees.

The fallout of these events has not only put the future of the Vermont Yankee plant into question and triggered the interest of the NRC, including a requirement that Entergy officials testify under oath; it may also have consequences for Entergy’s plans to spin off six of its nuclear plants into an independent subsidiary. This restructuring has been a major part of Entergy’s plans for its nuclear business, and Entergy has announced that it will be evaluating its options at an upcoming meeting.

There is much to be considered as a result of the Vermont Yankee situation and we will be posting several more times on this subject. In this post we wanted to focus on some initial reaction from other quarters that we found to be off the mark. The link below to the Energy Collective includes a post from Rod Adams who appears to have done a fair bit of analysis of the facts that are currently available. His major observation is as follows:

“That said, it has become clear to me that the corporate leaders at Entergy ... never learned that taking early actions to prevent problems works a hell of a lot better than massive overreaction once it finally becomes apparent to everyone that action is required. Panic at the top never works - it destroys the confidence of the people...”

In part the author relies on the Morgan Lewis law firm’s finding that Entergy employees did not intentionally mislead Vermont regulators. However, he apparently ignores the Morgan Lewis conclusion that Entergy personnel provided responses that were, ultimately, “incomplete and misleading”.

Given the findings of the independent investigator, it is hard to see what choice Entergy had; absent additional facts, it would appear to us that the actions taken regarding the employees were necessary and appropriate. Adams goes on to speculate that there may even be a detrimental effect on the safety culture at the plant due to the way Entergy is treating its employees. In reality it appears to us that any detrimental impact on safety culture would have been more likely if Entergy had not taken appropriate actions. Still, the question of how safety culture played into the failure of Entergy staff to provide unambiguous information in the first place, and how safety culture will be impacted by subsequent events, is a subject that merits more detailed consideration. We will provide our thoughts in future posts.

Link to 2-25-10 Wall Street Journal news item reporting Vermont's action.
Link to Rod Adams 2-26-10 blog post.
Link to 2-24-10 Entergy Press Release on Investigation Report.