Sunday, March 7, 2010

"What Looks Like a People Problem Often is a Situation Problem"

The title quote is taken from a new book on the market titled Switch: How to Change Things When Change Is Hard [p.3]. It forms one of the basic tenets of the authors' approach to change dynamics and says that change is not just about people and their willingness or ability to change. Often, the situation (environment, context, climate, etc.) exerts powerful forces on people, making it much more difficult (or easier) for them to change. Perhaps I am favorably inclined to their thesis because I have argued that much of the challenge associated with sustaining a strong safety culture may be rooted in the competing priorities and pressures on nuclear professionals. Safety in the abstract is easy. And it is easy to state that safety is the highest priority. But nuclear safety is not an unambiguous state; it may appear in any of a thousand shades of gray that require considerable insight and judgment to assess.

So what is the situation of nuclear professionals that may influence their ability to achieve safety culture norms? Budgets, staffing, and plant material condition are obvious aspects of the situation that must be considered. Efficient and effective processes - for example, the corrective action program or the safety concerns program - also support (or undermine) the ability of individuals to act on their safety priorities.

Add to that the consequences of many safety decisions in terms of dollars and/or plant operations, or a pending license action, and safety takes on decidedly fuzzy characteristics.

Wednesday, March 3, 2010

Vermont Yankee (part 1)

This week saw a very significant development in the nuclear industry, one that illustrates the potential consequences of safety culture issues for a specific plant, a large nuclear enterprise, and the industry as a whole. As has been widely reported (see link below), the Vermont Senate voted against the extension of the Vermont Yankee nuclear plant’s operating license. In part this action appears to have stemmed from the recent leakage of tritium at the plant site but, perhaps more significantly, from how the matter was handled by the plant owner, Entergy. In response to allegations that Entergy may have supplied contradictory or misleading information, Entergy engaged the law firm of Morgan, Lewis & Bockius LLP to undertake an independent review of the matter. Entergy has subsequently taken administrative action against 11 employees.

The fallout from these events has not only called into question the future of the Vermont Yankee plant and triggered the interest of the NRC, including a requirement that Entergy officials testify under oath; it may also have consequences for Entergy’s plans to spin off six of its nuclear plants into an independent subsidiary. This restructuring has been a major part of Entergy’s plans for its nuclear business, and Entergy has announced that it will be evaluating its options at an upcoming meeting.

There is much to be considered as a result of the Vermont Yankee situation and we will be posting several more times on this subject. In this post we wanted to focus on some initial reaction from other quarters that we found to be off the mark. The link below to the Energy Collective includes a post from Rod Adams who appears to have done a fair bit of analysis of the facts that are currently available. His major observation is as follows:

“That said, it has become clear to me that the corporate leaders at Entergy ....never learned that taking early actions to prevent problems works a hell of a lot better than massive overreaction once it finally becomes apparent to everyone that action is required. Panic at the top never works - it destroys the confidence of the people...”

In part the author relies on the Morgan Lewis law firm’s finding that Entergy employees did not intentionally mislead Vermont regulators. However, he apparently ignores the Morgan Lewis conclusion that Entergy personnel provided responses that were, ultimately, “incomplete and misleading”.

Given the findings of the independent investigator, it is hard to see what choice Entergy had, and absent additional facts it would appear to us that the actions taken against the employees were necessary and appropriate. Adams goes on to speculate that there may even be a detrimental effect on the safety culture at the plant - due to the way Entergy is treating its employees. In reality, it appears to us that any detrimental impact on safety culture would have been more likely if Entergy had not taken appropriate actions. Still, the question of how safety culture played into the failure of Entergy staff to provide unambiguous information in the first place, and how safety culture will be affected by subsequent events, is a subject that merits more detailed consideration. We will provide our thoughts in future posts.

Link to 2-25-10 Wall Street Journal news item reporting Vermont's action.
Link to Rod Adams 2-26-10 blog post.
Link to 2-24-10 Entergy Press Release on Investigation Report.

Friday, February 26, 2010

"SimCity Baghdad" is the title of an intriguing article from The Atlantic magazine (Jan/Feb 2010 issue), and available via the link below.  It describes a new computer game called UrbanSim that allows U.S. Army officers to train in counterinsurgency tactics that are being implemented in Iraq.  As is often the case, the military is at the forefront of applying innovative tools such as simulation to meet new challenges.

As is the case with many simulation applications, a significant impetus for their use is that they are inexpensive and allow people to develop skills without being directly exposed to the consequences of their actions.  As pointed out in the article, “....computer games are cheap and can be played anywhere. And because the students all run the same scenarios, they can compare the efficacy of different approaches.”

The simulation was developed with significant input from soldiers who had returned from Iraq.  It is a type of simulation referred to as “agent-based simulation” that can be quite useful in portraying the dynamics of groups.  The game’s characters are modeled as autonomous agents that react not just to specific actions, but to the climate created by a player’s overall strategy.
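
For readers unfamiliar with the term, here is a minimal sketch of the idea (in Python; the names, numbers, and update rule are our own illustration, not UrbanSim's actual logic).  Each agent adjusts its cooperation in response to the climate created by the player's recent pattern of actions rather than to any single move:

    import random

    class Resident:
        """Toy population member whose cooperation drifts toward the prevailing climate."""
        def __init__(self):
            self.cooperation = 0.5                      # 0 = hostile, 1 = fully cooperative

        def react(self, climate):
            # Agents respond to the overall climate, not to any single action.
            self.cooperation += 0.2 * (climate - self.cooperation)
            self.cooperation += random.uniform(-0.02, 0.02)
            self.cooperation = min(1.0, max(0.0, self.cooperation))

    def climate(actions, window=5):
        """Climate = average tone of the player's recent actions (+1 engagement, -1 coercion)."""
        recent = actions[-window:]
        return 0.5 + 0.5 * sum(recent) / len(recent)

    residents = [Resident() for _ in range(100)]
    actions = []
    for turn in range(20):
        actions.append(+1 if turn % 4 else -1)          # mostly engagement, occasional coercion
        c = climate(actions)
        for r in residents:
            r.react(c)
        avg = sum(r.cooperation for r in residents) / len(residents)
        print(f"turn {turn + 1}: climate {c:.2f}, average cooperation {avg:.2f}")

The point of the toy example is the structure: the agents carry their own state, and the player's input reaches them only through an aggregate climate, which is what distinguishes agent-based simulation from a scripted scenario.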

These types of simulations are not intended to be predictive tools or to teach a specified series of actions in response to given situations.  “Rather, the intent is to teach commanders new ways of thinking about multiple problems in a fast-changing environment, always reevaluating instead of fixating on one approach...You have to think through the cause and effect of your decisions...”

Many of the benefits of simulation described in this case apply equally to managing nuclear safety culture.  As we pointed out in a prior blog post (“Social Licking”, October 6, 2009), building and sustaining cultural norms can be significantly influenced by networked relationships - as in a nuclear plant organization - and individuals are likely to model their behaviors on the network.  The challenge, of course, is that the modeled behaviors need to be those that support safety culture.

 
Link to article.

Thursday, November 12, 2009

What is Italian for Complacency?

It is “compiacimento”.  Why would I be providing this bit of knowledge?  The reason is a recent speech by Commissioner Dale Klein of the NRC to a conference in Rome, Italy.  What I found interesting was that once again an NRC Commissioner was sounding the warning about complacency as a latent flaw that can undermine nuclear safety.  We have written a number of posts on this blog on the subject and continue to emphasize it as otherwise . . . . we would be complacent.


One of the direct quotes from the speech is, “Complacency is the primary enemy of an effective regulatory program.”  Klein goes on to recount how both the NRC and the industry had grown complacent prior to the TMI accident.  As he put it, “success breeds complacency”.


The complacency issue is the takeoff point for Klein to link complacency with safety culture.  His point is that a healthy safety culture, one that questions and challenges, is an antidote to complacency.  We agree, up to a point.  But a question that we have asked, and try to address in our safety management models, is: what if complacency in fact erodes safety culture?  Has that not been observed in more recent safety failures such as the space shuttle accidents?  To us that is the insidious nature of complacency - it can undermine the very process designed to avoid complacency in the first place.  From a systems perspective it is particularly interesting (or troubling) because as complacency erodes safety culture, results are challenged less and become more acceptable, further reinforcing the sense that everything is OK and leading to more complacency.  This dynamic is referred to as a positive reinforcing loop.  Positive reinforcing loops have the ability to change system performance very rapidly, meaning an organization can go from success to failure faster than other mechanisms (e.g., safety culture surveys) may be able to detect.
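
To make the loop concrete, here is a minimal sketch, using made-up coefficients of our own rather than any validated model, of how the reinforcing dynamic can play out.  Notice that the numbers look benign for several years and then the erosion picks up speed, which is exactly why a periodic survey can miss it:

    # Toy reinforcing loop: complacency erodes challenge, fewer challenges hide problems,
    # and the quiet reinforces complacency.  All coefficients are arbitrary illustrations.
    complacency, challenge_rate = 0.05, 0.90            # both on a 0-1 scale

    for year in range(1, 11):
        challenge_rate -= 0.3 * complacency * challenge_rate    # complacent orgs challenge results less
        warning_signals = 0.5 * challenge_rate                  # fewer challenges surface fewer problems
        complacency += 0.2 * (0.5 - warning_signals)            # quiet years reinforce "everything is OK"
        complacency = min(1.0, max(0.0, complacency))
        print(f"year {year}: challenge rate {challenge_rate:.2f}, complacency {complacency:.2f}")

In the early years the changes are small enough to be dismissed as noise; in this toy run, by the time the trend is unmistakable a substantial share of the challenge capacity has already been lost.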

Link to speech.

Tuesday, October 13, 2009

NRC Safety Culture Initiatives

The link below is to a September 29, 2009 speech by Chairman Jaczko of the Nuclear Regulatory Commission, outlining the NRC’s current initiatives regarding safety culture and safety conscious work environment.  There is no big news in the speech; mostly it is notable for the continuing focus on safety culture issues at the highest level of the agency.

Perhaps of some significance is that almost all of Jaczko’s comments concern initiatives by the NRC on safety culture.  That is not surprising in one sense, since it would be a logical focus for the NRC Chairman.  However, I thought the absence of industry-wide actions, perhaps covering all plants, could be perceived as a weakness.  Jaczko mentions that “We have seen an increasing number of licensees conducting periodic safety culture self-assessments…”, but that may only tend to highlight that each nuclear plant is going its own way.  True?  If so, will that encourage the NRC to define additional regulatory policies to bring greater uniformity?

Link to speech.

Tuesday, October 6, 2009

Social Licking?

The linked file contains a book review with some interesting social science that could be of great relevance to building and sustaining safety cultures.  But I couldn’t resist the best quote of the review, commenting on some of the unusual findings in recent studies of social networks.  To wit,

“In fact, the model that best predicted the network structure of U.S. senators was that of social licking among cows.”

Back on topic, the book is Connected by Nicholas Christakis and James Fowler, addressing the surprising power of social networks and how they shape our lives.  The authors may be best known for a study published several years ago about how obesity could be contagious.  It is based on observations of networked relationships – friends and friends of friends – that can lead individuals to model their behaviors on those to whom they are connected.

“What is the mechanism whereby your friend’s friend’s obesity is likely to make you fatter? Partly, it’s a kind of peer pressure, or norming, effect, in which certain behaviors, or the social acceptance of certain behaviors, get transmitted across a network of acquaintances.”  Sounds an awful lot like how we think of safety culture being spread across an organization.  For those of you who have been reading this blog, you may recall that we are fans of Diane Vaughan’s book The Challenger Launch Decision, where a mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  An organization's standards decay and no one even notices.
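
Normalization of deviance lends itself to a crude back-of-the-envelope simulation.  The sketch below (a deliberate caricature with invented numbers, not a model of any real organization) puts thirty people on a random acquaintance network, seeds three of them with a high tolerance for out-of-spec results, and lets everyone else's standard ratchet toward the loosest behavior they see go unchallenged among their contacts:

    import random

    random.seed(1)
    N = 30
    tolerance = [0.1] * N                               # everyone starts with a strict standard
    for i in random.sample(range(N), 3):                # a few people have already normalized deviance
        tolerance[i] = 0.9

    # Random acquaintance network: each person knows four others.
    contacts = {i: random.sample([j for j in range(N) if j != i], 4) for i in range(N)}

    for month in range(1, 13):
        updated = []
        for i in range(N):
            loosest = max(tolerance[j] for j in contacts[i])
            # Standards drift toward the loosest norm that goes unchallenged;
            # in this caricature they never tighten on their own.
            updated.append(max(tolerance[i], 0.9 * tolerance[i] + 0.1 * loosest))
        tolerance = updated
        print(f"month {month}: average tolerance {sum(tolerance) / N:.2f}")

The average tolerance never jumps; it just creeps upward month after month, which is precisely why no one notices.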


The book review goes on to note, “Mathematical models of flocks of birds, or colonies of ants, or schools of fish reveal that while there is no central controlling director telling the birds to fly one direction or another, a collective intelligence somehow emerges, so that all the birds fly in the same direction at the same time.  Christakis and Fowler argue that through network science we are discovering the same principle at work in humans — as individuals, we are part of a superorganism, a hivelike network that shapes our decisions.”  I guess the key is to ensure that the hive takes the workers in the right direction.

Question:  Does the observation above that there is “no central controlling director” pointing the way have implications for nuclear safety management?  Is leadership the key, or the development of a collective intelligence?

 
Link to review.
 

Thursday, September 24, 2009

“Culture isn’t just one aspect of the game. It is the game.”

The quote is from Lou Gerstner, retired Chairman of IBM, and appears in an interesting presentation by the management team at Wolf Creek Nuclear Operating Company.  In it they put forth their perspectives on addressing culture change within their organization.  There are many good points in the presentation, and several that I would like to highlight.

First, the issue of trust is addressed on several slides.  For example, on the Engaged Employees slide (p. 24) it is noted that training in building trust had been initiated and would be ongoing.  A later slide, Effective Leadership Team (p. 31), notes that there was increased trust at the station.  In our thinking about safety management, and specifically in our simulation modeling, we include trust as a key variable and driver of safety culture.  Trust is a subjective ingredient, but its importance is real.  We think there are at least two mechanisms for building trust within an organization.  One is through the type of initiatives described in the slides – direct attention to, and training in, creating trust within the management team and staff.  A second mechanism, which perhaps does not receive as much recognition, is the indirect impact of decisions and actions taken by the organization and the extent to which they model desired safety values.  This second mechanism is very powerful because it reflects reality.  If reality comports with the espoused values, it reinforces those values and builds trust.  If reality runs contrary to the values, it will undermine any amount of training or pronouncements about trust.
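
By way of illustration only (the function, variable names, and coefficients below are hypothetical, and far simpler than anything in our actual models), trust can be carried as a state variable that gets a small boost from training but is moved much more by whether the organization's visible decisions match its espoused values:

    def update_trust(trust, training_effort, decisions):
        """One period of a toy trust model.
        trust           : current level, 0 to 1
        training_effort : 0 to 1, direct investment in trust-building (the first mechanism)
        decisions       : recent decisions scored +1 if they reflected the espoused safety
                          values, -1 if they contradicted them (the second mechanism)
        """
        alignment = sum(decisions) / len(decisions) if decisions else 0.0
        delta = 0.02 * training_effort + 0.10 * alignment       # reality weighs far more than words
        return min(1.0, max(0.0, trust + delta))

    trust = 0.5
    # Heavy investment in trust training, but most visible decisions contradict the stated values,
    # so trust erodes despite the training.
    for quarter in range(1, 9):
        trust = update_trust(trust, training_effort=0.9, decisions=[+1, -1, -1, +1, -1, -1])
        print(f"quarter {quarter}: trust = {trust:.2f}")

Run the same loop with decisions that consistently match the espoused values and trust climbs instead; with the coefficients chosen here, the training term alone can never outweigh misaligned decisions, which is the point of the second mechanism.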

The second point to be highlighted is addressed on the Culture slide in the Epilogue section (p. 35).  There it is noted that, as an industry, we are good at defining the desired behaviors, but we are not good at defining how to achieve a culture where most people practice those behaviors.  We think there is a lot of truth in this, and the “how” aspect of building and maintaining a robust safety culture is something that merits more attention.  “Practicing” those behaviors is the subject of our white paper, “Practicing Nuclear Safety Management.”

Link to presentation.