Tuesday, March 16, 2010
Safety Culture Briefing of NRC Commissioners March 30, 2010
Friday, March 12, 2010
More Drips on the BP Front
What is significant about this particular enforcement action? Principally the context of the penalties - the Obama administration is taking a tougher regulatory line and its impact may extend more broadly, say to the nuclear industry. The White House clearly "wants some of the regulatory bodies to be stronger than they have been in the past," according to the article. It is hard to predict what this portends for nuclear operations, but in an environment where safety lapses are piling up (Toyota et al.) the NRC may feel impelled to take aggressive actions. The initial steps being taken with regard to Vermont Yankee would be consistent with such a posture.
The other noteworthy item was the observation that BP’s refining business “is already under pressure from plummeting profit margins and weak demand for petroleum products...” Sometimes the presence of significant economic pressures is the elephant in the room that is not talked about explicitly. Companies assert that safety is the highest priority yet safety problems occur that fundamentally challenge that assertion. Why? Are business pressures trumping the safety priority? Do we need to be more open about the reality of competing priorities that a business must address at the same time it meets high safety standards? Stay tuned.
Wednesday, March 10, 2010
"Normalization of a Deviation"
The quote is from a piece by Energy & Environment Publishing’s Peter Behr, titled “Aging Reactors Put Nuclear Power Plant ‘Safety Culture’ in the Spotlight,” which ran in its ClimateWire online publication and was also published in The New York Times. The focus is on a series of incidents with safety culture implications that have occurred at the Nine Mile Point and Ginna plants, now owned and operated by Constellation Energy.
The recitation of events and the responses of managers and regulators are very familiar. The drip, drip, drip is not the sound of water leaking but the uninspired give and take of the safety culture dialogue that occurs each time there is an incident or series of incidents that suggest safety culture is not working as it should.
Managers admit they need to adopt a questioning attitude, improve the rigor of decision making, and ensure they have the right “mindset”; corporate promises “a campaign to make sure its employees across the company buy into the need for an exacting attention to safety.” Regulators remind the licensee, “The nuclear industry remains ... just one incident away from retrenchment...” but must be wondering why these events are occurring when NRC performance indicators for the plants and INPO rankings do not indicate problems. Pledges to improve safety culture are put forth earnestly and (I believe) in good faith.
The drip, drip, drip of safety culture failures may not be cause for outright alarm or questioning of the fundamental safety of nuclear operations, but it does highlight what seems to be a condition of safety culture stasis - a standoff of sorts where significant progress has been made but problems continue to arise, and the same palliatives are applied. Perhaps more significantly, where continued evolution of thinking regarding safety culture has plateaued. Peaking too early is a problem in politics and sports, and so it appears in nuclear safety culture.
This is why the remark by John Carlin was so refreshing. For those not familiar with the context of his words, “normalization of deviance” is a concept developed by Diane Vaughan in her exceptional study of the space shuttle Challenger accident. Readers of this blog will recall that we are fans of her book, The Challenger Launch Decision, where the mechanism she calls “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria. Most scary, an organization's standards can decay and no one even notices. How this occurs and what can be done about it are concepts that should be central to current considerations of safety culture.
For further thoughts from our blog on this subject, refer to our posts dated October 6, 2009 and November 12, 2009. In the latter, we discuss the nature of complacency and its insidious impact on the very process that is designed to avoid it in the first place.
Sunday, March 7, 2010
"What Looks Like a People Problem Often is a Situation Problem"
By the way, what is the situation of nuclear professionals that may influence their ability to achieve safety culture norms? Budgets, staffing, and plant material condition are obvious aspects of the situation that must be considered. Efficient and effective processes - for example the corrective action program (CAP) or the safety concerns program - can also support (or undermine) the ability of individuals to actualize their safety priorities.
Add to that the consequences of many safety decisions in terms of dollars and/or plant operations, or a pending license action, and safety takes on fuzzy characteristics.
Wednesday, March 3, 2010
Vermont Yankee (part 1)
The fallout of these events has not only called into question the future of the Vermont Yankee plant and triggered the interest of the NRC, including a requirement that Entergy officials testify under oath; it may also have consequences for Entergy’s plans to spin off six of its nuclear plants into an independent subsidiary. This restructuring has been a major part of Entergy’s plans for its nuclear business and Entergy has announced that it will be evaluating its options at an upcoming meeting.
There is much to be considered as a result of the Vermont Yankee situation and we will be posting several more times on this subject. In this post we wanted to focus on some initial reaction from other quarters that we found to be off the mark. The link below to the Energy Collective includes a post from Rod Adams who appears to have done a fair bit of analysis of the facts that are currently available. His major observation is as follows:
“That said, it has become clear to me that the corporate leaders at Entergy ....never learned that taking early actions to prevent problems works a hell of a lot better than massive overreaction once it finally becomes apparent to everyone that action is required. Panic at the top never works - it destroys the confidence of the people...”
In part the author relies on the Morgan Lewis law firm’s finding that Entergy employees did not intentionally mislead Vermont regulators. However, he apparently ignores the Morgan Lewis conclusion that Entergy personnel provided responses that were, ultimately, “incomplete and misleading”.
Given the findings of the independent investigator it is hard to see what choice Entergy had, and absent additional facts, it would appear to us that the employee actions were necessary and appropriate. Adams goes on to speculate that there may even be a detrimental effect on the safety culture at the plant - due to the way Entergy is treating its employees. In reality it appears to us that any detrimental impact on safety culture would have been more likely if Entergy had not taken appropriate actions. Still, the questions of how safety culture played into the failure of Entergy staff to provide unambiguous information in the first place, and how safety culture will be impacted by subsequent events, merit more detailed consideration. We will provide our thoughts in future posts.
Link to 2-25-10 Wall Street Journal news item reporting Vermont's action.
Link to Rod Adams 2-26-10 blog post.
Link to 2-24-10 Entergy Press Release on Investigation Report.
Friday, February 26, 2010
As is the case with many simulation applications, a significant impetus for their use is that they are inexpensive and allow people to develop skills without being directly exposed to the consequences of their actions. As pointed out in the article, “....computer games are cheap and can be played anywhere. And because the students all run the same scenarios, they can compare the efficacy of different approaches.”
The simulation was developed with significant input from soldiers who had returned from Iraq. It is a type of simulation referred to as “agent-based simulation” that can be quite useful in portraying the dynamics of groups. The game’s characters are modeled as autonomous agents that react not just to specific actions, but to the climate created by a player’s overall strategy.
These types of simulations are not intended to be predictive tools or to teach a specified series of actions in response to given situations. “Rather, the intent is to teach commanders new ways of thinking about multiple problems in a fast-changing environment, always reevaluating instead of fixating on one approach...You have to think through the cause and effect of your decisions...”
Many of the benefits of simulation described in this case are the same as for managing nuclear safety culture. As we pointed out in a prior blog post (“Social Licking”, October 6, 2009), building and sustaining cultural norms can be significantly influenced by networked relationships - as in a nuclear plant organization - and individuals are likely to model their behaviors on the network. The challenge of course is that the modeled behaviors need to be those that support safety culture.
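The network effect described in the paragraph above can be sketched in a few lines of simulation. The following toy model is our own illustration (not the Army's game and not NuclearSafetySim, and all names and parameters are invented): agents sit on a ring network and each step nudge their "safety attitude" toward the average of their neighbors. Initially scattered attitudes converge toward a shared network norm.

```python
import random

def simulate_norms(n=20, neighbors=4, steps=50, seed=1):
    """Toy network-influence model: each agent holds a 'safety attitude'
    in [0, 1] and each step moves halfway toward its neighbors' average."""
    rng = random.Random(seed)
    attitudes = [rng.random() for _ in range(n)]
    # Ring network: agent i is connected to its `neighbors` nearest agents.
    nbrs = [[(i + d) % n
             for d in range(-neighbors // 2, neighbors // 2 + 1) if d != 0]
            for i in range(n)]
    for _ in range(steps):
        # Simultaneous update: build the new list from the old attitudes.
        attitudes = [0.5 * a + 0.5 * sum(attitudes[j] for j in nbrs[i]) / len(nbrs[i])
                     for i, a in enumerate(attitudes)]
    return attitudes

final = simulate_norms()
spread = max(final) - min(final)   # small: attitudes have converged to a norm
```

The same dynamic cuts both ways, which is the point of the post: if the well-connected members of the network model weak safety behaviors, the organization converges on those instead.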
Link to article.
Thursday, November 12, 2009
What is Italian for Complacency?
One of the direct quotes from the speech is, “Complacency is the primary enemy of an effective regulatory program.” Klein goes on to recount how both the NRC and the industry had grown complacent prior to the TMI accident. As he put it, “success breeds complacency”.
The complacency issue is the takeoff point for Klein to link complacency with safety culture. His point is that a healthy safety culture, one that questions and challenges, is an antidote to complacency. We agree, to a point. But a question that we have asked and try to address in our safety management models is, what if complacency in fact erodes safety culture? Has that not been observed in more recent safety failures such as the space shuttle accidents? To us that is the insidious nature of complacency - it can undermine the very process designed to avoid complacency in the first place. From a systems perspective it is particularly interesting (or troubling) because, as complacency erodes safety culture, results are challenged less and become more acceptable, further reinforcing the sense that everything is OK, and leading to more complacency. This is referred to as a positive reinforcing loop. Positive reinforcing loops can change system performance very rapidly, meaning an organization can go from success to failure faster than other mechanisms (e.g., safety culture surveys) may be able to detect.
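The loop described above is easy to sketch quantitatively. In this minimal model (the parameter values are purely illustrative, not calibrated to anything), complacency grows whenever results go unchallenged, and safety culture erodes in proportion to complacency, so the decline starts slowly and accelerates:

```python
def reinforcing_loop(steps=40, erosion=0.06, feedback=0.08):
    """Toy positive reinforcing loop: weaker culture -> fewer challenged
    results -> apparent 'success' -> more complacency -> weaker culture."""
    culture, complacency = 1.0, 0.05   # start healthy, slightly complacent
    trace = [culture]
    for _ in range(steps):
        challenge = culture                          # strong culture challenges results
        complacency += feedback * (1.0 - challenge)  # unchallenged results breed complacency
        culture = max(0.0, culture - erosion * complacency)
        trace.append(culture)
    return trace

trace = reinforcing_loop()
early_decline = trace[0] - trace[10]    # culture lost in the first 10 steps
late_decline = trace[20] - trace[30]    # culture lost in a later 10-step window
```

Here `late_decline` exceeds `early_decline`: the loop gathers speed, which is exactly why a periodic survey can look fine right up until it doesn't.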
Link to speech.
Tuesday, October 13, 2009
NRC Safety Culture Initiatives
Perhaps of some significance is that almost all of Jaczko’s comments concern initiatives by the NRC on safety culture. Not surprising in one sense, in that it would be a logical focus for the NRC Chairman. However, I thought that the absence of industry-wide actions, perhaps covering all plants, could be perceived as a weakness. Jaczko mentions that “We have seen an increasing number of licensees conducting periodic safety culture self-assessments…”, but that may only tend to highlight that each nuclear plant is going its own way. True? If so, will that encourage the NRC to define additional regulatory policies to bring greater uniformity?
Link to speech.
Tuesday, October 6, 2009
Social Licking?
“In fact, the model that best predicted the network structure of U.S. senators was that of social licking among cows.”
Back on topic, the book is Connected by Nicholas Christakis and James Fowler, addressing the surprising power of social networks and how they shape our lives. The authors may be best known for a study published several years ago about how obesity could be contagious. It is based on observations of networked relationships – friends and friends of friends – that can lead to individuals modeling their behaviors based on those to whom they are connected.
“What is the mechanism whereby your friend’s friend’s obesity is likely to make you fatter? Partly, it’s a kind of peer pressure, or norming, effect, in which certain behaviors, or the social acceptance of certain behaviors, get transmitted across a network of acquaintances.” Sounds an awful lot like how we think of safety culture being spread across an organization. For those of you who have been reading this blog, you may recall that we are fans of Diane Vaughan’s book The Challenger Launch Decision, where a mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria. An organization's standards decay and no one even notices.
The book review goes on to note, “Mathematical models of flocks of birds, or colonies of ants, or schools of fish reveal that while there is no central controlling director telling the birds to fly one direction or another, a collective intelligence somehow emerges, so that all the birds fly in the same direction at the same time. Christakis and Fowler argue that through network science we are discovering the same principle at work in humans — as individuals, we are part of a superorganism, a hivelike network that shapes our decisions.” I guess the key is to ensure that the hive takes the workers in the right direction.
Question: Does the above observation that “there is no central controlling director” telling the right direction have implications for nuclear safety management? Is leadership the key or development of a collective intelligence?
Link to review.
Thursday, September 24, 2009
“Culture isn’t just one aspect of the game. It is the game.”
First, the issue of trust is addressed on several slides. For example, on the Engaged Employees slide (p. 24) it is noted that training in building trust had been initiated and would be ongoing. A later slide, Effective Leadership Team (p. 31), notes that there was increased trust at the station. In our thinking about safety management, and specifically in our simulation modeling, we include trust as a key variable and driver of safety culture. Trust is a subjective ingredient but its importance is real. We think there are at least two mechanisms for building trust within an organization. One is through the type of initiatives described in the slides – direct attention and training in creating trust within the management team and staff. A second mechanism that perhaps does not receive as much recognition is the indirect impact of decisions and actions taken by the organization and the extent to which they model desired safety values. This second mechanism is very powerful as it reflects reality. If reality comports with the espoused values, it reinforces the values and builds trust. If reality is contra to the values, it will undermine any amount of training or pronouncements about trust.
The second point to be highlighted is addressed on the Culture slide in the Epilogue section (p.35). There it is noted that as an industry we are good at defining the desired behaviors, but we are not good at defining how to achieve a culture where most people practice those behaviors. We think there is a lot of truth in this and the “how” aspect of building and maintaining a robust safety culture is something that merits more attention. “Practicing” those behaviors is the subject of our white paper, “Practicing Nuclear Safety Management.”
Link to presentation.
Wednesday, September 16, 2009
The Davis Besse Hole
Since 2002 Davis Besse has become synonymous with the issue of safety culture in the nuclear industry. As with many safety and regulatory issues, there are many fundamentally important reasons to comply with the NRC’s criteria and requirements. But the potential regulatory consequences of not meeting those criteria also merit some consideration. Two years of shutdown, five years of escalated NRC oversight, civil penalties, prosecutions of individuals… Davis Besse was the TMI of nuclear safety culture.
Thursday, September 10, 2009
Schrodinger’s Bat
Dare I put forth a sports analogy? In baseball there is a defined “strike zone”. In theory the umpire uses the strike zone to make calls of balls and strikes. But the zone is really open to interpretation in the dynamic, three-dimensional world of pitching and umpiring. The reality is that the strike zone becomes the space delineated by the aggregate set of ball and strike calls by an umpire. It relies on the skill of the umpire, his understanding of the strike zone and his commitment to making accurate calls. The linked article provides some interesting data on the strike zone and the psychology of umpires’ decisions.
Link to "Schrodinger’s Bat" July 26, 2007.
Tuesday, September 8, 2009
Is Safety Culture the Grand Unifying Concept?
Two of the principal contributors to LearnSafe, Björn Wahlström and Carl Rollenhagen, published some of their interpretations of the study results in a 2004 paper, link below. In the paper they state:
“The data collected in the LearnSafe project provides interesting views on some of the major issues connected to the concept of safety culture. A suggestion generated from the data is that attempts to define and measure safety culture may be counterproductive and a more fruitful approach may be to use the concept to stimulate discussions on how safety is constructed.” [p. 2]
The contribution of the LearnSafe project comes from the empirical data developed in the surveys and discussions with over 300 nuclear managers. It was found that the term safety culture was not frequently mentioned as a challenge for managing nuclear plants. Instead, much more frequently mentioned were factors that are commonly understood to be part of safety culture. Wahlström and Rollenhagen observe, “This would suggest the interpretation is that safety culture is not a concept for itself, but it is instead ingrained in various aspects of the management activities.” [p. 6]
This observation leads to the question of whether it is useful to put forward safety culture as a top level concept that somehow is responsible for or “produces” safety. Or would it be better to think of it as an organic process that continuously evolves and develops within an organization? This perspective would say that safety culture is more the product of the myriad decisions and interactions that occur within an organization rather than some set of intrinsic values that is the determinant of those decisions.
Link to paper.
Tuesday, September 1, 2009
EdF Faces Conflicting Pressures
Link to article.
Friday, August 28, 2009
Bernhard Wilpert
Professor Wilpert emphasized the interaction of human, technology, and organizational dynamics. His tools for human factors event analysis have become the standard practice in German and Swiss nuclear plants. He is the author of several leading books including Safety Culture in Nuclear Power Operations; System Safety: Challenges and Pitfalls of Intervention; Emerging Demands for the Safety of Nuclear Power Operations: Challenge and Response; and Nuclear Safety: A Human Factors Perspective.
Professor Wilpert was also a principal contributor to the LearnSafe project conducted in Europe from 2001 to 2004. See the following link for information about the project team and its results, and look for future posts from us on the LearnSafe research.
Link to LearnSafe project.
Monday, August 17, 2009
Safety Culture Assessment
One question that frequently comes to mind is, can safety culture be separated from the manifestation of culture in the specific actions and decisions taken by an organization? For example, if an organization makes some decisions that are clearly at odds with “safety being the overriding priority”, can its culture be anything but deficient? But if an assessment of the culture is performed, and the espoused beliefs and priorities are generally supportive of safety, what is to be made of those responses?
The reference material for this post comes from some work led by the late Bernhard Wilpert of the Berlin University of Technology. (We will sample a variety of his work in the safety management area in future posts.) It is a brief slide presentation titled, “Challenges and Opportunities of Assessing Safety Culture”. Slide 3 for example revisits E. H. Schein’s multi-dimensional formulation of safety culture which suggests that assessments must be able to expose all levels of culture and their integrated effect.
Two observations from these slides seem of particular note. They are both under Item 4, Methodological Challenges. The first observation is that culture is not a quantifiable phenomenon and does not lend itself easily to benchmarking. This bears consideration, as most assessment methods in use today employ statistical comparisons to assessments at other plants, including percentile-type rankings. The other observation in the slide is that culture results from the learning experience of its members. This is of particular interest to us as it supports some of the thinking associated with a system dynamics approach. A systems view involves the development of shared “mental models” of how safety management “works”, the goal being that individual actions and decisions can be understood within a commonly understood framework. The systems process becomes, in essence, the mechanism for translating beliefs into actions.
Link to slide presentation.
Thursday, August 13, 2009
Primer on System Dynamics
System Dynamics is an approach for seeing the world in terms of stocks and flows, where internal feedback loops and time delays can affect system behavior and lead to complex, non-linear changes in system performance.
The System Dynamics worldview was originally developed by Prof. Jay Forrester at MIT. Later work by other thinkers, e.g., Peter Senge, author of The Fifth Discipline, expanded the original concepts and made them available to a broader audience. An overview of System Dynamics can be found on Wikipedia.
Our NuclearSafetySim program uses System Dynamics to model managerial behavior in an environment where maintaining the nuclear safety culture is a critical element. NuclearSafetySim is built using isee Systems’ iThink software. isee Systems has educational materials available on its website that explain some basic concepts.
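A core System Dynamics lesson is that a feedback loop acting on delayed information produces overshoot and oscillation, and that fits in a few lines of Python. This is a generic stock-and-flow sketch with invented parameters, not an excerpt of NuclearSafetySim or iThink:

```python
def adjust_stock(goal=100.0, delay=4, gain=0.25, steps=80):
    """A stock adjusts toward a goal, but the corrective flow is computed
    from a measurement of the stock that is `delay` steps old."""
    stock = 0.0
    pipeline = [0.0] * delay           # delayed measurements of the stock
    history = []
    for _ in range(steps):
        if pipeline:
            measured = pipeline.pop(0)     # information available now is stale
            pipeline.append(stock)
        else:
            measured = stock               # delay=0: act on current information
        stock += gain * (goal - measured)  # corrective flow based on what is seen
        history.append(stock)
    return history

history = adjust_stock()
peak = max(history)    # overshoots the goal, then oscillates back toward it
```

With `delay=0` the stock approaches the goal smoothly and never overshoots; the delay alone creates the oscillation. That kind of non-intuitive behavior is exactly what these models are built to expose.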
Thursday, August 6, 2009
Signs of a Reactive Organization (MIT #6)
The figure below illustrates how the number of problems/issues (we use the generic term "challenges" in NuclearSafetySim) might vary with time when the response is reactive. The blue line indicates the total number of issues, the pink line the number of new issues being identified, and the green line the resolution rate for issues, e.g., through a corrective action program. Note that the blue line initially increases and then oscillates while the pink line is relatively constant. The oscillation derives from the management response, reflected in the green line: there is an initial delay in responding to an increased number of issues; resolution rates are then greatly increased to address higher backlogs; and they are reduced again (due to budgetary pressures and other priorities) when backlogs start to fall, precipitating another cycle of increasing issues.
Compare the oscillatory response above to the next figure where an increase in issues results immediately in higher resolution rates that are maintained over a period sufficient to return the system to a lower level of backlogs. In parallel, budgets are increased to address the underlying causes of issues, driving down the occurrence rate of new issues and ultimately bringing backlog down to a long-term sustainable level.
The last figure shows some of the ramifications of system management on safety culture and employee trust. The significant increase in issues backlog initially leads to a degradation of employee trust (the pink line) and an erosion in safety culture (blue line). However, the nature and effectiveness of the management response in bringing down backlogs and reducing new issues reverses the trust trend line and rebuilds safety culture over time. Note that the red line, representing plant performance, is relatively unchanged over the same period, indicating that performance issues may exist under the cover of a consistently operating plant.
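The reactive pattern in these figures can be reproduced with a deliberately crude policy model (all numbers are invented for illustration and are not NuclearSafetySim output): issues arrive at a constant rate, and a "reactive" manager only surges the resolution rate after the backlog crosses a high threshold, cutting it again under budget pressure once the backlog falls.

```python
def run_backlog(policy, steps=120, arrivals=12.0):
    """Toy backlog model. 'reactive': resolution effort responds only to
    threshold crossings; 'proactive': effort exceeds arrivals immediately."""
    backlog, rate = 100.0, 10.0
    trace = []
    for _ in range(steps):
        if policy == "reactive":
            if backlog > 150:
                rate = 20.0        # surge once the backlog is visibly high
            elif backlog < 80:
                rate = 5.0         # budget pressure: cut resolution effort
        else:
            rate = arrivals + 2.0  # proactive: steadily work the backlog down
        backlog = max(0.0, backlog + arrivals - rate)
        trace.append(backlog)
    return trace

reactive = run_backlog("reactive")    # oscillates between surges and relapses
proactive = run_backlog("proactive")  # settles to a low, sustainable backlog
```

The reactive oscillation never stops because each surge removes the symptom (the backlog) without touching the arrival rate; the proactive run mirrors the second figure, where a parallel budget step aimed at underlying causes would also bring arrivals themselves down.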
Monday, August 3, 2009
Reading List: Just Culture by Sidney Dekker
Question for nuclear professionals: Does your organization maintain a library of resources such as Just Culture or Diane Vaughan’s book, The Challenger Launch Decision, that provide deep insights into organizational performance and culture? Are materials like this routinely the subject of discussions in training sessions and topical meetings?
Thursday, July 30, 2009
“Reliability is a Dynamic Non-Event” (MIT #5)
What does this imply about the nuclear industry? Certainly we are in a period where the reliability of the plants is at a very high level and the NRC ROP indicator board is very green. Is this positive for maintaining high safety culture levels or does it represent a potential threat? It could be the latter since the biggest problem in addressing the safety implications of complacency in an organization is, well, complacency.