Thursday, November 12, 2009

What is Italian for Complacency?

It is “compiacimento”.  Why would I be providing this bit of knowledge?  The reason is a recent speech by Commissioner Dale Klein of the NRC to a conference in Rome, Italy.  What I found interesting was that once again an NRC Commissioner was sounding the warning about complacency as a latent flaw that can undermine nuclear safety.  We have written a number of posts on this blog on the subject and continue to emphasize it as otherwise . . . we would be complacent.


One of the direct quotes from the speech is, “Complacency is the primary enemy of an effective regulatory program.”  Klein goes on to recount how both the NRC and the industry had grown complacent prior to the TMI accident.  As he put it, “success breeds complacency”.


The complacency issue is the takeoff point for Klein to link complacency with safety culture.  His point is that a healthy safety culture, one that questions and challenges, is an antidote to complacency.  We agree to a point.  But a question that we have asked and try to address in our safety management models is: what if complacency in fact erodes safety culture?  Has that not been observed in more recent safety failures such as the space shuttle accidents?  To us that is the insidious nature of complacency - it can undermine the very process designed to avoid complacency in the first place.  From a systems perspective it is particularly interesting (or troubling) because as complacency erodes safety culture, results are challenged less and become more acceptable, further reinforcing the sense that everything is OK and leading to more complacency.  This is what system dynamics calls a positive reinforcing loop.  Positive reinforcing loops can change system performance very rapidly, meaning an organization can go from success to failure faster than other mechanisms (e.g., safety culture surveys) may be able to detect.
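To make the loop concrete, here is a minimal numerical sketch of the dynamic described above.  It is our own illustration, not anything from Klein's speech or an NRC model, and every parameter is an assumption chosen only to show the shape of the behavior: complacency grows fastest when culture is weak, culture erodes in proportion to complacency, and the decline accelerates.

```python
# Minimal sketch of a positive (reinforcing) feedback loop between
# complacency and safety culture.  All parameters are illustrative
# assumptions, not calibrated values.

DT = 0.1          # integration time step (months)
STEPS = 600       # simulate 60 months

complacency = 0.1     # fraction of results accepted without challenge
safety_culture = 1.0  # index, 1.0 = healthy

for step in range(STEPS):
    # A weaker culture challenges fewer results, which feeds complacency...
    complacency_growth = 0.05 * complacency * (1.0 - 0.8 * safety_culture)
    # ...and complacency in turn erodes the culture itself.
    culture_erosion = 0.04 * complacency

    complacency = min(1.0, complacency + complacency_growth * DT)
    safety_culture = max(0.0, safety_culture - culture_erosion * DT)

    if step % 120 == 0:  # print once a simulated year
        print(f"month {step * DT:5.1f}: complacency={complacency:.2f}, "
              f"culture={safety_culture:.2f}")
```

The first few simulated years look benign; the slide steepens later, which is exactly why infrequent snapshots such as periodic surveys can miss a reinforcing loop until it is well advanced.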

Link to speech.

Tuesday, October 13, 2009

NRC Safety Culture Initiatives

The link below is to a September 29, 2009 speech by Chairman Jaczko of the Nuclear Regulatory Commission, outlining the NRC’s current initiatives regarding safety culture and safety conscious work environment.  There is no big news in the speech; mostly it is notable for the continuing focus on safety culture issues at the highest level of the agency.

Perhaps of some significance is that almost all of Jaczko’s comments concern initiatives by the NRC on safety culture.  That is not surprising in one sense, since it is a logical focus for the NRC Chairman.  However, I thought the absence of industry-wide actions, perhaps covering all plants, could be perceived as a weakness.  Jaczko mentions that “We have seen an increasing number of licensees conducting periodic safety culture self-assessments…”, but that may only highlight that each nuclear plant is going its own way.  True?  If so, will that encourage the NRC to define additional regulatory policies to bring greater uniformity?

Link to speech.

Tuesday, October 6, 2009

Social Licking?

The linked file contains a book review with some interesting social science that could be of great relevance to building and sustaining safety cultures.  But I couldn’t resist the best quote of the review, commenting on some of the unusual findings in recent studies of social networks.  To wit,

“In fact, the model that best predicted the network structure of U.S. senators was that of social licking among cows.”

Back on topic, the book is Connected by Nicholas Christakis and James Fowler, addressing the surprising power of social networks and how they shape our lives.  The authors may be best known for a study published several years ago suggesting that obesity could be contagious.  The finding is based on observations of networked relationships – friends and friends of friends – that can lead individuals to model their behaviors on those to whom they are connected.

“What is the mechanism whereby your friend’s friend’s obesity is likely to make you fatter? Partly, it’s a kind of peer pressure, or norming, effect, in which certain behaviors, or the social acceptance of certain behaviors, get transmitted across a network of acquaintances.”  Sounds an awful lot like how we think of safety culture being spread across an organization.  For those of you who have been reading this blog, you may recall that we are fans of Diane Vaughan’s book The Challenger Launch Decision, where a mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  An organization's standards decay and no one even notices.


The book review goes on to note, “Mathematical models of flocks of birds, or colonies of ants, or schools of fish reveal that while there is no central controlling director telling the birds to fly one direction or another, a collective intelligence somehow emerges, so that all the birds fly in the same direction at the same time.  Christakis and Fowler argue that through network science we are discovering the same principle at work in humans — as individuals, we are part of a superorganism, a hivelike network that shapes our decisions.”  I guess the key is to ensure that the hive takes the workers in the right direction.

Question:  Does the observation that “there is no central controlling director” setting the right direction have implications for nuclear safety management?  Is leadership the key, or the development of a collective intelligence?

 
Link to review.

Thursday, September 24, 2009

“Culture isn’t just one aspect of the game. It is the game.”

The quote is from Lou Gerstner, retired Chairman of IBM, and appears in an interesting presentation by the management team at Wolf Creek Nuclear Operating Company.  In it they put forth their perspectives on addressing culture change within their organization.  There are many good points in the presentation, and I would like to highlight several of them.

First, the issue of trust is addressed on several slides.  For example, on the Engaged Employees slide (p. 24) it is noted that training in building trust had been initiated and would be ongoing.  A later slide, Effective Leadership Team (p. 31), notes that there was increased trust at the station.  In our thinking about safety management, and specifically in our simulation modeling, we include trust as a key variable and driver of safety culture.  Trust is a subjective ingredient but its importance is real.  We think there are at least two mechanisms for building trust within an organization.  One is through the type of initiatives described in the slides – direct attention and training in creating trust within the management team and staff.  A second mechanism that perhaps does not receive as much recognition is the indirect impact of decisions and actions taken by the organization and the extent to which they model desired safety values.  This second mechanism is very powerful because it reflects reality.  If reality comports with the espoused values, it reinforces the values and builds trust.  If reality runs contrary to the values, it will undermine any amount of training or pronouncements about trust.

The second point to be highlighted is addressed on the Culture slide in the Epilogue section (p. 35).  There it is noted that as an industry we are good at defining the desired behaviors, but we are not good at defining how to achieve a culture where most people practice those behaviors.  We think there is a lot of truth in this, and the “how” of building and maintaining a robust safety culture is something that merits more attention.  “Practicing” those behaviors is the subject of our white paper, “Practicing Nuclear Safety Management.”

Link to presentation.

Friday, September 18, 2009

Air France

A recent Wall Street Journal article indicates that Air France is taking an unusual step in asking its partner airline, Delta, to help assess its safety practices.  The Air France CEO stated, “Safety is a dynamic thing, the risk is to say, ‘We’ve done our work, so let’s stop.’”  This action emanates from the June crash of an Air France flight en route from Brazil to Paris.

On the one hand, it is unbelievably sad that it takes an accident to initiate action to delve more deeply into safety issues.  But we find optimism in the CEO's recognition of the dynamic nature of safety.  We wholeheartedly agree.

Link to article.

Thursday, September 17, 2009

Quote of the Day

I came across the following in a discussion forum related to Davis Besse issues.

“So it appears that man is capable of controlling the climate, but not the atom.  God is laughing.”

While not exactly on point for SafetyMatters, it was irresistible.

Wednesday, September 16, 2009

The Davis Besse Hole

Most people will read the title of this post and think of the corrosive hole in the Davis Besse reactor vessel head discovered in 2002.  But the post actually refers to the seven-year hole in regulatory space into which the plant and its organization fell as a result of the reactor vessel head incident.  Several items in our safety culture news website box cite the NRC announcement this past week that the plant was returning to normal regulatory status.

Since 2002 Davis Besse has become synonymous with the issue of safety culture in the nuclear industry.  As with most safety and regulatory issues, there are fundamentally important reasons to comply with the NRC’s criteria and requirements.  But the potential regulatory consequences of not meeting those criteria also merit consideration.  Two years of shutdown, five years of escalated NRC oversight, civil penalties, prosecutions of individuals . . . Davis Besse was the TMI of nuclear safety culture.

Saturday, September 12, 2009

A LearnSafe Afterthought

The line of thinking in the Wahlström and Rollenhagen paper and the LearnSafe project appears to provide a strong nudge away from thinking of safety culture in terms of a set of beliefs and values – or from thinking of safety culture as something apart from how the multiple, complex decision processes within an organization actually occur.

One could also ask, as did Wahlström and Rollenhagen, whether the present interpretations of safety culture are rich enough to provide requisite variety; i.e., does the concept have the same order of complexity as the plant organization that it is supposed to control? [p. 8]

One tool for representing the many factors at work in a given environment is an influence diagram.  As Wahlström and Rollenhagen note, “Influence diagrams are often used as the next step in a model building exercise to track dependencies between issues. It is relatively easy for people to identify up-stream causes and down-stream consequences of some specific issue. It is far more difficult to merge these influences to a comprehensive model of some interesting phenomenon, because there are usually very many influences to be traced. Sometimes the influences form loops, which in practice may render the influence diagram more difficult to use for making predictions of how some issue may influence another. When the influences are linear, models are relatively easy to build and validate, but many systems include influences with threshold and saturation effects.” [p. 4, emphasis added]  Multiple variables, loops, and threshold and saturation effects are all important constructs in the system dynamics world view.
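To see why threshold and saturation effects defeat simple prediction, consider a hypothetical influence of issue backlog on employee trust, sketched below.  The function shape and every number are our own assumptions for illustration; nothing here is taken from the paper.

```python
def backlog_influence_on_trust(backlog: float,
                               threshold: float = 50.0,
                               saturation: float = 200.0) -> float:
    """Illustrative non-linear influence: returns a factor in [0, 1].

    Below `threshold` the backlog has no effect on trust; between
    `threshold` and `saturation` the effect grows linearly; above
    `saturation` it is capped (more backlog changes nothing).
    """
    if backlog <= threshold:
        return 0.0
    if backlog >= saturation:
        return 1.0
    return (backlog - threshold) / (saturation - threshold)

# A linear model would predict that doubling the backlog doubles the
# effect; with a threshold and saturation, that holds only mid-band.
for b in (25, 75, 150, 300, 600):
    print(f"backlog={b:4d} -> trust erosion factor "
          f"{backlog_influence_on_trust(b):.2f}")
```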

Link to paper.

Thursday, September 10, 2009

Schrodinger’s Bat

This post follows on the issue of whether safety culture is a concept unto itself or a state that is defined by many constituent actions.  Some of our own thinking about safety culture in developing the nuclearsafetysim website and tools led us to prefer a focus on safety management as opposed to safety culture.  Safety management includes the key “levers” of organizational performance (e.g., resource allocation, problem identification and resolution, building of trust, etc.), and the integrated effect of manipulating these levers results in a safety culture “value” in the simulation.  Thus all the dynamics flow from actions and decisions to a safety culture resultant, not the reverse.
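A rough sketch of that structure follows.  It is not the actual NuclearSafetySim implementation (which is built in iThink); it is a hedged illustration of the idea that lever settings combine into a score and the safety culture “value” follows that score with a lag.  The lever names, weights, and smoothing constant are all invented for the example.

```python
# Sketch: safety culture as a lagged resultant of management "levers,"
# not a driver of them.  Weights and time constant are invented.

LEVER_WEIGHTS = {
    "resource_allocation": 0.35,
    "problem_resolution": 0.40,
    "trust_building": 0.25,
}

def lever_score(levers: dict) -> float:
    """Weighted average of lever states, each scaled 0..1."""
    return sum(LEVER_WEIGHTS[name] * value for name, value in levers.items())

def update_culture(culture: float, levers: dict,
                   smoothing: float = 0.1) -> float:
    """Culture drifts toward the current lever score with a lag."""
    return culture + smoothing * (lever_score(levers) - culture)

culture = 0.80  # starting safety culture index
levers = {"resource_allocation": 0.9, "problem_resolution": 0.5,
          "trust_building": 0.7}
for month in range(12):
    culture = update_culture(culture, levers)
print(f"culture after 12 months: {culture:.2f}")
```

Here the arrow of causality runs only one way – decisions move the levers, and culture is the smoothed consequence – which is the modeling choice the paragraph above describes.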

Dare I put forth a sports analogy?  In baseball there is a defined “strike zone”.  In theory the umpire uses the strike zone to call balls and strikes.  But the zone is really open to interpretation in the dynamic, three-dimensional world of pitching and umpiring.  The reality is that the strike zone becomes the space delineated by the aggregate set of ball and strike calls made by an umpire.  It relies on the skill of the umpire, his understanding of the strike zone, and his commitment to making accurate calls.  The linked article provides some interesting data on the strike zone and the psychology of umpires' decisions.

Link to "Schrodinger’s Bat" July 26, 2007
.

Tuesday, September 8, 2009

Is Safety Culture the Grand Unifying Concept?

I thought I would use this question as an entrée back into some of Professor Bernhard Wilpert’s work with what became known as the LearnSafe project.  The LearnSafe website is worth visiting for insights into this issue and a number of others.

Two of the principal contributors to LearnSafe, Björn Wahlström and Carl Rollenhagen, published some of their interpretations of the study results in a 2004 paper, link below.  In the paper they state:

“The data collected in the LearnSafe project provides interesting views on some of the major issues connected to the concept of safety culture. A suggestion generated from the data is that attempts to define and measure safety culture may be counterproductive and a more fruitful approach may be to use the concept to stimulate discussions on how safety is constructed.” [p. 2]

The contribution of the LearnSafe project comes from the empirical data developed in surveys of, and discussions with, over 300 nuclear managers.  It was found that the term safety culture was not frequently mentioned as a challenge for managing nuclear plants.  Instead, factors that are commonly understood to be part of safety culture were mentioned much more frequently.  Wahlström and Rollenhagen observe, “This would suggest the interpretation is that safety culture is not a concept for itself, but it is instead ingrained in various aspects of the management activities.” [p. 6]

This observation leads to the question of whether it is useful to put forward safety culture as a top level concept that somehow is responsible for or “produces” safety, or whether it would be better to think of it as an organic process that continuously evolves and develops within an organization.  This perspective would say that safety culture is more the product of the myriad decisions and interactions that occur within an organization than some set of intrinsic values that determines those decisions.

Link to paper.

Thursday, September 3, 2009

FAA Moves Away from Blame and Punishment

The Federal Aviation Administration (FAA) took another step toward a new safety culture by reducing the emphasis on blame in the reporting of operational errors by air traffic controllers.  “We’re moving away from a culture of blame and punishment,” said FAA Administrator Randy Babbitt. “It’s important to note that controllers remain accountable for their actions, but we’re moving toward a new era that focuses on why these events occur and what can be done to prevent them.” 
 
Effective immediately, the names of controllers will not be included in reports sent to FAA headquarters on operational errors…. Removing names on the official report will allow investigators to focus on what happened rather than who was at fault.

Link to FAA press release.

Wednesday, September 2, 2009

The Complacency Thing Again

Commissioner Klein’s recent address to the ANS once again hits on the complacency issue.  Read his remarks at the link below.


Link to speech.

Tuesday, September 1, 2009

EdF Faces Conflicting Pressures

As described in the linked article, workers at Electricite de France are raising concerns about conflicting pressures to work faster, achieve higher capacity factors and provide competitive electricity.  EdF has long held a very high reputation for its nuclear operations, attributed in part to the national government’s central ownership and operating responsibilities.  While the extent of such concerns remains to be seen, it is apparent that central ownership does not provide a shield against many of the same pressures experienced by U.S. plants.  The article also highlights the potential complications of heavy reliance on subcontractors if it leads to the loss of core competencies in the host organization.


Link to article.

Friday, August 28, 2009

Bernhard Wilpert

As mentioned in a prior post we will be highlighting some of the work of the late Bernhard Wilpert, a leading figure in research on the role of human behavior in high reliability organizations. 


Professor Wilpert emphasized the interaction of human, technological, and organizational dynamics.  His tools for human factors event analysis have become standard practice in German and Swiss nuclear plants.  He was the author or editor of several leading books, including Safety Culture in Nuclear Power Operations; System Safety: Challenges and Pitfalls of Intervention; Emerging Demands for the Safety of Nuclear Power Operations: Challenge and Response; and Nuclear Safety: A Human Factors Perspective.

Professor Wilpert was also a principal contributor to the LearnSafe project conducted in Europe from 2001 to 2004.  See the following link for information about the project team and its results, and look for future posts here on the LearnSafe research.

Link to LearnSafe project.

Wednesday, August 26, 2009

Can Assessments Identify Complacency? Can Assessments Breed Complacency?

To delve a little deeper into this question, Slide 10 of the NEI presentation shows a typical summary graphic of assessment results.  The chart catalogs the responses of members of the organization against the eight INPO principles of safety culture.  The summary indicates a variety of responses to the individual principles – for 3 or 4 of the principles there seems to be a fairly strong consensus that the right things are happening.  But 5 of the 8 principles show negative response scores greater than 20, and 2 of the principles show negative scores greater than 40.

First, what can or should one conclude about the overall state of safety culture in this organization given these results?  One wonders, if these results were shown to a number of experts, whether their interpretations would be consistent, or whether they would even purport to associate the results with a finding.  As discussed in a prior post, this issue is fundamental to the nature of safety culture: whether it is amenable to direct measurement, and whether assessment results really say anything about the safety health of the organization.

But the more particular question for this post is whether an assessment can detect complacency in an organization and its potential for latent risk to the organization’s safety performance.  In a post dated July 30, 2009 I referred to the problems presented by complacency, particularly in organizations experiencing few operational challenges.  That environment can be ripe for a weak culture to develop or be sustained.  Could that environment also bias the responses to assessment questions, reinforcing the incorrect perception that safety culture is healthy?  This type of situation may be of most relevance in today’s nuclear industry, where the vast majority of plants are operating at high capacity factors and experiencing few significant operational events.  It is not clear to this commentator that assessments can be designed to explicitly detect complacency; even the use of assessment results in conjunction with other data (data likely to look normal when overall performance is good) may not credibly raise an alarm.

Link to NEI presentation.

Monday, August 24, 2009

Assessment Results – A Rose is a Rose

The famous words of Gertrude Stein are most often associated with the notion that when all is said and done, a thing is what it is.  We offer this idea as we continue to look at the meaning of safety culture assessment results – are the results just the results, or do they signify some meaning or interpretation beyond the results?

To illustrate some of the issues I will use an NEI presentation made to the NRC on February 3, 2009.  On Slide 2 there is a statement that the USA methodology (for safety culture surveys and assessments) has been used successfully for five years.  One question is: what does it mean that an assessment was successful?  The intent is not to pick on this particular methodology but to open the question of exactly what the expected result of performing an assessment is.

It may be that “successful” means that the organizations being assessed have found the process and results to be useful or interesting, e.g., by stimulating discussion or furthering exploration of issues associated with the results.  There are many, myself included, who believe anything that stimulates an organization to discuss and contemplate safety management issues is beneficial.  On the other hand it may be that organizations (and regulators??) believe assessments are successful because they can use the results to make a determination that a safety culture is “acceptable” or “strong” or “needs improvement”.  Can assessments really carry the weight of this expectation?  Or is a rose just a rose?

Slide 11 highlights these questions by indicating that a validation of the assessment methodology is to be carried out.  “Validation” seems to suggest that assessments mean something beyond their immediate results.  It may also suggest that assessment results can be compared to some “known” value to determine whether the assessment accurately measured or predicted that value.  We will have to wait and see what is intended and how the validation is performed.  At the same time we will keep in mind the observation of Professor Wilpert in my post of August 17, 2009 that “culture is not a quantifiable phenomenon”.

Link to presentation.

Monday, August 17, 2009

Safety Culture Assessment

A topic that we will visit regularly is the use of safety culture assessments to assign quantitative values to the condition of a specific organization and even the individual departments and working groups within the organization.  One reason for this focus is the emphasis on safety culture assessments as a response to situations where organizational performance does not meet expectations and “culture” is believed to be a factor.  Both the NRC and the nuclear industry appear aligned on the use of assessments as a response to performance issues and even as an ongoing prophylactic tool.  But, are these assessments useful?  Or accurate?  Do they provide insights into the origins of cultural deficiencies?

One question that frequently comes to mind is, can safety culture be separated from the manifestation of culture in terms of the specific actions and decisions taken by an organization?  For example, if an organization makes some decisions that are clearly at odds with “safety being the overriding priority”, can the culture of the organization not be deficient?  But if an assessment of the culture is performed, and the espoused beliefs and priorities are generally supportive of safety, what is to be made of those responses? 

The reference material for this post comes from some work led by the late Bernhard Wilpert of the Berlin University of Technology.  (We will sample a variety of his work in the safety management area in future posts.)  It is a brief slide presentation titled “Challenges and Opportunities of Assessing Safety Culture”.  Slide 3, for example, revisits E. H. Schein’s multi-dimensional formulation of safety culture, which suggests that assessments must be able to expose all levels of culture and their integrated effect.

Two observations from these slides seem of particular note.  They are both under Item 4, Methodological Challenges.  The first observation is that culture is not a quantifiable phenomenon and does not lend itself easily to benchmarking.  This bears consideration, as most assessment methods in use today employ statistical comparisons to assessments at other plants, including percentile-type rankings.  The other observation in the slide is that culture results from the learning experience of its members.  This is of particular interest to us as it supports some of the thinking associated with a system dynamics approach.  A systems view involves the development of shared “mental models” of how safety management “works”; the goal is that individual actions and decisions can be understood within a commonly understood framework.  The systems process becomes, in essence, the mechanism for translating beliefs into actions.


Link to slide presentation.

Thursday, August 13, 2009

Primer on System Dynamics

System Dynamics is a way of seeing the world in terms of inputs and outputs, in which internal feedback loops and time delays affect system behavior and can lead to complex, non-linear changes in system performance.

The System Dynamics worldview was originally developed by Prof. Jay Forrester at MIT. Later work by other thinkers, e.g., Peter Senge, author of The Fifth Discipline, expanded the original concepts and made them available to a broader audience. An overview of System Dynamics can be found on Wikipedia.

Our NuclearSafetySim program uses System Dynamics to model managerial behavior in an environment where maintaining the nuclear safety culture is a critical element. NuclearSafetySim is built using isee Systems iThink software. isee Systems has educational materials available on their website that explain some basic concepts.

There are other vendors in the System Dynamics software space, including Ventana Systems and their Vensim program. They also provide some reference materials, available here.
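For readers who prefer code to diagrams, below is roughly the smallest System Dynamics model one can write: a single stock, a balancing feedback loop, and a time delay.  The scenario and numbers are arbitrary assumptions of ours; the point is that the delayed perception alone produces overshoot and oscillation, the kind of non-linear behavior described above.

```python
# The smallest useful System Dynamics model: a stock managed by a
# balancing feedback loop whose input is a *delayed* perception of
# the stock.  The delay alone causes overshoot and oscillation.

DT = 0.25             # integration time step
TARGET = 100.0        # desired stock level
PERCEPTION_LAG = 8.0  # time before management "sees" the true stock

stock = 0.0
perceived = 0.0

for step in range(400):
    # First-order delay: perception drifts toward the true stock.
    perceived += (stock - perceived) / PERCEPTION_LAG * DT
    # Balancing loop: inflow proportional to the *perceived* gap.
    inflow = 0.5 * (TARGET - perceived)
    stock += inflow * DT
    if step % 40 == 0:
        print(f"t={step * DT:6.1f}  stock={stock:7.2f}  "
              f"perceived={perceived:7.2f}")
```

Set PERCEPTION_LAG close to zero and the oscillation disappears – a one-line experiment showing that delays, not just loop structure, drive system behavior.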

Thursday, August 6, 2009

Signs of a Reactive Organization (MIT #6)

One of the most important insights to be gained from a systems perspective on safety management concerns the effectiveness of various responses to changes in system conditions.  Recall that in our post #3 on the MIT paper, we talked about single versus double loop learning.  Single loop learning covers short term, local responses to perceived problems, while double loop learning refers to understanding the underlying reasons for the problems and finding long term solutions.  As you might guess, single loop responses tend to be reactive.  “An oscillating incident rate is the hallmark of a reactive organization, where successive crises lead to short term fixes that persist only until the next crisis.” [pg 22]  We can use our NuclearSafetySim model to illustrate differing approaches to managing problems.

The figure below illustrates how the number of problems/issues (we use the generic term "challenges" in NuclearSafetySim) might vary with time when the response is reactive.  The blue line indicates the total number of issues, the pink line the number of new issues being identified, and the green line the resolution rate for issues, e.g., through a corrective action program.  Note that the blue line initially increases and then oscillates while the pink line is relatively constant.  The oscillation derives from the management response, reflected in the green line: there is an initial delay in responding to an increased number of issues, then resolution rates are greatly increased to address higher backlogs, then reduced (due to budgetary pressures and other priorities) when backlogs start to fall, precipitating another cycle of increasing issues.
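For those who want to experiment, the sketch below reproduces the oscillating pattern numerically.  It is a drastic simplification of the NuclearSafetySim model, with invented parameters: new issues arrive at a constant rate, while resolution capacity chases the backlog only after a delay and is cut back once the backlog falls.

```python
# Sketch of the "reactive organization" pattern: a constant stream of
# new issues, and resolution capacity that chases the backlog with a
# delay and is cut back once the crisis passes.  Parameters invented.

DT = 0.5
NEW_ISSUES_RATE = 10.0   # issues arriving per week (the pink line)
COMFORT_BACKLOG = 100.0  # backlog level management tolerates

backlog = 150.0          # open issues (the blue line), starting high
capacity = 10.0          # resolution rate (the green line)

for step in range(200):
    # Reactive policy: capacity is adjusted toward the perceived need,
    # but only slowly (budget cycles), and falls once the backlog drops.
    desired = NEW_ISSUES_RATE * (backlog / COMFORT_BACKLOG)
    capacity += (desired - capacity) / 12.0 * DT

    resolved = min(capacity, backlog / DT)  # cannot resolve more than exist
    backlog += (NEW_ISSUES_RATE - resolved) * DT

    if step % 20 == 0:
        print(f"week {step * DT:5.1f}: backlog={backlog:6.1f}  "
              f"resolution rate={capacity:5.1f}")
```

Because capacity responds to the backlog rather than to the arrival rate of new issues, the backlog repeatedly overshoots and undershoots its equilibrium – the oscillation the figure describes – rather than settling quickly.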




Compare the oscillatory response above to the next figure where an increase in issues results immediately in higher resolution rates that are maintained over a period sufficient to return the system to a lower level of backlogs.  In parallel, budgets are increased to address the underlying causes of issues, driving down the occurrence rate of new issues and ultimately bringing backlog down to a long-term sustainable level.



The last figure shows some of the ramifications of system management for safety culture and employee trust.  The significant increase in the issues backlog initially leads to a degradation of employee trust (the pink line) and an erosion of safety culture (blue line).  However, the nature and effectiveness of the management response in bringing down backlogs and reducing new issues reverses the trust trend line and rebuilds safety culture over time.  Note that the red line, representing plant performance, is relatively unchanged over the same period, indicating that performance issues may exist under the cover of a consistently operating plant.

Tuesday, August 4, 2009

The Economist on Computer Simulation

The Economist has occasional articles on the practical applications of computer simulation. Following are a couple of items that have appeared in the last year.

Agent-based simulation is used to model the behavior of crowds. "Agent-based" means that each individual has some capacity to ascertain what is going on in the environment and act accordingly. This approach is being used to simulate the movement of people in a railroad station or during a building fire. On a much larger scale, each of the computer-generated orcs in the "Lord of the Rings" battle scenes moved independently based on its immediate surroundings.
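The flavor of the agent-based approach is easy to convey in a few lines of code.  The toy below is our own sketch, unrelated to any commercial crowd simulator: each agent heads for an exit using only its own position and the positions of nearby agents, and an orderly evacuation pattern emerges without any central controller.

```python
# Toy agent-based simulation: agents walk toward an exit but step
# aside when another agent is too close.  Each decision uses only
# local information; no central controller choreographs the flow.
import random

EXIT = (50.0, 0.0)
REPULSION_RADIUS = 2.0

def step_agent(me, others):
    """Move one unit toward the exit, nudged away from close neighbors."""
    dx, dy = EXIT[0] - me[0], EXIT[1] - me[1]
    dist = max((dx * dx + dy * dy) ** 0.5, 1e-9)
    vx, vy = dx / dist, dy / dist              # unit vector toward exit
    for ox, oy in others:
        sep = ((me[0] - ox) ** 2 + (me[1] - oy) ** 2) ** 0.5
        if 0 < sep < REPULSION_RADIUS:         # too close: push away
            vx += (me[0] - ox) / sep
            vy += (me[1] - oy) / sep
    norm = max((vx * vx + vy * vy) ** 0.5, 1e-9)
    return me[0] + vx / norm, me[1] + vy / norm

agents = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(30)]
for _ in range(40):
    agents = [step_agent(a, [o for o in agents if o is not a]) for a in agents]

mean_dist = sum(((x - EXIT[0]) ** 2 + (y - EXIT[1]) ** 2) ** 0.5
                for x, y in agents) / len(agents)
print(f"mean distance to exit after 40 steps: {mean_dist:.1f}")
```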

Link to article.

The second article is a brief review of simulation's use in business applications, including large-scale systems (e.g., an airline), financial planning, forecasting, process mapping and Monte Carlo analysis. This is a quick read on the ways simulation is used to illustrate and analyze a variety of complex situations.

Link to article.

Other informational resources that discuss simulation are included on our References page.

Monday, August 3, 2009

Reading List: Just Culture by Sidney Dekker

Thought I would share with you a relatively recent addition to the safety management system bookshelf, Just Culture by Sidney Dekker, Professor of Human Factors and System Safety at Lund University in Sweden.  In Dekker’s view a “just culture” is critical to the creation of safety culture.  A just culture will not simply assign blame in response to a failure or problem; it will seek to use accountability as a means to understand the system-based contributors to failure and resolve them in a manner that avoids recurrence.  One of the reasons we believe so strongly in safety simulation is its emphasis on system-based understanding, including a shared organizational mental model of how safety management happens.  One reviewer (D. Sillars) of this book on the amazon.com website summarizes, “’Just culture’ is an abstract phrase, which in practice, means . . . getting to an account of failure that can both satisfy demands for accountability while contributing to learning and improvement.”


Question for nuclear professionals:  Does your organization maintain a library of resources such as Just Culture or Diane Vaughan’s book, The Challenger Launch Decision, that provide deep insights into organizational performance and culture?  Are materials like this routinely the subject of discussions in training sessions and topical meetings?

Thursday, July 30, 2009

“Reliability is a Dynamic Non-Event” (MIT #5)

What is this all about?  Reliability is a dynamic non-event [MIT paper pg 5].  It is about complacency.  Paradoxically, when incident rates are low for an extended period and management does not maintain a high priority on safety, the organization may slip into complacency as individuals shift their attention to other priorities such as production pressures.  The MIT authors note the parallel to the NASA space program, where incidents were rare notwithstanding a weak safety culture, and the organization rationalized its performance as “normal”.  (See Diane Vaughan’s book The Challenger Launch Decision for a compelling account of NASA’s organizational dynamics.)  In our paper “Practicing Nuclear Safety Management” we make a similar comparison.

What does this imply about the nuclear industry?  Certainly we are in a period where the reliability of the plants is at a very high level and the NRC ROP indicator board is very green.  Is this positive for maintaining high safety culture levels or does it represent a potential threat?  It could be the latter since the biggest problem in addressing the safety implications of complacency in an organization is, well, complacency.

Wednesday, July 29, 2009

Self Preservation (MIT #4)

The MIT paper [pg 7] introduces the concept of feedback loops, an essential ingredient of system dynamics and critical to understanding the dynamics of safety management.  The MIT authors suggest that there is a “weak balancing loop” associated with individuals responding to a perceived personal threat from increased incident rates.  While the authors acknowledge it is a weak feedback, I would add that, at best, it represents an idealized effect and is hard to differentiate from other feedback that individuals receive, such as management reaction to incidents and pressures associated with cost and plant performance.  The MIT paper [pg 8] goes on to address management actions and states, “When faced with an incident rate that is too high, the natural and most immediately effective response for managers is to focus the blame on individual compliance with rules.”  Note the conditional phrase “most immediately effective”; it is an example of the single loop learning described in one of my prior posts (MIT #3).  Certainly the fact that procedure adherence is a recurring issue at many nuclear plants suggests that the “blame game” has limited, short term effectiveness.

My sense is that the self-preservation effect is deeply embedded within the larger safety climate of the organization.  In that climate, how strictly is rule adherence observed?  Are procedures and processes of sufficient quality to encourage observance?  If procedures and processes are ambiguous or even incorrect, and left uncorrected, is there tacit approval of alternate methods?  The reality is that self-preservation can act in several directions – it may impel compliance, if that is truly the organizational ethic, or it may rationalize non-compliance, if that is the organizational expectation.  Life is difficult.

"Beaten to Death by Croutons"

In the Bookshelf column of the July 27, 2009 Wall Street Journal, there is a review of "Say Everything", a book about blogging.  The review includes the comment that "reading blogs is like being beaten to death by croutons".  We hope that readers of our blog do not experience such a fate.  The column goes on to note that the best blogs are those that are concise, current, and precisely targeted.  That is the goal for this blog, and we hope it is being met.

Single Loop, Double Loop – What Is This All About? (MIT #3)

One of the potential benefits of academic papers is the opportunity for a theoretical structure to be put forward to explain a set of observational experience.  The MIT paper [pg 4] provides such a theory regarding organizational learning and safety culture.  The authors cite the difference between “single loop” and “double loop” learning as vital to the way organizations respond to performance problems.  Single loop learning “represents the immediate and local actions that individuals and organizations take in response to a perceived problem.”  Double loop learning, on the other hand, “instead of focusing on enforcement…question[s] why rules were not originally followed…”.

The MIT authors contend that double loop offers the greatest potential benefit to safety, but can be a difficult challenge since “it threatens existing bureaucratic structures”.  And they add an insight that derives from their (and our) view of safety as a dynamic process: “the immediate success of single loop learning can undermine both the motivation and the perceived need to follow through on more substantial improvement efforts…”

How does the theory of single and double loop resonate with your experience?  Do you see single loop being the dominant response within your organization?

Monday, July 27, 2009

Worth Noting - NRC Chairman's Comments

The link below is to a recent interview with Gregory Jaczko, newly appointed Chairman of the Nuclear Regulatory Commission.  In the interview he indicates that he wants to reinforce the need to maintain a safety culture at the agency and in the nuclear industry.  Safety culture has been an ongoing theme in Chairman Jaczko's public statements since he came on the Commission four years ago, and it still seems to be on his mind.

Link to article.

Organizational Learning (MIT #2)

The central question posed in the MIT paper is: what are the contributors to an organization’s ability to learn and sustain a robust safety culture?  According to the authors, “Here, the focus shifts from prescribing elements of an effective safety culture to managers to an examination of why it is that organizations so often fail to learn…. instead of focusing on enforcement, individuals might question why the rules were not originally followed” [p. 4].  In our paper “Practicing Nuclear Safety Management” we ask the same question.  I have reviewed the presentation materials from the NRC’s safety culture meetings earlier this year.  There is almost total emphasis on actions such as safety culture surveys to assess the state of the organization, and various remedial measures to correct deviations from prescribed rules, but no real questioning of what causes personnel to disregard established safety expectations.

If any readers can provide examples, e.g., presentation materials or assessments, where nuclear organizations have attempted to answer the question “Why?”, please provide a comment below along with appropriate links to the references.  It would greatly help the discussion.

Friday, July 24, 2009

Safety Culture Insights from Simulation (MIT #1)

Starting with this post we are reviewing an interesting paper from the Sloan School of Management at MIT - “Preventing Accidents and Building A Culture of Safety: Insights from a Simulation Model.”  While the paper approaches organizational safety performance on a generic basis – it is not specific to nuclear facilities – it offers many useful insights that are highly applicable to nuclear organizations.  MIT as an institution is known for its work in system dynamics and simulation modeling.  These disciplines have been used to analyze a variety of safety and accident environments, including NASA and nuclear operations.  Also, as you may have noticed on NuclearSafetySim.com, our development of nuclear safety management simulation models is based on system dynamics.

Future posts will highlight several of the key insights from this paper and their applicability to issues of nuclear safety management.



Link to paper.

Thursday, July 23, 2009

Can Driving and Texting Coexist?

In the July 18, 2009 online edition of The New York Times, there is an interesting example of the use of a simulation game to illustrate the impact of texting on a driver’s ability to drive safely and react to changing road conditions. Upon completion of the game, the player is provided with a quantification of his driving performance with and without the distraction of receiving and sending text messages.

I thought this would be interesting to nuclear safety management practitioners for several reasons. First, it is another illustration of how simulation games can provide realistic experiences of situations people may have to manage in real life - without the risks associated with the real-life activity.

Second, this game demonstrates the impact of competing priorities (texting and driving) on the ability of the driver to maintain performance at a consistent level. In the nuclear operations world, safety management failures are often associated with the impact of competing priorities or pressures on the ability of personnel to perform reliably. The driving game suggests that there is always some diminution of performance due to the competing priority of texting. Is that true of nuclear safety management or is it possible, with sufficient training and practice, to manage competing priorities?

Link to article.

Foreign Nuclear Plant Problems Cast a Long Shadow

Recent news items refer to the Swedish power company Vattenfall and problems that have occurred at two of its plants: Ringhals in Sweden and Krümmel in Germany.  In both cases the underlying causes of the problems and/or the reactions to the events revealed safety culture issues.  These are just two recent examples of the ongoing prevalence of safety culture issues in the global nuclear industry.  Part of the larger picture is the impact on the debate in Germany about any continuing role for nuclear power, even for the existing plants.  The performance of Vattenfall has created political problems for German Chancellor Merkel and the other principals in Germany's nuclear industry.  This highlights the potential for a safety culture failure in one organization to cast a long shadow over the future of the industry.

The situation at Ringhals is discussed here.

There is a lengthy discussion of Krümmel on Spiegel Online.