Saturday, May 1, 2010

Why is Nuclear Different?

We saw a very interesting observation in a recent World Nuclear News item describing updates to the World Association of Nuclear Operators’ (WANO) structure. The WANO managing director said “Any CEO must ensure their own facilities are safe but also ensure every other facility is safe. [emphasis added] It's part of their commitment to investors to do everything they can to ensure absolute safety and the one CEO that doesn't believe in this concept will risk the investment of every other.” As WNN succinctly put it, “These company heads are hostages of one another when it comes to nuclear safety.”

I think it's true that nuclear operators are joined at the wallet, but why? In most industries, a problem at one competitor creates opportunities for others. Why is the nuclear industry so tightly coupled and at constant risk of contagion? Is it the mystery and associated fears, suspicion and, in some cases, local visibility that attends nuclear?


Coal mining and oil exploration exist in sharp contrast to nuclear. "Everyone knows" coal mining is dirty and dangerous, but bad things happen only to unfortunate folks in remote locations, with no wide-ranging effects. Oil exploration is somewhat more visible: people will be upset for a while over the recent blow-out in the Gulf of Mexico, offshore drilling will be put on a temporary hold, but things will eventually settle down. In the meantime, critics will use BP as a punching bag (again) but there will be no negative spillover to, say, Chevron.

Wednesday, April 28, 2010

Safety Culture: Cause or Context (part 2)

In an earlier post, we discussed how “mental models” of safety culture affect perceptions about how safety culture interacts with other organizational factors and what interventions can be taken if safety culture issues arise. We also described two mental models, the Causal Attitude and Engineered Organization. This post describes a different mental model, one that puts greater emphasis on safety culture as a context for organizational action.

Safety Culture as Emergent and Indeterminate

If the High Reliability Organization model is basically optimistic, the Emergent and Indeterminate model is more skeptical, even pessimistic as some authors believe that accidents are unavoidable in complex, closely linked systems. In this view, “the consequences of safety culture cannot be engineered and only probabilistically predicted.” Further, “safety is understood as an elusive, inspirational asymptote, and more often only one of a number of competing organizational objectives.” (p. 356)* Safety culture is not a cause of action, but provides the context in which action occurs. Efforts to exhaustively model (and thus eventually manage) the organization are doomed to failure because the organization is constantly adapting and evolving.

This model sees that the same processes that produce the ordinary and routine stuff of everyday organizational life also produce the messages of impending problems. But the organization’s necessary cognitive processes tend to normalize and homogenize; the organization can’t very well be expected to treat every input as novel or not previously experienced. In addition, distributed work processes and official security policies can limit the information available to individuals. Troublesome information may be buried or discredited. And finally, “Dangers that are neither spectacular, sudden, nor disastrous, or that do not resonate with symbolic fears, can remain ignored and unattended, . . . . “ (p. 357)

We don’t believe safety significant events are inevitable in nuclear organizations but we do believe that the hubris of organizational designers can lead to specific problems, viz., the tendency to ignore data that does not comport with established categories. In our work, we promote a systems approach, based on system dynamics and probabilistic thinking, but we recognize that any mental or physical model of an actual, evolving organization is just that, a model. And the problem with models is that their representation of reality, their “fit,” can change with time. With ongoing attention and effort, the fit may become better but that is a goal, not a guaranteed outcome.

Lessons Learned

What are the takeaways from this review? First, mental models are important. They provide a framework for understanding the world and its information flows, a framework that the holder may believe to be objective but is actually quite subjective and creates biases that can cause the holder to ignore information that doesn’t fit into the model.

Second, the people who are involved in the safety culture discussion do not share a common mental model of safety culture. They form their models with different assumptions, e.g., some think safety culture is a force that can and does affect the vector of organizational behavior, while others believe it is a context that influences, but does not determine, organizational and individual decisions.

Third, safety culture cannot be extracted from its immediate circumstances and examined in isolation. Safety culture always exists in some larger situation, a world of competing goals and significant uncertainty with respect to key factors that determine the organization’s future.

Fourth, there is a risk of over-reliance on surveys to provide some kind of "truth" about an organization’s safety culture, especially if actual experience is judged or minimized to fit the survey results. Since there is already debate about what surveys measure (safety culture or safety climate?), we advise caution.

Finally, in addition to appropriate models and analyses, training, supervision and management, a vital component of the safety system is the individual who senses that something is just not right and is supported by an organization that allows, rather than vilifies, alternative interpretations of data.


* This post draws on Susan S. Silbey, "Taming Prometheus: Talk of Safety and Culture," Annual Review of Sociology, Volume 35, September 2009, pp. 341-369.

Monday, April 26, 2010

The Massey Mess

A postscript to our prior posts re the Massey coal mine explosion two weeks ago. The fallout of the safety issues at Massey mines is reaching a crescendo as even the President of the United States is quoted as stating that the accident was "a failure, first and foremost, of management."

It is clear that Massey will be under the spotlight for some time to come with Federal and state investigations being initiated. One wonders if the CEO will or should survive the scrutiny. For us the takeaway from this, and other examples such as Vermont Yankee, is a reminder not to underestimate the potential consequences of safety culture failures. They point directly at the safety management system including management itself. Once that door is opened, the ripple effects can range far downstream and throughout a business.

“Serious Systemic Safety Problem” at BP

Yes, it’s BP in the news again. In case you hadn’t noticed, the fire, explosion and loss of life last week was at an offshore drilling rig working for BP. In the words of an OSHA official, BP still has a “serious, systemic safety problem” across the company.

Thursday, April 22, 2010

Classroom in the Sky

The opening of much of the European airspace in the last several days has provided a rare opportunity to observe in real time some dynamics of safety decision making. On one hand are the airlines, which have been contending it is safe to resume flights; on the other, the regulatory authorities, which had been taking a more conservative stance. The question posed in our prior post was to what extent business pressures were influencing the airlines’ position, and to what extent that pressure, perhaps transmuting into political pressure, might influence regulators’ decisions. Perhaps most importantly, how could one tell?

We have extracted some interesting quotes from recent media reporting of the issue.

Civil aviation officials said their decision to reopen terminals where thousands of weary travelers had camped out was based on science, not on the undeniable pressure put on them by the airlines. . . . “The only priority that we consider is safety. We were trying to assess the safe operating levels for aircraft engines with ash,” said Eamonn Brennan, chief executive of the Irish Aviation Authority. “Pressure to restart flights had been intense.” Despite their protests, the timing of some reopenings seemed dictated by airlines’ commercial pressures.

"It's important to realize that we've never experienced in Europe something like this before....We needed the four days of test flights,the empirical data, to put this together and to understand the levels of ash that engines can absorb. Even as airports reopened, a debate swirled about the safety of flying without more extensive analysis of the risks, as it appeared that governments were operating without consistent international guidelines based on solid data. "What's missing is some sort of standard, based on science, that gives an indication of a safe level of volcanic ash..." “Some safety experts said pressure from hard-hit airlines and stranded passengers had prompted regulators to venture into uncharted territory with respect to the ash. In the past, the key was simply to avoid ash plumes.”

How can it be both ways - regulators did not respond to pressure, or regulators acted only on new analysis of data? The answer may lie in our old friend, the theory of normalization of deviance (ref. The Challenger Launch Decision). As we have discussed in prior posts, normalization is a process whereby an organization’s safety standards may be reinterpreted (to a lower or more accommodating level) over time due to complex interactions of cultural, organizational, regulatory and environmental factors. The fascinating aspects are that this process is not readily apparent to those involved, and that decisions then made in accordance with the reinterpreted standards are not viewed as deviant or inconsistent with, e.g., “safety is our highest priority.” Thus events that heretofore were viewed as discrepant no longer are, and are thus “normalized.” Aviation authorities believe their decisions are entirely safety-based. Yet to observers outside the organization it very much appears that the bar has been lowered, since what is considered safe today was not yesterday.

Monday, April 19, 2010

The View On The Ground

A brief follow-up to the prior post re situational factors that could be in play in reaching a decision about resuming airline flights in Europe.  Fox News has been polling to assess the views of the public and the results of the poll are provided in the adjacent box.  Note that the overwhelming sentiment is that safety should be the priority.  Also note the wording of the choices, where the “yes” option appears to imply that flights should be resumed based on the “other” priorities such as money and passenger pressure, while the “no” option is based on safety being the priority.  Obviously the wording makes the “yes” option appear to be one where safety may be sacrificed.

So the results are hardly surprising.  But what do the results really mean?  For one it reminds us of the importance of the wording of questions in a survey.  It also illustrates how easy it is to get a large positive response that “safety should be the priority”.  Would the responses have been different if the “yes” option made it clear that airlines believed it was safe to fly and had done test flights to verify?  Does the wording of the “no” option create a false impression that an “all clear” (presumably by regulators) would equate to absolute safety, or at least be arrived at without consideration of other factors such as the need to get air travel resumed? 

Note:  Regulatory authorities in Europe agreed late today to allow limited resumption of air travel starting Tuesday.  Is this an “all clear” or a more nuanced determination that it is safe enough?

The View from 30,000 Feet

The last few days’ news has been dominated by the ongoing eruption of the volcano in Iceland and its impact on air travel across Europe.  The safety issue is the potential for the ash cloud, at around 30,000 feet altitude, to seriously damage aircraft jet engines.  Thus the air safety regulators have closed the airspace for the last 4 days, creating huge backlogs of passengers and costing airlines $200 million per day.  So we have a firsthand study in the situational dynamics of safety culture.

For the first several days there appeared to be general consensus among the airlines, the safety regulators and politicians that closing the airspace was necessary and prudent.  On Sunday the airlines broke ranks and began openly lobbying for resumption of flights.  Several of the major airlines flew test flights to assess the performance of aircraft and now contend it is safe to fly. 

Regulators so far have insisted on keeping the airspace shut down, with resumption possible by Monday evening at the earliest. 

Are the airlines and the regulators simply reaching different conclusions based on the same information?  Or are airlines feeling the pressures of money and customers and the regulators are not?  Or are the regulators simply being more conservative?  Interestingly there does not appear to have been much overt political pressure to date.  Would you expect regulators to be more sensitive to this source of pressure?  Or immune from it?  How would you know?

I don’t know the answers to these questions but I do think it is unlikely that the situational parameters are not playing some role here.  In fact it seems hard to explain the different points of view without them.  But if it is true, does it necessarily mean that the airlines’ safety cultures are not robust - or is it also possible that the airlines have done exactly what safety culture demands - according safety its appropriate priority but still reaching a decision that flights can be resumed safely?  A robust safety culture does not demand insulation from situational factors, just that they not inappropriately skew the balancing of safety and other business needs.  How exactly one does that in a transparent manner is perhaps the most important indicator of safety culture.