
Wednesday, December 5, 2012

Drift Into Failure by Sidney Dekker

Sidney Dekker's Drift Into Failure* is a noteworthy effort to provide new insights into how accidents and other bad outcomes occur in large organizations. He begins by describing two competing world views, the essentially mechanical view of the world spawned by Newton and Descartes (among others), and a view based on complexity in socio-technical organizations and a systems approach. He shows how each world view biases the search for the “truth” behind how accidents and incidents occur.

Newtonian-Cartesian (N-C) Vision

Isaac Newton and René Descartes were leading thinkers during the dawn of the Age of Reason. Newton used the language of mathematics to describe the world while Descartes relied on the inner process of reason. Both believed there was a single reality that could be investigated, understood and explained through careful analysis and thought—complete knowledge was possible if investigators looked long and hard enough. The assumptions and rules that started with them, and were extended by others over time, have been passed on and most of us accept them, uncritically, as common sense, the most effective way to look at the world.

The N-C world is ruled by invariant cause-and-effect; it is, in fact, a machine. If something bad happens, then there was a unique cause or set of causes. Investigators search for these broken components, which could be physical or human. It is assumed that a clear line exists between the broken part(s) and the overall behavior of the system. The explicit assumption of determinism leads to an implicit assumption of time reversibility—because system performance can be predicted from time A given the starting conditions and the functional relationships of all components, we can equally start from a later time B (the bad outcome) and work back to the true causes. (p. 84) Root cause analysis and criminal investigations are steeped in this world view.

In this view, decision makers are expected to be rational people who “make decisions by systematically and consciously weighing all possible outcomes along all relevant criteria.” (p. 3) Bad outcomes are caused by incompetent or, worse, corrupt decision makers. Fixes include more communications, training, procedures, supervision, exhortations to try harder and criminal charges.

Dekker credits Newton et al. for giving man the wherewithal to probe Nature's secrets and build amazing machines. However, the Newtonian-Cartesian vision is not the only way to view the world, especially the world of complex, socio-technical systems. For that, a new model, with different concepts and operating principles, is required.

The Complex System

Characteristics

The sheer number of parts does not make a system complex, only complicated. A truly complex system is open (it interacts with its environment), has components that act locally and don't know the full effects of their actions, is constantly making decisions to maintain performance and adapt to changing circumstances, and has non-linear interactions (small events can cause large results) because of multipliers and feedback loops. Complexity is a result of the ever-changing relationships between components. (pp. 138-144)

Adding to the myriad information confronting a manager or observer, system performance is often optimized at the edge of chaos, where competitors are perpetually vying for relative advantage at an affordable cost.** The system is constantly balancing its efforts between exploration (which will definitely incur costs but may lead to new advantages) and exploitation (which reaps the benefits of current advantages, though those advantages will likely dissipate over time). (pp. 164-165)

The most important feature of a complex system is that it adapts to its environment over time in order to survive. And its environment is characterized by resource scarcity and competition. There is continuous pressure to maintain production and increase efficiency (and their visible artifacts: output, costs, profits, market share, etc.), so less visible outputs, e.g., safety, receive less attention. After all, “Though safety is a (stated) priority, operational systems do not exist to be safe. They exist to provide a service or product . . . .” (p. 99) And the cumulative effect of multiple adaptive decisions can be an erosion of safety margins and a changed response of the entire system. Such responses may be beneficial or harmful—a drift into failure.

Drift by a complex system exhibits several characteristics. First, as mentioned above, it is driven by environmental factors. Second, drift occurs in small steps, so changes are hardly noticed, and may even be applauded if they result in local performance improvement; “. . . successful outcomes keep giving the impression that risk is under control” (p. 106) as a series of small decisions whittle away at safety margins. Third, these complex systems contain unruly technology (think deepwater drilling) where uncertainties exist about how the technology may be ultimately deployed and how it may fail. Fourth, there is significant interaction with a key environmental player, the regulator, and regulatory capture can occur, resulting in toothless oversight.

“Drifting into failure is not so much about breakdowns or malfunctioning of components, as it is about an organization not adapting effectively to cope with the complexity of its own structure and environment.” (p. 121) Drift and occasionally accidents occur because of ordinary system functioning, normal people going about their regular activities making ordinary decisions “against a background of uncertain technology and imperfect information.” Accidents, like safety, can be viewed as an emergent system property, i.e., they are the result of system relationships but cannot be predicted by examining any particular system component.
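Dekker offers no algorithm for drift, but the incremental dynamic described above is easy to caricature in a few lines of code. The sketch below is our illustration, not anything from the book; the margin, per-step limit and failure threshold are arbitrary assumptions chosen only to show how decisions that each pass a local acceptance check can still sum to a drift past a failure line.

```python
# Illustrative sketch only (not from Dekker): each small concession trades a
# bit of safety margin for a local gain, and each one passes its local check,
# yet the cumulative effect crosses a failure line that no single decision
# appeared to approach. All numbers are arbitrary assumptions.
import random

random.seed(1)

margin = 1.0        # normalized safety margin; 1.0 = original design basis
step_limit = 0.05   # largest single concession that would trigger scrutiny
failure_line = 0.3  # margin below which the system is primed to fail

for step in range(1, 201):
    concession = random.uniform(0.0, 0.04)  # every trade-off stays under the local limit
    assert concession < step_limit          # so each decision looks acceptable in isolation
    margin -= concession
    if margin < failure_line:
        print(f"step {step}: margin {margin:.2f} -- drifted past the failure line")
        break
```

Each iteration is “normal work”; only the running total reveals the drift.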

Managers' roles

Managers should not try to transform complex organizations into merely complicated ones, even if that were possible. Complexity is necessary for long-term survival because it maximizes organizational adaptability. The question is how to manage in a complex system. One key is increasing the diversity of personnel in the organization. More diversity means less groupthink, more creativity and greater capacity for adaptation. In practice, this means validating minority opinions and encouraging dissent, reflecting on small decisions as they are made, stopping to ponder why some technical feature or process is not working exactly as expected, and creating slack to reduce the chances of small events snowballing into large failures. With proper guidance, organizations can drift their way to success.

Accountability

Amoral and criminal behavior certainly exist in large organizations but bad outcomes can also result from normal system functioning. That's why the search for culprits (bad actors or broken parts) may not always be appropriate or adequate. This is a point Dekker has explored before, in Just Culture (briefly reviewed here) where he suggests using accountability as a means to understand the system-based contributors to failure and resolve those contributors in a manner that will avoid recurrence.

Application to Nuclear Safety Culture

A commercial nuclear power plant or fleet is probably not a full-fledged complex system. It interacts with environmental factors, but in limited ways; it's certainly not directly exposed to the Wild West competition of, say, the cell phone industry. Groupthink and normalization of deviance*** are constant threats. The technology is reasonably well understood, but changes, e.g., uprates based on more software-intensive instrumentation and control, may be invisibly sanding away safety margin. Both the industry and the regulator would deny regulatory capture has occurred, but an outside observer may think the relationship is a little too cozy. Overall, the fit is sufficiently good that students of safety culture should pay close attention to Dekker's observations.

In contrast, the Hanford Waste Treatment Plant (Vit Plant) is almost certainly a complex system and this book should be required reading for all managers in that program.

Conclusion

Drift Into Failure is not a quick read. Dekker spends a lot of time developing his theory, then circling back to further explain it or emphasize individual pieces. He reviews incidents (airplane crashes, a medical error resulting in patient death, software problems, public water supply contamination) and descriptions of organization evolution (NASA, international drug smuggling, “conflict minerals” in Africa, drilling for oil, terrorist tactics, Enron) to illustrate how his approach results in broader and arguably more meaningful insights than the reports of official investigations. Standing on the shoulders of others, especially Diane Vaughan, Dekker gives us a rich model for what might be called the “banality of normalization of deviance.” 


* S. Dekker, Drift Into Failure: From Hunting Broken Components to Understanding Complex Systems (Burlington VT: Ashgate 2011).

** See our Sept. 4, 2012 post on Cynefin for another description of how the decisions an organization faces can suddenly slip from the Simple space to the Chaotic space.

*** We have posted many times about normalization of deviance, the corrosive organizational process by which yesterday's “unacceptable” becomes today's “good enough.”

Monday, February 13, 2012

Is Safety Culture An Inherently Stable System?

The short answer:  No.

“Stable” means that an organization’s safety culture effectiveness remains at about the same level* over time.  However, if a safety culture effectiveness meter existed and we attached it to an organization, we would see that, over time, the effectiveness level rises and falls, possibly even dropping to an unacceptable level.  Level changes occur because of shocks to the system and internal system dynamics.

Shocks

Sudden changes or challenges to safety culture stability can originate from external (exogenous) or internal (endogenous) sources.

Exogenous shocks include significant changes in regulatory requirements, such as occurred after TMI or the Browns Ferry fire, or “it’s not supposed to happen” events that do, in fact, occur, such as a large earthquake in Virginia or a devastating tsunami in Japan, events that give operators pause even before any regulatory response.

Organizations have to react to such external events and their reaction is aimed at increasing plant safety.  However, while the organization’s focus is on its response to the external event, it may take its eye off the ball with respect to its pre-existing and ongoing responsibilities.  It is conceivable that the reaction to significant external events may distract the organization and actually lower overall safety culture effectiveness.

Endogenous shocks include the near-misses that occur at an organization’s own plant.  While it is unfortunate that such events occur, it is probably good for safety culture, at least for a while.  Who hasn’t paid greater attention to their driving after almost crashing into another vehicle?

The insertion of new management, e.g., after a plant has experienced a series of performance or regulatory problems, is another type of internal shock.  This can also raise the level of safety culture—IF the new management exercises competent leadership and makes progress on solving the real problems. 

Internal Dynamics    

Absent any other influence, safety culture will not remain at a given level because of an irreducible tendency to decay.  Decay occurs because of rising complacency, over-confidence, goal conflicts, shifting priorities and management incentives.  Cultural corrosion, in the form of normalization of deviance, is always pressing against the door, waiting for the slightest crack to appear.  We have previously discussed these challenges here.

An organization may assert that its safety culture is a stability-seeking system, one that detects problems, corrects them and returns to the desired level.  However, performance with respect to the goal may not be knowable with accuracy because of measurement issues.  There is no safety culture effectiveness meter; surveys only provide snapshots of instant safety climate, and even a lengthy interview-based investigation may not lead to repeatable results, i.e., a different team of evaluators might (or might not) reach different conclusions.  That’s why creeping decay is difficult to perceive. 
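For readers who like the system dynamics framing used in the footnote below, here is a minimal stock-and-flow sketch of the argument.  It is our illustration only, not a validated model: the decay rate, shock schedule and “acceptable” threshold are arbitrary assumptions, chosen simply to show that an unmaintained level drifts downward between shocks.

```python
# Minimal sketch: safety culture effectiveness as a "level" (stock) that
# decays unless maintained and receives temporary boosts from shocks.
# Illustration only; all parameter values are arbitrary assumptions.
decay_rate = 0.02              # fractional loss per month (complacency, goal conflicts, ...)
shocks = {12: 0.15, 30: 0.20}  # month -> boost from a near-miss, new management, etc.
acceptable = 0.6               # hypothetical minimum acceptable effectiveness

effectiveness = 0.9
for month in range(1, 49):
    effectiveness *= (1.0 - decay_rate)                               # unmanaged decay
    effectiveness = min(1.0, effectiveness + shocks.get(month, 0.0))  # shock boost, if any
    if effectiveness < acceptable:
        print(f"month {month}: effectiveness {effectiveness:.2f} -- below the acceptable level")
        break
else:
    print(f"month 48: effectiveness {effectiveness:.2f} -- still above the line")
```

With these made-up numbers the shocks only delay the slide; absent deliberate maintenance, the level eventually falls below the acceptable line, which is the point of this post.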

Conclusion

Many different forces can affect an organization’s safety culture effectiveness, some pushing it higher while others drag it lower.  Measurement problems make it difficult to know what the current level is, or what the trend, if any, may be.  The takeaway is that there is no reason to assume safety culture is a stable system whose effectiveness can be maintained at or above an acceptable level.


*  “Level” is a term borrowed from system dynamics, and refers to the quantity of a variable in a model.  We recognize that safety culture is an organizational property, not something stored in a tank, but we are using “level” to communicate the notion that safety culture effectiveness is something that can improve (go up) or degrade (go down).

Thursday, January 12, 2012

Problems at Palisades—A Case of Normalization of Deviance?

The Palisades nuclear plant is in trouble with the NRC.  On Jan. 11, 2012 the NRC met with Entergy (the plant’s owner and operator) to discuss two preliminary inspection findings, one white and one yellow.  Following is the NRC summary of the more significant event.

 “The preliminary yellow finding of substantial significance to safety is related to an electrical fault caused by personnel at the site. The electrical fault resulted in a reactor trip and the loss of half of the control room indicators, and activation of safety systems not warranted by actual plant conditions. This made the reactor trip more challenging for the operators and increased the risk of a serious event occurring. The NRC conducted a Special Inspection and preliminarily determined the actions and work preparation for the electrical panel work were not done correctly.”*

At the meeting with NRC, an Entergy official said “Over time, a safety culture developed at the plant where workers thought if they had successfully accomplished a task in the past, they could do it again without strictly following procedure [emphasis added]. . . .

Management also accepted that, and would reward workers for getting the job done. This led to the events that caused the September shutdown when workers did not follow the work plan while performing maintenance.”**

In an earlier post, we defined normalization of deviance as “the gradual acceptance of performance results that are outside normal acceptance criteria.”  In the Palisades case, we don’t know anything beyond the published reports, but it sure looks to us like an erosion of performance standards, an erosion that was effectively encouraged by management.

Additional Background on Palisades

This is not Palisades’ first trip to the woodshed.  Based on a prior event, the NRC had already demoted Palisades from the Reactor Oversight Process (ROP) Licensee Response Column to the Regulatory Response Column, meaning additional NRC inspections and scrutiny.  And they may be headed for the Degraded Cornerstone Column.***  But it’s not all bad news.  At the end of the third quarter 2011, Palisades had a green board on the ROP.****  Regular readers know our opinion with respect to the usefulness of the ROP performance matrices.


*  NRC news release, “NRC to Hold Two Regulatory Conferences on January 11 to Discuss Preliminary White and Preliminary Yellow Findings at Palisades Nuclear Plant,” nrc.gov (Jan. 5, 2012).

**  F. Klug, “Decline in safety culture at Palisades nuclear power plant to be fixed, company tells regulators,” Kalamazoo Gazette on mlive.com (Jan. 11, 2012).

***  B. Devereaux, “Palisades nuclear plant bumped down in status by NRC; Entergy Nuclear to dispute other findings next week,” mlive.com (Jan. 4, 2012).

****  Palisades 3Q/2011 Performance Summary, nrc.gov (retrieved Jan. 12, 2012).

Thursday, January 5, 2012

2011 End of Year Summary

We thought we would take this opportunity to do a little rummaging around in the Google analytics and report on some of the statistics for the safetymatters blog.

The first thing that caught our attention was the big increase in page views (see chart below) for the blog this past year.  We are now averaging more than 1000 per month and we appreciate every one of the readers who visits the blog.  We hope that the increased readership reflects that the content is interesting, thought provoking and perhaps even a bit provocative.  We are pretty sure people who are interested in nuclear safety culture cannot find comparable content elsewhere.

The following table lists the top ten blog posts.  The overwhelming favorite has been the "Normalization of Deviation" post from March 10, 2010.  We have consistently commented positively on this concept introduced by Diane Vaughan in her book The Challenger Launch Decision.  Most recently, Red Conner noted in his December 8, 2011 post the potential role of normalization of deviation in contributing to complacency.  This may appear to be a bit of a departure from the general concept of complacency as primarily a passive occurrence.  Red notes that the gradual and sometimes hardly perceptible acceptance of lesser standards or non-conforming results may be more insidious than a failure to challenge the status quo.  We would appreciate hearing from readers on their views of “normalization”, whether they believe it is occurring in their organizations (and, if so, how is it detected?) and what steps might be taken to minimize its effect.



A common denominator among a number of the popular posts is safety culture assessment, whether in the form of surveys, performance indicators, or other means to gauge the current state of an organization.  Our sense is there is a widespread appetite for approaches to measuring safety culture in some meaningful way; such interest perhaps also indicates that current methods, heavily dependent on surveys, are not meeting needs.  What is even more clear in our research is the lack of initiative by the industry and regulators to promote or fund research into this critical area.   

A final observation:  The Google stats on frequency of page views indicate two of the top three pages were the “Score Decision” pages for the two decision examples we put forward.  They each had 100 or more views.  Unfortunately, only a small percentage of the page views translated into scoring inputs for the decisions.  We’re not sure why inputs lagged, since scoring is anonymous and purely a matter of the reader’s judgment.  Having a larger data set from which to evaluate the decision scoring process would be very useful, and we would encourage anyone who visited but did not score to reconsider.  And of course, anyone who hasn’t yet visited these examples, please do and see how you rate these actual decisions from operating nuclear plants.

Thursday, December 8, 2011

Nuclear Industry Complacency: Root Causes

NRC Chairman Jaczko, addressing the recent INPO CEO conference, warned about possible increasing complacency in the nuclear industry.*  To support his point, he noted the two plants in column four of the ROP Action Matrix and two plants in column three, the increased number of special inspections in the past year, and the three units in extended shutdowns.  The Chairman then moved on to discuss other industry issues. 

The speech spurred us to ask: Why does the risk of complacency increase over time?  Given our interest in analyzing organizational processes, it should come as no surprise that we believe complacency is more complicated than the lack of safety-related incidents leading to reduced attention to safety.

An increase in complacency means that an organization’s safety culture has somehow changed.  Causes of such change include shifts in the organization’s underlying assumptions and decay.

Underlying Assumptions

We know from the Schein model that underlying assumptions are the bedrock for culture.  One can take those underlying assumptions and construct an (incomplete) mental model of the organization—what it values, how it operates and how it makes decisions.  Over time, as the organization builds an apparently successful safety record, the mental weights that people assign to decision factors can undergo a subtle but persistent shift to favor the visible production and cost goals over the inherently invisible safety factor.  At the same time, opportunities exist for corrosive issues, e.g., normalization of deviance, to attach themselves to the underlying assumptions.  Normalization of deviance can manifest anywhere, from slipping maintenance standards to a greater tolerance for increasing work backlogs.
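A back-of-the-envelope illustration of that weight shift may help; it is purely hypothetical, with decision factors and numbers of our own choosing, not a real decision-making tool.

```python
# Hypothetical decision-weight drift. Production and cost are visible; safety
# is not, and a quiet operating record nudges the weights a little each review
# cycle. No single step looks like a decision to deprioritize safety, but the
# cumulative shift is large. All values are assumptions for illustration.
weights = {"production": 0.30, "cost": 0.30, "safety": 0.40}
drift_per_cycle = 0.01  # assumed small, barely noticeable shift per cycle

for cycle in range(20):
    weights["safety"] -= drift_per_cycle
    weights["production"] += drift_per_cycle / 2
    weights["cost"] += drift_per_cycle / 2

print({k: round(v, 2) for k, v in weights.items()})
# -> {'production': 0.4, 'cost': 0.4, 'safety': 0.2} after 20 quiet cycles
```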

Decay

An organization’s safety culture will inevitably decay over time absent effective maintenance.  In part this is caused by the shift in underlying assumptions.  In addition, decay results from saturation effects.  Saturation occurs because beating people over the head with either the same thing, e.g., espoused values, or too many different things, e.g., one safety program or similar intervention after another, has lower and lower marginal effectiveness over time.  That’s one reason new leaders are brought in to “problem” plants: to boost the safety culture by using a new messenger with a different version of the message, reset the decision making factor weights and clear the backlogs.

None of this is new to regular readers of this blog.  But we wanted to gather our ideas about complacency in one post.  Complacency is not some free-floating “thing”; it is an organizational trait that emerges because of multiple dynamics operating below the level of clear visibility or measurement.  

     
*  G.B. Jaczko, Prepared Remarks at the Institute of Nuclear Power Operations CEO Conference, Atlanta, GA (Nov. 10, 2011), p. 2, ADAMS Accession Number ML11318A134.

Tuesday, November 9, 2010

Human Beings . . . Conscious Decisions

A New York Times article* dated November 8, 2010 carried a headline to the effect that Fred Bartlit, the independent investigator for the presidential panel on the BP oil rig disaster earlier this year, had not found that “cost trumped safety” in decisions leading up to the accident.  The article noted that this finding contradicted determinations by other investigators, including those sponsored by Congress.  We had previously posted on this subject, including taking notice of the earlier findings of cost trade-offs, and wanted to weigh in based on this new information.

First we should acknowledge that we have no independent knowledge of the facts associated with the blowout and are simply reacting to the published findings of current investigations.  In our prior posts we had posited that cost pressures could be part of the equation in the lead-up to the spill.  On June 8, 2010 we observed:

“...it is clear that the environment leading up to the blowout included fairly significant schedule and cost pressures. What is not clear at this time is to what extent those business pressures contributed to the outcome. There are numerous cited instances where best practices were not followed and concerns or recommendations for prudent actions were brushed aside. One wishes the reporters had pursued this issue in more depth to find out ‘Why?’ ”

And we recall one of the initial observations made by an OSHA official shortly after the accident as detailed in our April 26, 2010 post:

“In the words of an OSHA official BP still has a ‘serious, systemic safety problem’ across the company.”

So it appears we have been cautious in reaching any conclusions about BP’s safety management.  That said, we do want to put into context the finding by Mr. Bartlit.  First we would note that he is, by profession, a trial lawyer and may be both approaching the issue and articulating his finding with a decidedly legal focus.  The specific quotes attributed to him are as follows:

“. . . we have not found a situation where we can say a man had a choice between safety and dollars and put his money on dollars” and “To date we have not seen a single instance where a human being made a conscious decision to favor dollars over safety,...”

It is not surprising that a lawyer would focus on culpability in terms of individual actions.  When things go wrong, most industries, nuclear included, look to assign blame to individuals and move on.  It is also worth noting that the investigator emphasized that no one had made a “conscious” decision to favor cost over safety.  We think it is important to keep in mind that safety management and failures of safety decision making may or may not involve conscious decisions.  As we have stated many times in other posts, safety can be undermined through very subtle mechanisms such that even those involved may not appreciate the effects, e.g., the normalization of deviance.  Finally we think the OSHA investigator may have been closer to the truth with his observation about “systemic” safety problems.  It may be that Mr. Bartlit, and other investigators, will be found to have suffered from what is termed “attribution error” where simple explanations and causes are favored and the more complex system-based dynamics are not fully assessed or understood in the effort to answer “Why?”  

* J.M. Broder, "Investigator Finds No Evidence That BP Took Shortcuts to Save Money," New York Times (Nov. 8, 2010).

Thursday, April 22, 2010

Classroom in the Sky

The reopening of much of European airspace in the last several days has provided a rare opportunity to observe in real time some dynamics of safety decision making. On the one hand are the airlines, who have been contending it is safe to resume flights; on the other, the regulatory authorities, who had been taking a more conservative stance. The question posed in our prior post was to what extent business pressures were influencing the airlines' position and to what extent that pressure, perhaps transmuting into political pressure, might influence regulators' decisions. Perhaps most importantly, how could one tell?

We have extracted some interesting quotes from recent media reporting of the issue.





Civil aviation officials said their decision to reopen terminals where thousands of weary travelers had camped out was based on science, not on the undeniable pressure put on them by the airlines. . . . “The only priority that we consider is safety. We were trying to assess the safe operating levels for aircraft engines with ash,” said Eamonn Brennan, chief executive of Irish Aviation Authority. “Pressure to restart flights had been intense.” Despite their protests, the timing of some reopenings seemed dictated by airlines’ commercial pressures.

“It's important to realize that we've never experienced in Europe something like this before. . . . We needed the four days of test flights, the empirical data, to put this together and to understand the levels of ash that engines can absorb.” Even as airports reopened, a debate swirled about the safety of flying without more extensive analysis of the risks, as it appeared that governments were operating without consistent international guidelines based on solid data. “What's missing is some sort of standard, based on science, that gives an indication of a safe level of volcanic ash . . . .” “Some safety experts said pressure from hard-hit airlines and stranded passengers had prompted regulators to venture into uncharted territory with respect to the ash. In the past, the key was simply to avoid ash plumes.”

How can it be both ways: regulators did not respond to pressure, or regulators acted only on the basis of new analysis and data? The answer may lie in our old friend, the theory of normalization of deviation (see The Challenger Launch Decision). As we have discussed in prior posts, normalization is a process whereby an organization’s safety standards may be reinterpreted (to a lower or more accommodating level) over time due to complex interactions of cultural, organizational, regulatory and environmental factors. The fascinating aspects are that this process is not readily apparent to those involved and that decisions then made in accordance with the reinterpreted standards are not viewed as deviant or inconsistent with, e.g., “safety is our highest priority”. Thus events which heretofore were viewed as discrepant no longer are, and are thus “normalized”. Aviation authorities believe their decisions are entirely safety-based. Yet to observers outside the organization it very much appears that the bar has been lowered, since what is considered safe today was not yesterday.

Wednesday, March 10, 2010

"Normalization of a Deviation"

These are the words of John Carlin, Vice President at the Ginna Nuclear Plant, referring to a situation in the past where chronic water leakages from the reactor refueling pit were tolerated by the plant’s former owners. 

The quote is from a piece reported by Energy & Environment Publishing’s Peter Behr in its ClimateWire online publication titled, “Aging Reactors Put Nuclear Power Plant ‘Safety Culture’ in the Spotlight” and also published in The New York Times.  The focus is on a series of incidents with safety culture implications that have occurred at the Nine Mile Point and Ginna plants now owned and operated by Constellation Energy.

The recitation of events and the responses of managers and regulators are very familiar.  The drip, drip, drip is not the sound of water leaking but the uninspired give and take of the safety culture dialogue that occurs each time there is an incident or series of incidents that suggest safety culture is not working as it should.

Managers admit they need to adopt a questioning attitude and improve the rigor of decision making; ensure they have the right “mindset”; and corporate promises “a campaign to make sure its employees across the company buy into the need for an exacting attention to safety.”  Regulators remind the licensee, "The nuclear industry remains ... just one incident away from retrenchment..." but must be wondering why these events are occurring when NRC performance indicators for the plants and INPO rankings do not indicate problems.  Pledges to improve safety culture are put forth earnestly and (I believe) in good faith.

The drip, drip, drip of safety culture failures may not be cause for outright alarm or questioning of the fundamental safety of nuclear operations, but it does highlight what seems to be a condition of safety culture stasis - a standoff of sorts where significant progress has been made but problems continue to arise, and the same palliatives are applied.  Perhaps more significantly, where continued evolution of thinking regarding safety culture has plateaued.  Peaking too early is a problem in politics and sports, and so it appears in nuclear safety culture.

This is why the remark by John Carlin was so refreshing.  For those not familiar with the context of his words, “normalization of deviation” is a concept developed by Diane Vaughan in her exceptional study of the space shuttle Challenger accident.  Readers of this blog will recall that we are fans of her book, The Challenger Launch Decision, where a mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  Scarier still, an organization's standards can decay and no one even notices.  How this occurs and what can be done about it are concepts that should be central to current considerations of safety culture. 

For further thoughts from our blog on this subject, refer to our posts dated October 6, 2009 and November 12, 2009.  In the latter, we discuss the nature of complacency and its insidious impact on the very process that is designed to avoid it in the first place.

Tuesday, October 6, 2009

Social Licking?

The linked file contains a book review with some interesting social science that could be of great relevance to building and sustaining safety cultures.  But I couldn’t resist the best quote of the review, commenting about some of the unusual findings in recent studies of social networks.  To wit,

“In fact, the model that best predicted the network structure of U.S. senators was that of social licking among cows.”

Back on topic, the book is Connected by Nicholas Christakis and James Fowler, addressing the surprising power of social networks and how they shape our lives.  The authors may be best known for a study published several years ago about how obesity could be contagious.  It is based on observations of networked relationships – friends and friends of friends – that can lead to individuals modeling their behaviors based on those to whom they are connected.

“What is the mechanism whereby your friend’s friend’s obesity is likely to make you fatter? Partly, it’s a kind of peer pressure, or norming, effect, in which certain behaviors, or the social acceptance of certain behaviors, get transmitted across a network of acquaintances.”  Sounds an awful lot like how we think of safety culture being spread across an organization.  For those of you who have been reading this blog, you may recall that we are fans of Diane Vaughan’s book The Challenger Launch Decision, where a mechanism she identifies as “normalization of deviance” is used to explain the gradual acceptance of performance results that are outside normal acceptance criteria.  An organization's standards decay and no one even notices.


The book review goes on to note, “Mathematical models of flocks of birds, or colonies of ants, or schools of fish reveal that while there is no central controlling director telling the birds to fly one direction or another, a collective intelligence somehow emerges, so that all the birds fly in the same direction at the same time.  Christakis and Fowler argue that through network science we are discovering the same principle at work in humans — as individuals, we are part of a superorganism, a hivelike network that shapes our decisions.”  I guess the key is to ensure that the hive takes the workers in the right direction.

Question:  Does the above observation that “there is no central controlling director” telling the right direction have implications for nuclear safety management?  Is leadership the key or development of a collective intelligence?

 
Link to review.
 

Monday, August 3, 2009

Reading List: Just Culture by Sidney Dekker

Thought I would share with you a relatively recent addition to the safety management system bookshelf, Just Culture by Sidney Dekker, Professor of Human Factors and System Safety at Lund University in Sweden.  In Dekker’s view a “just culture” is critical for the creation of safety culture.  A just culture will not simply assign blame in response to a failure or problem, it will seek to use accountability as a means to understand the system-based contributors to failure and resolve those in a manner that will avoid recurrence.  One of the reasons we believe so strongly in safety simulation is the emphasis on system-based understanding, including a shared organizational mental model of how safety management happens.  One reviewer (D. Sillars) of this book on the amazon.com website summarizes, “’Just culture’ is an abstract phrase, which in practice, means . . . getting to an account of failure that can both satisfy demands for accountability while contributing to learning and improvement.” 


Question for nuclear professionals:  Does your organization maintain a library of resources such as Just Culture or Diane Vaughan’s book, The Challenger Launch Decision, that provide deep insights into organizational performance and culture?  Are materials like this routinely the subject of discussions in training sessions and topical meetings?

Thursday, July 30, 2009

“Reliability is a Dynamic Non-Event” (MIT #5)

What is this all about?  Reliability is a dynamic non-event [MIT paper p. 5].  It is about complacency.  Paradoxically, when incident rates are low for an extended period of time and management does not maintain a high priority on safety, the organization may slip into complacency as individuals shift their attention to other priorities such as production pressures.  The MIT authors note the parallel to the NASA space program, where incidents were rare notwithstanding a weak safety culture, resulting in the organization rationalizing its performance as “normal”.  (See Diane Vaughan’s book The Challenger Launch Decision for a compelling account of NASA’s organizational dynamics.)  In our paper “Practicing Nuclear Safety Management” we make a similar comparison.

What does this imply about the nuclear industry?  Certainly we are in a period where the reliability of the plants is at a very high level and the NRC ROP indicator board is very green.  Is this positive for maintaining high safety culture levels or does it represent a potential threat?  It could be the latter since the biggest problem in addressing the safety implications of complacency in an organization is, well, complacency.