Friday, May 3, 2013

High Reliability Organizations and Safety Culture

On February 10th, we posted about a report covering lessons for safety culture (SC) that can be gleaned from the social science literature. The report's authors judged that high reliability organization (HRO) literature provided a solid basis for linking individual and organizational assumptions with traits and practices that can affect safety performance. This post explores HRO characteristics and how they can influence SC.

Our source is Managing the Unexpected: Resilient Performance in an Age of Uncertainty* by Karl Weick and Kathleen Sutcliffe. Weick is a leading contemporary HRO scholar. This book is clearly written, with many pithy comments, so lots of quotations are included below to present the authors' views in their own words.

What makes an HRO different?

Many organizations work with risky technologies whose problems or errors can have catastrophic consequences, use complex management systems, and operate in demanding environments. But successful HROs approach their work with a different attitude and set of practices, an “ongoing mindfulness embedded in practices that enact alertness, broaden attention, reduce distractions, and forestall misleading simplifications.” (p. 3)

Mindfulness

An underlying assumption of HROs is “that gradual . . . development of unexpected events sends weak signals . . . along the way” (p. 63) so constant attention is required. Mindfulness means that “when people act, they are aware of context, of ways in which details differ . . . and of deviations from their expectations.” (p. 32) HROs “maintain continuing alertness to the unexpected in the face of pressure to take cognitive shortcuts.” (p. 19) Mindful organizations “notice the unexpected in the making, halt it or contain it, and restore system functioning.” (p. 21)

It takes a lot of energy to maintain mindfulness. As the authors warn us, “mindful processes unravel pretty fast.” (p. 106) Complacency and hubris are two omnipresent dangers. “Success narrows perceptions, . . . breeds overconfidence . . . and reduces acceptance of opposing points of view. . . . [If] people assume that success demonstrates competence, they are more likely to drift into complacency, . . .” (p. 52) Pressure in the task environment is another potential problem. “As pressure increases, people are more likely to search for confirming information and to ignore information that is inconsistent with their expectations.” (p. 26) The opposite of mindfulness is mindlessness. “Instances of mindlessness occur when people confront weak stimuli, powerful expectations, and strong desires to see what they expect to see.” (p. 88)

Mindfulness can lead to insight and knowledge. “In that brief interval between surprise and successful normalizing lies one of your few opportunities to discover what you don't know.” (p. 31)**

Five principles

HROs follow five principles. The first three cover anticipation of problems and the remaining two cover containment of problems that do arise.

Preoccupation with failure

HROs “treat any lapse as a symptom that something may be wrong with the system, something that could have severe consequences if several separate small errors happened to coincide. . . . they are wary of the potential liabilities of success, including complacency, the temptation to reduce margins of safety, and the drift into automatic processing.” (p. 9)

Managers usually think surprises are bad, evidence of poor planning. However, “Feelings of surprise are diagnostic because they are a solid cue that one's model of the world is flawed.” (p. 104) HROs “Interpret a near miss as danger in the guise of safety rather than safety in the guise of danger. . . . No news is bad news. All news is good news, because it means that the system is responding.” (p. 152)

People in HROs “have a good sense of what needs to go right and a clearer understanding of the factors that might signal that things are unraveling.” (p. 86)

Reluctance to simplify

HROs “welcome diverse experience, skepticism toward received wisdom, and negotiating tactics that reconcile differences of opinion without destroying the nuances that diverse people detect. . . . [They worry that] superficial similarities between the present and the past mask deeper differences that could prove fatal.” (p. 10) “Skepticism thus counteracts complacency . . . .” (p. 155) “Unfortunately, diverse views tend to be disproportionately distributed toward the bottom of the organization, . . .” (p. 95)

The language people use at work can be a catalyst for simplification. A person may initially perceive something novel in the environment, but describing that experience in familiar or standard terms raises the risk of losing the early warning the person perceived.

Sensitivity to operations

HROs “are attentive to the front line, . . . Anomalies are noticed while they are still tractable and can still be isolated . . . . People who refuse to speak up out of fear undermine the system, which knows less than it needs to know to work effectively.” (pp. 12-13) “Being sensitive to operations is a unique way to correct failures of foresight.” (p. 97)

In our experience, nuclear plants are generally good in this regard; most include a focus on operations among their critical success factors.

Commitment to resilience

“HROs develop capabilities to detect, contain, and bounce back from those inevitable errors that are part of an indeterminate world.” (p. 14) “. . . environments that HROs face are typically more complex than the HRO systems themselves. Reliability and resilience lie in practices that reduce . . . environmental complexity or increase system complexity.” (p. 113) Because it's difficult or impossible to reduce environmental complexity, the organization needs to make its systems more complex.*** This requires clear thinking and insightful analysis. Unfortunately, actual organizational response to disturbances can fall short. “. . . systems often respond to a disturbance with new rules and new prohibitions designed to prevent the same disruption from happening in the future. This response reduces flexibility to deal with subsequent unpredictable changes.” (p. 72)

Deference to expertise

“Decisions are made on the front line, and authority migrates to the people with the most expertise, regardless of their rank.” (p. 15) Application of expertise “emerges from a collective, cultural belief that the necessary capabilities lie somewhere in the system and that migrating problems [down or up] will find them.” (p. 80) “When tasks are highly interdependent and time is compressed, decisions migrate down . . . Decisions migrate up when events are unique, have potential for very serious consequences, or have political or career ramifications . . .” (p. 100)

This is another ideal that can fail in practice. We've all seen decisions made by the highest-ranking person rather than the most qualified one. In other words, “who is right” can trump “what is right.”

Relationship to safety culture

Much of the chapter on culture is based on the ideas of Schein and Reason, so we'll focus on key points emphasized by Weick and Sutcliffe. In their view, “culture is something an organization has [practices and controls] that eventually becomes something an organization is [beliefs, attitudes, values].” (p. 114, emphasis added)

“Culture consists of characteristic ways of knowing and sensemaking. . . . Culture is about practices—practices of expecting, managing disconfirmations, sensemaking, learning, and recovering.” (pp. 119-120) A single organization can have different types of culture: an integrative culture that everyone shares, differentiated cultures that are particular to sub-groups, and fragmented cultures that describe individuals who don't fit into the first two types. Multiple cultures support the development of more varied responses to nascent problems.

A complete culture strives to be mindful, safe, and informed, with an emphasis on wariness. As HRO principles are ingrained in an organization, they become part of the culture. The goal is a strong SC that reinforces concern about the unexpected, is open to questions and reporting of failures, views close calls as failures, is fearful of complacency, resists simplifications, values diversity of opinions and focuses on imperfections in operations.

What else is in the book?

One chapter contains a series of audits (presented as survey questions) to assess an organization's mindfulness and appreciation of the five principles. The audits can show an organization's attitudes and capabilities relative to HROs and relative to its own self-image and goals.

The final chapter describes possible “small wins” a change agent (often an individual) can attempt to achieve in an effort to move his organization more in line with HRO practices, viz., mindfulness and the five principles. For example, “take your team to the actual site where an unexpected event was handled either well or poorly, walk everyone through the decision making that was involved, and reflect on how to handle that event more mindfully.” (p. 144)

The book's case studies include an aircraft carrier, a nuclear power plant,**** a pediatric surgery center and wildland firefighting.

Our perspective

Weick and Sutcliffe draw on the work of many other scholars, including Constance Perin, Charles Perrow, James Reason and Diane Vaughan, all of whom we have discussed in this blog. The book makes many good points. For example, the prescription for mindfulness and the five principles can contribute to an effective context for decision making, although it does not comprise a complete management system. The authors recognize that reliability does not mean a complete lack of performance variation; instead, reliability follows from practices that recognize and contain emerging problems. Finally, there is evidence of a systems view, which we espouse, when the authors say “It is this network of relationships taken together—not necessarily any one individual or organization in the group—that can also maintain the big picture of operations . . .” (p. 142)

The authors would have us focus on nascent problems in operations, which is obviously necessary. But another important question is: what are the faint signals that the SC itself is developing problems? What are the precursors to the obvious signs, like increasing backlogs of safety-related work? Could that “human error” that recently occurred be a sign of a SC that has grown more forgiving of organizational mindlessness?

Bottom line: Safetymatters says check out Managing the Unexpected and consider adding it to your library.


* K.E. Weick and K.M. Sutcliffe, Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2d ed. (San Francisco, CA: Jossey-Bass, 2007). Also, Wikipedia has a very readable summary of HRO history and characteristics.

** More on normalization and rationalization: “On the actual day of battle naked truths may be picked up for the asking. But by the following morning they have already begun to get into their uniforms.” E.A. Cohen and J. Gooch, Military Misfortunes: The Anatomy of Failure in War (New York: Vintage Books, 1990), p. 44, quoted in Managing the Unexpected, p. 31.

*** The prescription to increase system complexity to match the environment is based on the system design principle of requisite variety, which means “if you want to cope successfully with a wide variety of inputs, you need a wide variety of responses.” (p. 113)

**** I don't think the authors performed any original research on nuclear plants. But the studies they reviewed led them to conclude that “The primary threat to operations in nuclear plants is the engineering culture, which places a higher value on knowledge that is quantitative, measurable, hard, objective, and formal . . . HROs refuse to draw a hard line between knowledge that is quantitative and knowledge that is qualitative.” (p. 60)

Sunday, February 10, 2013

Safety Culture - Lessons from the Social Science Literature

In 2011 the NRC contracted with the Pacific Northwest National Laboratory to conduct a review of social science literature related to safety culture (SC) and methods for evaluating interventions proposed to address issues identified during SC assessments.  The resultant report* describes how traits such as leadership, trust, respect, accountability, and continuous learning are discussed in the literature. 

The report is heavily academic but not impenetrable, and it is a good reference work on organizational culture theory and research.  I stumbled on this report in ADAMS and don't know why it hasn't had wider distribution.  Perhaps it's seen as too complicated or, more to the point, it doesn't exactly square with the NRC/NEI/industry Weltanschauung when the authors say things like:

“There is no simple recipe for developing safety culture interventions or for assessing the likelihood that these interventions will have the desired effects.” (p. 2)

“The literature consistently emphasizes that effecting directed behavioral, cognitive, or cultural change in adults and within established organizations is challenging and difficult, requires persistence and energy, and is frequently unsuccessful.” (p. 7)

This report contains an extensive review of the literature that is impossible to summarize in a blog post.  We'll provide an overview of the content, focusing on interesting quotes and highlights, then revisit Schein's model and close with our two cents' worth.

Concept of safety culture

This section begins with the definition of SC and the nine associated traits in the NRC SC policy statement, and compares them with other organizations' (IAEA, NEI, DOE, et al.) efforts.

The Schein model is proposed as a way to understand “why things are as they are” as a starting point upon which to build change strategies aimed at improving organizational performance.  An alternative approach is to define the characteristics of an ideal SC, evaluate how much the target organization differs from the ideal, and use closing the gap as the objective for corrective strategies.  The NEI approach to SC assessment reflects the second conceptual model.  A third approach, said to bridge the difference between the first two, is proposed by holistic thinkers such as Reason, who focus on overall organizational culture.

This is not the usual “distinction without a difference” argument that academics often wage.  Schein's objective is to improve organizational performance; the idealists' objective is to make an organization correspond to the ideal model, with the assumption that desired performance will follow.

The authors eventually settle on the high reliability organization (HRO) literature as providing the best basis for linking individual and organizational assumptions with traits and mechanisms for affecting safety performance.  Why?  The authors say the HRO approach identifies some of the specific mechanisms that link elements of a culture to safety outcomes and identifies important relationships among the cultural elements. (p. 15)  A contrary explanation is that the authors wanted to finesse their observation that Schein (beloved by NRC) and NEI have different views of the basis that should be used for designing SC improvement initiatives.

Building blocks of culture

The authors review the “building blocks” of culture, highlighting areas that correspond to the NRC safety culture traits.  If an organization wants to change its culture, it needs to decide which building blocks to address and how to make and sustain changes.

Organizational characteristics that correspond to NRC SC traits include leadership, communication, work processes, and problem identification and resolution.  Leadership and communication are recognized as important in the literature and are discussed at length.  However, the literature review offered thin gruel in the areas of work processes, and problem identification and resolution; in other words, the connections between these traits and SC are not well-defined. (pp. 20-25)

There is an extensive discussion of other building blocks including perceptions, values, attitudes, norms**, beliefs, motivations, trust, accountability and respect.  Implications for SC assessment and interventions are described, where available.  Adaptive processes such as sense making and double-loop learning are also mentioned.

Change and change management

The authors review theories of individual and organizational change and change management.  They note that planned interventions need to consider other changes that may be occurring because of dynamic processes between the organization and its environment and within the organization itself.

Many different models for understanding and effecting organizational change are described.  As the authors summarize: “. . . change is variously seen as either pushed by problems or pulled by visions or goals; as purposive and volitional or inadvertent and emergent; as a one-time event or a continuous process. It is never seen as easy or simple.” (p. 43)

The authors favor Montaño and Kasprzyk's Integrated Behavioral Model, shown in the figure below, as a template for designing and evaluating SC interventions.  It may be hard to read here, but suffice it to say that a lot of factors go into an individual's decision to perform a new behavior, and most or all of these factors should be considered by architects of SC interventions.  Leadership can provide input to many of these factors (through communication and modeling desired behaviors, including decision making) and thus facilitate (or impede) desired behavioral changes.

[Figure: Integrated Behavioral Model, from Montaño and Kasprzyk]

Resistance to change can be widespread.  Effective leadership is critical to overcoming resistance and implementing successful cultural changes.  “. . . leaders in formal organizations have the power and responsibility to set strategy and direction, align people and resources, motivate and inspire people, and ensure that problems are identified and solved in a timely manner.” (p. 54)

Lessons from initiatives to create other specific organizational cultures

The authors review the literature on learning organizations, total quality management and quality organizations, and sustainable organizations for lessons applicable to SC initiatives.  They observe that this literature “is quite consistent in emphasizing the importance of recognizing that organizations are multi-level, dynamic systems whose elements are related in complex and multi-faceted ways, and that culture mirrors this dynamic complexity, despite its role in socializing individuals, maintaining stability, and resisting change.” (p. 61)

“The studies conducted on learning, quality, and sustainable organizations and their corresponding cultures contain some badly needed information about the relationship among various traits, organizational characteristics, and behaviors that could help inform the assessment of safety cultures and the design and evaluation of interventions.” (p. 65)  Topics mentioned include management leadership and commitment, trust, respect, shared vision and goals, and a supportive learning environment.

Designing and evaluating targeted interventions

This section emphasizes the potential value of the evaluation science*** approach (used primarily in health care) for the nuclear industry.  The authors go through the specific steps for implementing the evaluation science model, drilling down in spots to describe additional tools that can be used, such as logic modeling (to organize and visualize issues, interventions and expected outcomes).  There is a lot of detail here, including suggestions for how the NRC might use backward mapping and a review of licensee logic models to evaluate SC assessment and intervention efforts.  Before anyone runs off to implement this approach, there is a major caveat:

“The literature on the design, implementation, and evaluation of interventions to address identified shortcomings in an organization’s safety culture is sparse; there is more focus on creating a safety culture than on intervening to correct identified problems.” (p. 67)

Relation to Schein

Schein's model of culture (shown on p. 8) and prescriptions for interventions are the construct most widely known to the nuclear industry and its SC practitioners.  His work is mentioned throughout the PNNL report.  Schein assumes that cultural change is a top-down effort (so leadership plays a key role) focused on individuals.  Change is implemented using an unfreeze—replace/move—refreeze strategy.  Schein's model is recommended in the program theory-driven evaluation science approach.  The authors believe Schein's “description of organizational culture and change does one of the best jobs of conveying the 'cultural' dimensions in a way that conveys its embeddedness and complexity.” (p. 108)  The authors note that Schein's cultural levels interact in complex ways, requiring a systems approach that relates the levels to each other, SC to the larger organizational culture, and culture to overall organizational functioning.

So if you're acquainted with Schein you've got solid underpinnings for reading this report even if you've never heard of any of the over 300 principal authors (plus public agencies and private entities) mentioned therein.  If you want an introduction to Schein, we have posted on his work here and here.

Conclusion

This is a comprehensive and generally readable reference work.  SC practitioners should read the executive summary and skim the rest to get a feel for the incredible number of theorists, researchers and institutions who are interested in organizational culture in general and/or SC in particular.  The report will tell you what a culture consists of and how you might go about changing it.

We have a few quibbles.  For example, there are many references to systems but very little to what we call systems thinking (an exception is Senge's mention of systems thinking on p. 58 and systems approach on p. 59).  There is no recognition of the importance of feedback loops.

The report refers multiple times to the dynamic interaction of the factors that comprise a SC but does not provide any model of those interactions.  There is limited connectivity between potentially successful interventions and desired changes in observable artifacts.  In other words, this literature review will not tell you how to improve your plant's decision making process or corrective action program, resolve goal conflicts or competing priorities, align management incentives with safety performance, or reduce your backlogs.


*  K.M. Branch and J.L. Olson, “Review of the Literature Pertinent to the Evaluation of Safety Culture Interventions” (Richland, WA: Pacific Northwest National Laboratory, Dec. 2011).  ADAMS ML13023A054

**  The authors note “The NRC safety culture traits could also be characterized as social norms.” (p. 28)

***  “. . . evaluation science focuses on helping stakeholders diagnose organization and social needs, design interventions, monitor intervention implementation, and design and implement an evaluation process to measure and assess the intended and unintended consequences that result as the intervention is implemented.” (p. 69)

Monday, August 16, 2010

SafetyMatters

“Jim Reason defines a safety culture as consisting of constellations of practices, most importantly, concerning reporting and learning. A safety culture, he argues, is both a reporting culture and a learning culture. Where safety is an organisation’s top priority, the organisation will aim to assemble as much relevant information as possible, circulate it, analyse it, and apply it.” *

This quote comes from a paper (cited below) that we recently posted about, and it reminds us of our purpose in writing this blog.

We have posted about Reason’s insights into safety culture and find his work to be particularly useful.  We hesitate to exploit such insights for our own purposes, but we can hardly resist noting that one of the core purposes of our SafetyMatters blog is to share and disseminate useful thinking about safety management issues.  We wonder if nuclear operating organizations take advantage of these materials and ensure their dissemination within their organizations.  Ditto for nuclear regulators and other industry organizations.  We observe the traffic to the blog and note that individuals within such organizations are regular visitors.  We hope they are not the only people within their organizations to have exposure to the exchange of ideas here at SafetyMatters.

Comments are always welcome.

*  Andrew Hopkins, "Studying Organisational Cultures and their Effects on Safety," paper prepared for presentation to the International Conference on Occupational Risk Prevention, Seville, May 2006 (National Research Centre for OHS Regulation, Australian National University), p. 16.

Wednesday, March 17, 2010

Dr. James Reason on Error Management

This podcast is excerpted from an interview of Dr. James Reason in association with VoiceMap (www.voicemap.net), a provider of live guidance applications to improve human performance.  In this first segment, Dr. Reason discusses his theory of how errors occur (person-based and system-based), including the existence of "error traps" within an organizational system.  Error traps are evident when different people make the same error, indicating some defect in the management system, such as bad or ambiguous procedures.

I believe error traps may also exist due to more intangible conditions, such as conflicting priorities or requirements that may create a bias toward compromising safety.  Such conditions act as a trap or decision "box" where compromising safety is viewed either as "okay" or as the only viable response, and even well-intentioned people can be subverted.  In contrast, the competing priorities may merely appear to be boxes, giving a lax person room to compromise safety.  In that case, the bias toward compromising safety actually originates in people who are predisposed to making the error, making it not a system-based error trap but a personal performance error.  How should the errors in reporting at Vermont Yankee be characterized?

Monday, March 15, 2010

Vermont Yankee (part 2) - What Would Reason Say?

The "Reason" in the title refers to Dr. James Reason, Professor Emeritus, Department of Psychology, University of Manchester.

“It is clear from in-depth accident analyses that some of the most powerful pushes towards local traps [characteristics of the workplace that lead people to compromise safety priorities] come from an unsatisfactory resolution of the inevitable conflict that exists (at least in the short-term) between the goals of safety and production. The cultural accommodation between the pursuit of these goals must achieve a delicate balance. On the one hand, we have to face the fact that no organization is just in the business of being safe. Every company must obey both the 'ALARP' principle (keep the risks as low as reasonably practicable) and the 'ASSIB' principle (and still stay in business). On the other hand, it is now increasingly clear that few organizations can survive a catastrophic organizational accident (Reason 1997).”

"Achieving a Safe Culture: Theory and Practice." (1998), p. 301.
 

Dr. Reason has been a leading and influential thinker in the area of safety and risk management in the workplace and the creation of safety culture in high risk industries.  Get to know Dr. Reason through his own words in future blog posts featuring some of his key insights.