
Friday, May 3, 2013

High Reliability Organizations and Safety Culture

On February 10th, we posted about a report covering lessons for safety culture (SC) that can be gleaned from the social science literature. The report's authors judged that high reliability organization (HRO) literature provided a solid basis for linking individual and organizational assumptions with traits and practices that can affect safety performance. This post explores HRO characteristics and how they can influence SC.

Our source is Managing the Unexpected: Resilient Performance in an Age of Uncertainty* by Karl Weick and Kathleen Sutcliffe. Weick is a leading contemporary HRO scholar. This book is clearly written, with many pithy comments, so lots of quotations are included below to present the authors' views in their own words.

What makes an HRO different?

Many organizations work with risky technologies where the consequences of problems or errors can be catastrophic, use complex management systems and exist in demanding environments. But successful HROs approach their work with a different attitude and practices, an “ongoing mindfulness embedded in practices that enact alertness, broaden attention, reduce distractions, and forestall misleading simplifications.” (p. 3)

Mindfulness

An underlying assumption of HROs is “that gradual . . . development of unexpected events sends weak signals . . . along the way” (p. 63) so constant attention is required. Mindfulness means that “when people act, they are aware of context, of ways in which details differ . . . and of deviations from their expectations.” (p. 32) HROs “maintain continuing alertness to the unexpected in the face of pressure to take cognitive shortcuts.” (p. 19) Mindful organizations “notice the unexpected in the making, halt it or contain it, and restore system functioning.” (p. 21)

It takes a lot of energy to maintain mindfulness. As the authors warn us, “mindful processes unravel pretty fast.” (p. 106) Complacency and hubris are two omnipresent dangers. “Success narrows perceptions, . . . breeds overconfidence . . . and reduces acceptance of opposing points of view. . . . [If] people assume that success demonstrates competence, they are more likely to drift into complacency, . . .” (p. 52) Pressure in the task environment is another potential problem. “As pressure increases, people are more likely to search for confirming information and to ignore information that is inconsistent with their expectations.” (p. 26) The opposite of mindfulness is mindlessness. “Instances of mindlessness occur when people confront weak stimuli, powerful expectations, and strong desires to see what they expect to see.” (p. 88)

Mindfulness can lead to insight and knowledge. “In that brief interval between surprise and successful normalizing lies one of your few opportunities to discover what you don't know.” (p. 31)**

Five principles

HROs follow five principles. The first three cover anticipation of problems and the remaining two cover containment of problems that do arise.

Preoccupation with failure

HROs “treat any lapse as a symptom that something may be wrong with the system, something that could have severe consequences if several separate small errors happened to coincide. . . . they are wary of the potential liabilities of success, including complacency, the temptation to reduce margins of safety, and the drift into automatic processing.” (p. 9)

Managers usually think surprises are bad, evidence of bad planning. However, “Feelings of surprise are diagnostic because they are a solid cue that one's model of the world is flawed.” (p. 104) HROs “Interpret a near miss as danger in the guise of safety rather than safety in the guise of danger. . . . No news is bad news. All news is good news, because it means that the system is responding.” (p. 152)

People in HROs “have a good sense of what needs to go right and a clearer understanding of the factors that might signal that things are unraveling.” (p. 86)

Reluctance to simplify

HROs “welcome diverse experience, skepticism toward received wisdom, and negotiating tactics that reconcile differences of opinion without destroying the nuances that diverse people detect. . . . [They worry that] superficial similarities between the present and the past mask deeper differences that could prove fatal.” (p. 10) “Skepticism thus counteracts complacency . . . .” (p. 155) “Unfortunately, diverse views tend to be disproportionately distributed toward the bottom of the organization, . . .” (p. 95)

The language people use at work can be a catalyst for simplification. A person may initially perceive something novel in the environment, but describing the experience in familiar or standard terms risks losing the early warning the person actually perceived.

Sensitivity to operations

HROs “are attentive to the front line, . . . Anomalies are noticed while they are still tractable and can still be isolated . . . . People who refuse to speak up out of fear undermine the system, which knows less than it needs to know to work effectively.” (pp. 12-13) “Being sensitive to operations is a unique way to correct failures of foresight.” (p. 97)

In our experience, nuclear plants are generally good in this regard; most include a focus on operations among their critical success factors.

Commitment to resilience

“HROs develop capabilities to detect, contain, and bounce back from those inevitable errors that are part of an indeterminate world.” (p. 14) “. . . environments that HROs face are typically more complex than the HRO systems themselves. Reliability and resilience lie in practices that reduce . . . environmental complexity or increase system complexity.” (p. 113) Because it's difficult or impossible to reduce environmental complexity, the organization needs to make its systems more complex.*** This requires clear thinking and insightful analysis. Unfortunately, actual organizational response to disturbances can fall short. “. . . systems often respond to a disturbance with new rules and new prohibitions designed to prevent the same disruption from happening in the future. This response reduces flexibility to deal with subsequent unpredictable changes.” (p. 72)

Deference to expertise

“Decisions are made on the front line, and authority migrates to the people with the most expertise, regardless of their rank.” (p. 15) Application of expertise “emerges from a collective, cultural belief that the necessary capabilities lie somewhere in the system and that migrating problems [down or up] will find them.” (p. 80) “When tasks are highly interdependent and time is compressed, decisions migrate down . . . Decisions migrate up when events are unique, have potential for very serious consequences, or have political or career ramifications . . .” (p. 100)

This is another ideal that can fail in practice. We've all seen decisions made by the highest ranking person rather than the most qualified one. In other words, “who is right” can trump “what is right.”

Relationship to safety culture

Much of the chapter on culture is based on the ideas of Schein and Reason so we'll focus on key points emphasized by Weick and Sutcliffe. In their view, “culture is something an organization has [practices and controls] that eventually becomes something an organization is [beliefs, attitudes, values].” (p. 114, emphasis added)

“Culture consists of characteristic ways of knowing and sensemaking. . . . Culture is about practices—practices of expecting, managing disconfirmations, sensemaking, learning, and recovering.” (pp. 119-120) A single organization can have different types of culture: an integrative culture that everyone shares, differentiated cultures that are particular to sub-groups and fragmented cultures that describe individuals who don't fit into the first two types. Multiple cultures support the development of more varied responses to nascent problems.

A complete culture strives to be mindful, safe and informed, with an emphasis on wariness. As HRO principles are ingrained in an organization, they become part of the culture. The goal is a strong SC that reinforces concern about the unexpected, is open to questions and reporting of failures, views close calls as failures, is fearful of complacency, resists simplifications, values diversity of opinions and focuses on imperfections in operations.

What else is in the book?

One chapter contains a series of audits (presented as survey questions) to assess an organization's mindfulness and appreciation of the five principles. The audits can show an organization's attitudes and capabilities relative to HROs and relative to its own self-image and goals.
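
To make the audit idea concrete, here is a minimal sketch of how such an audit might be tallied, assuming Likert-scale responses (1 = strongly disagree through 5 = strongly agree) grouped by principle. The groupings follow the book's five principles, but the scores and the flag threshold are our own illustrative assumptions, not Weick and Sutcliffe's scoring method.

    # Tally a mindfulness audit: average the Likert responses for each
    # principle and flag potential weak spots. Scores are hypothetical.
    from statistics import mean

    responses = {
        "preoccupation with failure": [4, 3, 5, 4],
        "reluctance to simplify": [2, 3, 3, 2],
        "sensitivity to operations": [5, 4, 4, 5],
        "commitment to resilience": [3, 3, 4, 3],
        "deference to expertise": [2, 2, 3, 2],
    }

    for principle, scores in responses.items():
        avg = mean(scores)
        flag = "  <- potential weak spot" if avg < 3.0 else ""
        print(f"{principle:28s} {avg:.2f}{flag}")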

The final chapter describes possible “small wins” a change agent (often an individual) can attempt to achieve in an effort to move his organization more in line with HRO practices, viz., mindfulness and the five principles. For example, “take your team to the actual site where an unexpected event was handled either well or poorly, walk everyone through the decision making that was involved, and reflect on how to handle that event more mindfully.” (p. 144)

The book's case studies include an aircraft carrier, a nuclear power plant,**** a pediatric surgery center and wildland firefighting.

Our perspective

Weick and Sutcliffe draw on the work of many other scholars, including Constance Perin, Charles Perrow, James Reason and Diane Vaughan, all of whom we have discussed in this blog. The book makes many good points. For example, the prescription for mindfulness and the five principles can contribute to an effective context for decision making although it does not comprise a complete management system. The authors recognize that reliability does not mean a complete lack of performance variation; instead, reliability follows from practices that recognize and contain emerging problems. Finally, there is evidence of a systems view, which we espouse, when the authors say “It is this network of relationships taken together—not necessarily any one individual or organization in the group—that can also maintain the big picture of operations . . .” (p. 142)

The authors would have us focus on nascent problems in operations, which is obviously necessary. But another important question is: what are the faint signals that the SC itself is developing problems? What are the precursors to the obvious signs, like increasing backlogs of safety-related work? Could that “human error” that recently occurred be a sign of a SC that is more forgiving of growing organizational mindlessness?

Bottom line: Safetymatters says check out Managing the Unexpected and consider adding it to your library.


* K.E. Weick and K.M. Sutcliffe, Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2d ed. (San Francisco, CA: Jossey-Bass, 2007). Also, Wikipedia has a very readable summary of HRO history and characteristics.

** More on normalization and rationalization: “On the actual day of battle naked truths may be picked up for the asking. But by the following morning they have already begun to get into their uniforms.” E.A. Cohen and J. Gooch, Military Misfortunes: The Anatomy of Failure in War (New York: Vintage Books, 1990), p. 44, quoted in Managing the Unexpected, p. 31.

*** The prescription to increase system complexity to match the environment is based on the system design principle of requisite variety, which means “if you want to cope successfully with a wide variety of inputs, you need a wide variety of responses.” (p. 113)

**** I don't think the authors performed any original research on nuclear plants. But the studies they reviewed led them to conclude that “The primary threat to operations in nuclear plants is the engineering culture, which places a higher value on knowledge that is quantitative, measurable, hard, objective, and formal . . . HROs refuse to draw a hard line between knowledge that is quantitative and knowledge that is qualitative.” (p. 60)

Thursday, March 7, 2013

Schein at INPO in 2003



In November 2003 Professor Edgar Schein gave a speech at the INPO CEO conference.*  It was not a lengthy academic lecture but his focus on managing culture, as opposed to changing or creating it, was interesting.  At the time Schein was doing some work for ConEd and had a notion of nuclear plant culture, which he divided into four sub-cultures: engineering, hourly, operator and executive, each with its own underlying assumptions and values.

The engineering culture emphasizes elegant, possibly expensive designs that minimize the role of error-prone humans.  Engineers want and value respect from other engineers, including those outside the plant (an external orientation). 

The hourly culture (which I think means maintenance) values teamwork and has an experience-based perspective on safety.  They want job security, fair wages, good equipment, adequate training and respect from their peers and supervisors.

The operator culture values teamwork and open communications.  They see the invaluable contributions they make to keeping the plant running safely and efficiently.  They want the best equipment, training and to be recognized for their contributions.

The executive culture is about money.  They want productivity, cost control, safety and good relations with their boards of directors (another external orientation).

These sub-cultures are in conflict because they can't all have everything they want.  The executive needs to acknowledge that cultural differences exist and that each sub-culture brings certain strengths to the table.  The executive's role is to create a climate of mutual respect and to work toward aligning the sub-cultures to achieve common goals, e.g., safety.  The executive should not be trying to impose the values of a single sub-culture on everyone else.  In other words, the executive should be a culture manager, not a culture changer.

This was a brief speech and I don't want to read too much into it.  There are dysfunctional or no longer appropriate cultures and they have to be reworked, i.e., changed.  But if many things are working OK, then build on the existing strengths.**

This was not a speech about cultural interventions.  At the beginning, Schein briefly described his tri-level cultural model and noted if the observed artifacts match the espoused values, then there's no need to analyze the underlying assumptions.  This is reminiscent of Commissioner Apostolakis' comment that “. . . we really care about what people do and maybe not why they do it . . . .”


*  E.H. Schein, “Keeping the Edge: Enhancing Performance Through Managing Culture,” speech at INPO CEO Conference (Nov. 7, 2003).  I came across this speech while reviewing the resources listed for a more contemporary DOE conference.

**  Focusing on strengths (and not wasting resources trying to shore up weaknesses unless they constitute a strategic threat) is a management prescription first promoted by Peter Drucker.

Sunday, February 10, 2013

Safety Culture - Lessons from the Social Science Literature

In 2011 the NRC contracted with the Pacific Northwest National Laboratory to conduct a review of social science literature related to safety culture (SC) and methods for evaluating interventions proposed to address issues identified during SC assessments.  The resultant report* describes how traits such as leadership, trust, respect, accountability, and continuous learning are discussed in the literature. 

The report is heavily academic but not impenetrable and a good reference work on organizational culture theory and research.  I stumbled on this report in ADAMS and don't know why it hasn't had wider distribution.  Perhaps it's seen as too complicated or, more importantly, doesn't exactly square with the NRC/NEI/industry Weltanschauung when the authors say things like:  

“There is no simple recipe for developing safety culture interventions or for assessing the likelihood that these interventions will have the desired effects.” (p. 2)

“The literature consistently emphasizes that effecting directed behavioral, cognitive, or cultural change in adults and within established organizations is challenging and difficult, requires persistence and energy, and is frequently unsuccessful.” (p. 7)

This report contains an extensive review of the literature and it is impossible to summarize in a blog post.  We'll provide an overview of the content, focusing on interesting quotes and highlights, then revisit Schein's model and close with our two cents worth.

Concept of safety culture

This section begins with the definition of SC and the nine associated traits in the NRC SC policy statement, and compares them with other organizations' (IAEA, NEI, DOE et al) efforts. 

The Schein model is proposed as a way to understand “why things are as they are” as a starting point upon which to build change strategies aimed at improving organizational performance.  An alternative approach is to define the characteristics of an ideal SC, then evaluate how much the target organization differs from the ideal, and use closing the gap as the objective for corrective strategies.  The NEI approach to SC assessment reflects the second conceptual model.  A third approach, said to bridge the difference between the first two, is proposed by holistic thinkers such as Reason who focus on overall organizational culture. 
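
As a rough illustration of the second approach, the following sketch scores an organization against an ideal and ranks the gaps.  The trait names echo the NRC SC traits, but the numbers and the five-point scale are invented for illustration; no actual assessment protocol is being reproduced here.

    # Gap analysis against an "ideal" safety culture: score each trait,
    # subtract from the ideal, and rank. All numbers are hypothetical.
    ideal = 5.0  # ideal score on every trait, five-point scale
    assessed = {
        "leadership": 4.1,
        "problem identification and resolution": 3.2,
        "work processes": 4.4,
        "continuous learning": 3.0,
    }

    gaps = {trait: ideal - score for trait, score in assessed.items()}
    # The largest gaps become the targets of corrective strategies.
    for trait, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
        print(f"{trait:40s} gap = {gap:.1f}")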

This is not the usual “distinction without a difference” argument that academics often wage.  Schein's objective is to improve organizational performance; the idealists' objective is to make an organization correspond to the ideal model with an assumption that desired performance will follow. 

The authors eventually settle on the high reliability organization (HRO) literature as providing the best basis for linking individual and organizational assumptions with traits and mechanisms for affecting safety performance.  Why?  The authors say the HRO approach identifies some of the specific mechanisms that link elements of a culture to safety outcomes and identifies important relationships among the cultural elements. (p. 15)  A contrary explanation is that the authors wanted to finesse their observation that Schein (beloved by NRC) and NEI have different views of the basis that should be used for designing SC improvement initiatives.

Building blocks of culture 


The authors review the “building blocks” of culture, highlighting areas that correspond to the NRC safety culture traits.  If an organization wants to change its culture, it needs to decide which building blocks to address and how to make and sustain changes.

Organizational characteristics that correspond to NRC SC traits include leadership, communication, work processes, and problem identification and resolution.  Leadership and communication are recognized as important in the literature and are discussed at length.  However, the literature review offered thin gruel in the areas of work processes, and problem identification and resolution; in other words, the connections between these traits and SC are not well-defined. (pp. 20-25)

There is an extensive discussion of other building blocks including perceptions, values, attitudes, norms**, beliefs, motivations, trust, accountability and respect.  Implications for SC assessment and interventions are described, where available.  Adaptive processes such as sense making and double-loop learning are also mentioned.

Change and change management

The authors review theories of individual and organizational change and change management.  They note that planned interventions need to consider other changes that may be occurring because of dynamic processes between the organization and its environment and within the organization itself.

Many different models for understanding and effecting organizational change are described.  As the authors summarize: “. . . change is variously seen as either pushed by problems or pulled by visions or goals; as purposive and volitional or inadvertent and emergent; as a one-time event or a continuous process. It is never seen as easy or simple.” (p. 43)

The authors favor Montaño and Kasprzyk’s Integrated Behavioral Model, shown in the figure below, as a template for designing and evaluating SC interventions.  It may be hard to read here but suffice it to say a lot of factors go into an individual's decision to perform a new behavior and most or all of these factors should be considered by architects of SC interventions.  Leadership can provide input to many of these factors (through communication, modeling desired behavior, including decision making) and thus facilitate (or impede) desired behavioral changes.



From Montaño and Kasprzyk
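
To suggest the flavor of the model, here is a minimal sketch in which several factors combine into an intention to perform a new behavior.  The factor names follow Montaño and Kasprzyk's general constructs, but the equal weighting and the scores are invented; the actual model is estimated from behavioral research, not hand-weighted like this.

    # Sketch of the Integrated Behavioral Model's core structure: weighted
    # factors feed an intention to perform a new behavior. Weights and
    # scores here are purely illustrative.
    def intention(scores, weights):
        """Weighted average (0-1) of the factors feeding intention."""
        total = sum(weights.values())
        return sum(weights[f] * scores[f] for f in weights) / total

    scores = {
        "experiential attitude": 0.7,  # does the behavior feel good or bad?
        "instrumental attitude": 0.8,  # is it expected to pay off?
        "injunctive norm": 0.5,        # do valued others approve?
        "descriptive norm": 0.4,       # do others actually do it?
        "personal agency": 0.6,        # perceived control and self-efficacy
    }
    weights = {factor: 1.0 for factor in scores}  # equal weights, for illustration

    print(f"intention = {intention(scores, weights):.2f}")
    # In the model, knowledge, salience, habit and environmental
    # constraints then moderate whether intention becomes behavior.
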
Resistance to change can be widespread.  Effective leadership is critical to overcoming resistance and implementing successful cultural changes.  “. . . leaders in formal organizations have the power and responsibility to set strategy and direction, align people and resources, motivate and inspire people, and ensure that problems are identified and solved in a timely manner.” (p. 54)

Lessons from initiatives to create other specific organizational cultures

The authors review the literature on learning organizations, total quality management and quality organizations, and sustainable organizations for lessons applicable to SC initiatives.  They observe that this literature “is quite consistent in emphasizing the importance of recognizing that organizations are multi-level, dynamic systems whose elements are related in complex and multi-faceted ways, and that culture mirrors this dynamic complexity, despite its role in socializing individuals, maintaining stability, and resisting change.” (p. 61)

“The studies conducted on learning, quality, and sustainable organizations and their corresponding cultures contain some badly needed information about the relationship among various traits, organizational characteristics, and behaviors that could help inform the assessment of safety cultures and the design and evaluation of interventions.” (p. 65)  Topics mentioned include management leadership and commitment, trust, respect, shared vision and goals, and a supportive learning environment.

Designing and evaluating targeted interventions 


This section emphasizes the potential value of the evaluation science*** approach (used primarily in health care) for the nuclear industry.  The authors go through the specific steps for implementing the evaluation science model, drilling down in spots to describe additional tools, such as logic modeling (to organize and visualize issues, interventions and expected outcomes), that can be used.  There is a lot of detail here including suggestions for how the NRC might use backward mapping and a review of licensee logic models to evaluate SC assessment and intervention efforts.  Before anyone runs off to implement this approach, there is a major caveat:

“The literature on the design, implementation, and evaluation of interventions to address identified shortcomings in an organization’s safety culture is sparse; there is more focus on creating a safety culture than on intervening to correct identified problems.” (p. 67)
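
With that caveat noted, a minimal sketch of the logic-model structure itself may still be useful.  It shows the chain from inputs through activities and outputs to outcomes, and how backward mapping walks the chain in reverse; all entries are hypothetical.

    # A logic model as a simple ordered chain. Backward mapping starts
    # from the desired outcomes and asks what is supposed to produce them.
    logic_model = {
        "inputs": ["SC assessment findings", "management sponsorship"],
        "activities": ["supervisor coaching", "revised CAP screening"],
        "outputs": ["coaching sessions held", "backlog trend reports"],
        "outcomes": ["more low-level problem reports", "shrinking backlog"],
    }

    for stage in reversed(list(logic_model)):  # outcomes first
        print(f"{stage:12s} <- {', '.join(logic_model[stage])}")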

Relation to Schein

Schein's model of culture (shown on p. 8) and prescriptions for interventions are the construct most widely known to the nuclear industry and its SC practitioners.  His work is mentioned throughout the PNNL report.  Schein assumes that cultural change is a top-down effort (so leadership plays a key role) focused on individuals.  Change is implemented using an unfreeze—replace/move—refreeze strategy.  Schein's model is recommended in the program theory-driven evaluation science approach.  The authors believe Schein's “description of organizational culture and change does one of the best jobs of conveying the “cultural” dimensions in a way that conveys its embeddedness and complexity.” (p. 108)  The authors note that Schein's cultural levels interact in complex ways, requiring a systems approach that relates the levels to each other, SC to the larger organizational culture, and culture to overall organizational functioning.

So if you're acquainted with Schein you've got solid underpinnings for reading this report even if you've never heard of any of the over 300 principal authors (plus public agencies and private entities) mentioned therein.  If you want an introduction to Schein, we have posted on his work here and here.

Conclusion

This is a comprehensive and generally readable reference work.  SC practitioners should read the executive summary and skim the rest to get a feel for the incredible number of theorists, researchers and institutions who are interested in organizational culture in general and/or SC in particular.  The report will tell you what a culture consists of and how you might go about changing it.

We have a few quibbles.  For example, there are many references to systems but very little to what we call systems thinking (an exception is Senge's mention of systems thinking on p. 58 and systems approach on p. 59).  There is no recognition of the importance of feedback loops.

The report refers multiple times to the dynamic interaction of the factors that comprise a SC but does not provide any model of those interactions.  There is limited connectivity between potentially successful interventions and desired changes in observable artifacts.  In other words, this literature review will not tell you how to improve your plant's decision making process or corrective action program, resolve goal conflicts or competing priorities, align management incentives with safety performance, or reduce your backlogs.


*  K.M. Branch and J.L. Olson, “Review of the Literature Pertinent to the Evaluation of Safety Culture Interventions” (Richland, WA: Pacific Northwest National Laboratory, Dec. 2011).  ADAMS ML13023A054

**  The authors note “The NRC safety culture traits could also be characterized as social norms.” (p. 28)

***  “. . . evaluation science focuses on helping stakeholders diagnose organization and social needs, design interventions, monitor intervention implementation, and design and implement an evaluation process to measure and assess the intended and unintended consequences that result as the intervention is implemented.” (p. 69)

Friday, October 5, 2012

The Corporate Culture Survival Guide by Edgar Schein

Our September 21, 2012 post introduced a few key elements of Prof. Edgar Schein’s “mental model” of organizational culture.  Our focus in that post was to decry how Schein’s basic construct of culture had been adopted by the nuclear industry but then twisted to fit company and regulatory desires for simple-minded mechanisms for assessing culture and cultural interventions.

In this post, we want to expand on Schein’s model of what culture is, how it can be assessed, and how its evolution can be influenced by management initiatives.  Where appropriate, we will provide our perspective based on our beliefs and experience.  All the quotes below come from Schein’s The Corporate Culture Survival Guide.*

What is Culture?

Schein’s familiar model shows three levels of culture: artifacts, espoused values and underlying assumptions.  In his view, the real culture is the bottom level: “Culture is the shared tacit assumptions of a group that have been learned through coping with external tasks and dealing with internal relationships.” (p. 217)  The strength of an organization’s culture is a function of the intensity of shared experiences and the relative success the organization has achieved.  “Culture . . . influences how you think and feel as well as how you act.” (p. 75)  Culture is thus a product of social learning. 

Our view does not conflict with Schein’s.  In our systems approach, culture is a variable that provides context for, but does not solely determine, organizational and individual decisions. 

How can Culture be Assessed?

Surveys

“You cannot use a survey to assess culture.” (p. 219)  The specific weaknesses of surveys are discussed elsewhere (pp. 78-80) but his bottom line is good enough for us.  We agree completely.

Interviews

Individual interviews can be used when interviewees would be inhibited in a group setting but Schein tries to avoid them in favor of group interviews because the latter are more likely to correctly identify the true underlying assumptions. 

In contrast, the NEI and IAEA safety culture evaluation protocols use interviews extensively, and we’ve commented on them here and here.

Group discussion 


Schein’s recommended method for deciphering a company’s culture is a facilitated group exercise that attempts to identify the deeper (real) assumptions that drive the creation of artifacts by looking at conflicts between the artifacts and the espoused values. (pp. 82-87)   

How can Culture be Influenced?

In Schein’s view, culture cannot be directly controlled but managers can influence and evolve a culture.  In fact, “Managing cultural evolution is one of the primary tasks of leadership.” (p. 219)

His basic model for cultural change is creating the motivation to change, followed by learning and then internalizing new concepts, meanings and standards. (p. 106)  This can be a challenging effort; resistance to change is widespread, especially if the organization has been successful in the past.  Implementing change involves motivating people to change by increasing their survival anxiety or guilt, then promoting new ways of thinking, which can lead to learning anxiety (fear of loss or failure).  Learning anxiety can be ameliorated by increasing the learner’s psychological safety through multiple steps, including training, role models and consistent systems and structures.  Our promotion of simulation is based on our belief that simulation can provide a platform for learners to practice new behaviors in a controlled and forgiving setting.

If time is of the essence or major transformational change is necessary, then the situation requires the removal and replacement of the key cultural carriers.  Replacement of management team members has often occurred at nuclear plants to address perceived performance/culture issues.
 
Schein says employees can be coerced into behaving differently but they will only internalize the new ways of doing business if the new behavior leads to better outcomes.  That may be true but we tend toward a more pragmatic approach and agree with Commissioner Apostolakis when he said: “. . . we really care about what people do and maybe not why they do it . . . .”

Bottom Line
Prof. Schein has provided a powerful model for visualizing organizational culture and we applaud his work.  Our own modeling efforts incorporate many of his factors, although not always in the same words.  In addition, we consider other factors that influence organizational behavior and feed back into culture, e.g., the priorities and resources provided by a corporate parent.


*  E.H. Schein, The Corporate Culture Survival Guide, new and revised ed. (San Francisco: Jossey-Bass, 2009).  

Friday, September 21, 2012

SafetyMatters and the Schein Model of Culture

A reader recently asked: “Do you subscribe to Edgar Schein's culture model?”  The short-form answer is a qualified “Yes.”  Prof. Schein has developed significant and widely accepted insights into the structure of organizational culture.  In its simplest form, his model of culture has three levels: the organization’s (usually invisible) underlying beliefs and assumptions, its espoused values, and its visible artifacts such as behavior and performance.  He describes the responsibility of management, through its leadership, to articulate the espoused values with policies and strategies and thus shape culture to align with management’s vision for the organization.  Schein’s is a useful mental model for conceptualizing culture and management responsibilities.*     

However, we have issues with the way some people have applied his work to safety culture.  For starters, there is the apparent belief that these levels are related in a linear fashion, more particularly, that management by promulgating and reinforcing the correct values can influence the underlying beliefs, and together they will guide the organization to deliver the desired behaviors, i.e., the target level of safety performance.  This kind of thinking has problems.

First, it’s too simplistic.  Safety performance doesn’t arise only because of management’s espoused values and what the rest of the organization supposedly believes.  As discussed in many of our posts, we see a much more complex, multidimensional and interactive system that yields outcomes which reflect, in greater or lesser terms, desired levels of safety.  We have suggested that it is the totality of such outcomes that is representative of the safety culture in fact.** 

Second, it leads to attempts to measure and influence safety culture that are often ineffective and even misleading.  We wonder whether the heavy emphasis on values and leadership attitudes and behaviors - or traits - that the Schein model encourages, creates a form versus substance trap.  This emphasis carries over to safety culture surveys - currently the linchpin for identifying and “correcting” deficient safety culture -  and even doubles down by measuring the perception of attitudes and behaviors.  While attitudes and behaviors may in fact have a beneficial effect on the organizational environment in which people perform - we view them as good habits - we are not convinced they are the only determinants of the actions, decisions and choices made by the organization.  Is it possible that this approach creates an organization more concerned with how it looks and how it is perceived than with what it does?   If everyone is checking their safety likeness in the cultural mirror might this distract from focusing on how and why actual safety-related decisions are being made?

We think there is good support for our skepticism.  For every significant safety event in recent years - the BP refinery fire, the Massey coal mine explosion, the shuttle disasters, the Deepwater oil rig explosion, and the many instances of safety culture issues at nuclear plants - the organization and senior management had been espousing as their belief that “safety is the highest priority.”  Clearly that was more illusion than reality.

To give a final upward thrust to the apple cart, we don’t think that the current focus on nuclear safety culture is primarily about culture.  Rather we see “safety culture” more as a proxy for management’s safety performance - and perhaps a back door for the NRC to regulate while disclaiming same.*** 


*  We have mentioned Prof. Schein in several prior blog posts: June 26, 2012, December 8, 2011, August 11, 2010, March 29, 2010, and August 17, 2009.

**  This past year we have posted several times on decisions as one type of visible result (artifact) of the many variables that influence organizational behavior.  In addition, please revisit two of Prof. Perin’s case studies, summarized here.  They describe well-intentioned people, who probably would score well on a safety culture survey, who made plant problems much worse through a series of decisions that had many more influences than management’s entreaties and staff’s underlying beliefs.

***  Back in 2006, the NRC staff proposed to enhance the ROP to more fully address safety culture, saying that “Safety culture includes . . . features that are not readily visible such as basic assumptions and beliefs of both managers and individuals, which may be at the root cause of repetitive and far-reaching safety performance problems.”  It wouldn’t surprise us if that’s an underlying assumption at the agency.  See L.A. Reyes to the Commissioners, SECY-06-0122 “Policy Issue Information: Safety Culture Initiative Activities to Enhance the Reactor Oversight Process and Outcomes of the Initiatives” (May 24, 2006) p. 7 ADAMS ML061320282.  

Tuesday, June 26, 2012

Modeling Safety Culture (Part 1)

Our June 12th post on the nature of decision making raised concerns about current perceptions of safety culture and the lack of a crisp mental model.  We contended that decisions were the critical manifestation of safety culture and should be understood as an ongoing process to achieve superior performance across all key organizational assets.  A recent post on LinkedIn by our friend Bill Mullins provided a real world example of this process from his days as a Rad Protection Manager.

“As a former Plant Radiation Protection Manager with lots of outage experience, my risk-balancing challenge arose across an evolving portfolio of work…We had to make allocations of finite human capital - radiation protection technicians, supervisors, and radiological engineers - day in a day out, in a way that matched the tempo of the ‘work proceeding safely.’"*

What would a model of safety culture look like?  In terms of a model that describes how safety culture is operationalized, there is not much to cite.  NEI has weighed in with a “safety culture process” diagram which may or may not be a model but includes elements such as CAP that one might expect to see in a model.  A fundamental consideration of any model is how to represent safety culture; does safety culture “determine” actions taken by an organization (a causal relationship), or just provide a context within which actions are taken, or is it really a product, or integration, of the actions taken?   

There is a very interesting overview of these issues in an article by M. D. Cooper titled, appropriately, “Toward a Model of Safety Culture.”**  One intriguing assertion by the author is that safety culture must be able to be managed and manipulated, contrary to many, including Schein, who take a different view (that it is inherent in the social system). (p. 116)  In another departure from Schein, Cooper finds fault with a “linear” view of safety culture where attitudes directly result in behaviors. (p. 122)  Ultimately Cooper suggests an approach where reciprocal relationships between personal and situational aspects yield what we view as culture.  (This article is also worth a read for the observations about the limits of safety culture surveys and whether the goal of initiatives taken in response to surveys is improving safety culture—or improving safety culture survey results.)

Our own view is more in the direction of Cooper.  We think safety culture can be thought of as a force or pressure within the organization to ensure that actions and decisions reflect safety.  But safety competes with other forces arising from competing business goals, incentives and even personal interests.  The actual actions and decisions turn on the combined balance of these various pressures.***  Over time the integrated effect of the actions manifest the true priority of safety, and thus the safety culture.  

Such a process is not linear, thus to the question of does safety culture determine outcomes or vice versa, the answer is “yes”.  The diagram below illustrates the basic relationships between safety culture, management actions, business performance and safety performance. It is a cyclic and continuously looping process, driven by goals and modulated by results.  The basic idea is that safety culture exists in an equilibrium with safety and business performance much of the time.  However when business performance cannot meet its goals, it creates pressure on management and its ability to continue to give safety the appropriate priority.  (A larger figure with additional explanatory notes is available here.)
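
For readers who want to see the loop mechanics, here is a minimal discrete-time sketch of the relationships just described: results falling short of goals create pressure, and sustained pressure erodes the priority given to safety.  All coefficients are invented for illustration; nothing here is calibrated to real plant data.

    # One feedback loop from the diagram: business shortfall -> pressure
    # on management -> eroding safety priority. Illustrative numbers only.
    safety_priority = 1.0          # relative weight given to safety
    business_goal, results = 1.0, 0.9

    for quarter in range(1, 9):
        pressure = max(0.0, business_goal - results)       # shortfall vs. goal
        safety_priority -= 0.3 * pressure                  # pressure erodes priority
        safety_priority += 0.05 * (1.0 - safety_priority)  # reinforcement partially restores it
        results += 0.1 * pressure                          # pushing harder lifts results
        print(f"Q{quarter}: pressure={pressure:.2f}, "
              f"safety_priority={safety_priority:.2f}")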




*  The link to the thread (including Bill's comment) is here.  This may be difficult for readers who are not LinkedIn members to access.

**  M.D. Cooper, “Toward a Model of Safety Culture,” Safety Science 36 (2000): 111-136.

*** As summarized in an MIT Sloan Management Review article we blogged about on Sept. 1, 2010, “All decisions . . . are values-based.  That is, a decision necessarily involves an implicit or explicit trade-off of values.”  Safety culture is merely one of the values that is involved in this computation.

Thursday, December 8, 2011

Nuclear Industry Complacency: Root Causes

NRC Chairman Jaczko, addressing the recent INPO CEO conference, warned about possible increasing complacency in the nuclear industry.*  To support his point, he noted the two plants in column four of the ROP Action Matrix and two plants in column three, the increased number of special inspections in the past year, and the three units in extended shutdowns.  The Chairman then moved on to discuss other industry issues. 

The speech spurred us to ask: Why does the risk of complacency increase over time?  Given our interest in analyzing organizational processes, it should come as no surprise that we believe complacency is more complicated than the lack of safety-related incidents leading to reduced attention to safety.

An increase in complacency means that an organization’s safety culture has somehow changed.  Causes of such change include shifts in the organization’s underlying assumptions and decay.

Underlying Assumptions

We know from the Schein model that underlying assumptions are the bedrock for culture.  One can take those underlying assumptions and construct an (incomplete) mental model of the organization—what it values, how it operates and how it makes decisions.  Over time, as the organization builds an apparently successful safety record, the mental weights that people assign to decision factors can undergo a subtle but persistent shift to favor the visible production and cost goals over the inherently invisible safety factor.  At the same time, opportunities exist for corrosive issues, e.g., normalization of deviance, to attach themselves to the underlying assumptions.  Normalization of deviance can manifest anywhere, from slipping maintenance standards to a greater tolerance for increasing work backlogs.

Decay

An organization’s safety culture will inevitably decay over time absent effective maintenance.  In part this is caused by the shift in underlying assumptions.  In addition, decay results from saturation effects.  Saturation occurs because beating people over the head with either the same thing, e.g., espoused values, or too many different things, e.g., one safety program or similar intervention after another, has lower and lower marginal effectiveness over time.  That’s one reason new leaders are brought in to “problem” plants: to boost the safety culture by using a new messenger with a different version of the message, reset the decision making factor weights and clear the backlogs.
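
The decay-and-saturation dynamic can be sketched in a few lines: culture erodes absent maintenance, and each repetition of the same intervention lands more weakly than the last.  The coefficients are purely illustrative assumptions.

    # Safety culture decay with periodically repeated interventions whose
    # marginal effect saturates. All coefficients are illustrative.
    culture = 1.0                    # culture "strength", arbitrary units
    decay_rate = 0.05                # per-period erosion absent reinforcement
    boost, saturation = 0.15, 0.65   # intervention effect and its fade factor

    for period in range(1, 13):
        culture *= 1.0 - decay_rate      # gradual decay
        if period % 4 == 0:              # quarterly intervention
            culture += boost
            boost *= saturation          # each repeat lands more weakly
        print(f"period {period:2d}: culture = {culture:.2f}")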

None of this is new to regular readers of this blog.  But we wanted to gather our ideas about complacency in one post.  Complacency is not some free-floating “thing,” it is an organizational trait that emerges because of multiple dynamics operating below the level of clear visibility or measurement.  

     
*  G.B. Jaczko, Prepared Remarks at the Institute of Nuclear Power Operations CEO Conference, Atlanta, GA (Nov. 10, 2011), p. 2, ADAMS Accession Number ML11318A134.

Wednesday, August 11, 2010

Down Under Perspective on Surveys

Now from Australia we have come across more research results related to some of the key findings we discussed in our August 2, 2010 post “Mission Impossible”. Recall from that post that research comparing the results of safety surveys prior to a significant event at an offshore oil platform with post-event investigations, revealed significant differences in cultural attributes.

This 2006 paper* draws on a variety of other published works and the author’s own experience in analyzing major safety events. Note that the author refers to safety culture surveys as “perception surveys”, since they focus on people’s perceptions of attitudes, values and behaviors.

“The survey method is well suited to studying individual attitudes and values and it might be thought that the method is thereby biased in favour of a definition of culture in these terms. However, the survey method is equally suited to studying practices, or ‘the way we do things around here’. The only qualification is that survey research of ‘the way we do things around here’ necessarily measures people’s perceptions rather than what actually happens, which may not necessarily coincide.” (p. 5) As we have argued, and this paper agrees, it is actual behaviors and outcomes that are most important. The question is, can actual behaviors be discerned or predicted on the basis of surveys? The answer is not clear.

“The question of whether or how the cultures so identified [e.g., by culture surveys] impact on safety is a separate question. Mearns and co-workers argue that there is some, though rather limited, evidence that organisations which do well in safety climate surveys actually have fewer accidents” (p. 14 citing Mearns et al)**

I kind of liked a distinction made early on in the paper: it is better to ascertain an organization’s “culture” and then assess the impact of that culture on safety than to directly assess “safety culture”. This approach emphasizes the internal dynamics and the interaction of values and safety priorities with other competing business and environmental pressures. As this paper notes, “. . . the survey method tells us very little about dynamic processes - how the organisation goes about solving its problems. This is an important limitation. . . . Schein makes a similar point when he notes that members of a culture are most likely to reveal themselves when they have problems to solve. . . .” (p. 6)

*  Andrew Hopkins, "Studying Organisational Cultures and their Effects on Safety," paper prepared for presentation to the International Conference on Occupational Risk Prevention, Seville, May 2006 (National Research Centre for OHS Regulation, Australian National University).

**  K. Mearns, S. Whitaker and R. Flin, “Safety climate, safety management practices and safety performance in offshore environments,” Safety Science 41, no. 8 (Oct. 2003): 641-680.

Monday, March 29, 2010

Well Done by NRC Staffer

To support the discussion items on this blog we spend time ferreting out interesting pieces of information that bear on the issue of nuclear safety culture and promote further thought within the nuclear community. This week brought us to the NRC website and its Key Topics area.

As probably most of you are aware, the NRC hosted a workshop in February of this year for further discussions of safety culture definitions. In general we believe that the amount of time and attention being given to definitional issues currently seems to be at the point of diminishing returns. When we examine safety culture performance issues that arise around the industry, it is not apparent that confusion over the definition of safety culture is a serious causal factor, i.e., that someone was thinking of the INPO definition of safety culture instead of the INSAG one or the Schein perspective. Perhaps it is a necessary step in the process, but to us what is interesting, and of paramount importance, is this: what causes disconnects between safety beliefs and the actions actually taken, and what can be done about them?


Thus, it was heartening and refreshing to see a presentation that addressed the key issue of culture and actions head-on. Most definitions of safety culture are heavy on descriptions of commitment, values, beliefs and attributes and light on the actual behaviors and decisions people make everyday. However, the definition that caught our attention was:


“The values, attitudes, motivations and knowledge that affect the extent to which safety is emphasized over competing goals in decisions and behavior.”

(Dr. Valerie Barnes, USNRC, “What is Safety Culture”, PowerPoint presentation, NRC workshop on safety culture, February 2010, p. 13)

This definition acknowledges the existence of competing goals and the need to address the bottom line manifestation of culture: decisions and actual behavior. We would prefer “actions” to “behavior” as it appears that behavior is often used or meant in the context of process or state of mind. Actions, as with decisions, signify to us the conscious and intentional acts of individuals. The definition also focuses on results in another way - “the extent to which safety is emphasized . . . in decisions. . . .” [emphasis added] What counts is not just the act of emphasizing, i.e., stressing or highlighting, safety but the extent to which safety impacts decisions made, or actions taken.


For similar reasons we think Dr. Barnes' definition is superior to the definition that was the outcome of the workshop:


“Nuclear safety culture is the core values and behaviors resulting from a collective commitment by leaders and individuals to emphasize safety over competing goals to ensure protection of people and the environment.”


(Workshop Summary, March 12, 2010, ADAMS Accession Number ML100700065, p. 2)


As we previously argued in a 2008 white paper:


“. . . it is hard to avoid the trap that beliefs may be definitive but decisions and actions often are much more nuanced. . . .


"First, safety management requires balancing safety and other legitimate business goals, in an environment where there are few bright lines defining what is adequately safe, and where there are significant incentives and penalties associated with both types of goals. As a practical matter, ‘Safety culture is fragile.....a balance of people, problems and pressures.’


"Second, safety culture in practice is “situational”, and is continually being re-interpreted based on people’s actual behaviors and decisions in the safety management process. Safety culture beliefs can be reinforced or challenged through the perception of each action (or inaction), yielding an impact on culture that can be immediate or incubate gradually over time.”


(Robert Cudlin, "Practicing Nuclear Safety Management," March 2008, p. 3)


We hope the Barnes definition gets further attention and helps inform this aspect of safety culture policy.

Monday, August 17, 2009

Safety Culture Assessment

A topic that we will visit regularly is the use of safety culture assessments to assign quantitative values to the condition of a specific organization and even the individual departments and working groups within the organization.  One reason for this focus is the emphasis on safety culture assessments as a response to situations where organizational performance does not meet expectations and “culture” is believed to be a factor.  Both the NRC and the nuclear industry appear aligned on the use of assessments as a response to performance issues and even as an ongoing prophylactic tool.  But, are these assessments useful?  Or accurate?  Do they provide insights into the origins of cultural deficiencies?

One question that frequently comes to mind is, can safety culture be separated from the manifestation of culture in terms of the specific actions and decisions taken by an organization?  For example, if an organization makes some decisions that are clearly at odds with “safety being the overriding priority”, can the culture of the organization not be deficient?  But if an assessment of the culture is performed, and the espoused beliefs and priorities are generally supportive of safety, what is to be made of those responses? 

The reference material for this post comes from some work led by the late Bernhard Wilpert of the Berlin University of Technology.  (We will sample a variety of his work in the safety management area in future posts.)  It is a brief slide presentation titled, “Challenges and Opportunities of Assessing Safety Culture”.  Slide 3, for example, revisits E. H. Schein’s multi-dimensional formulation of safety culture, which suggests that assessments must be able to expose all levels of culture and their integrated effect.

Two observations from these slides seem of particular note.  They are both under Item 4, Methodological Challenges.  The first observation is that culture is not a quantifiable phenomenon and does not lend itself easily to benchmarking.  This bears consideration as most assessment methods being used today employ some statistical comparison to assessments at other plants, including percentile-type rankings.  The other observation in the slide is that culture results from the learning experience of its members.  This is of particular interest to us as it supports some of the thinking associated with a system dynamics approach.  A systems view involves the development of shared “mental models” of how safety management “works”; the goal being that individual actions and decisions can be understood within a commonly understood framework.  The systems process becomes, in essence, the mechanism for translating beliefs into actions.
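
To illustrate the first observation, here is a minimal sketch of the percentile-style benchmarking Wilpert cautions against; all scores are invented.  The mechanics are trivial, which is rather the point: the ranking is easy to compute, while its cultural meaning is exactly what is in question.

    # Rank one plant's mean survey score within a peer distribution.
    # Peer means and survey responses are hypothetical.
    from statistics import mean

    peer_means = [3.4, 3.7, 3.9, 4.0, 4.1, 4.2, 4.4]  # other plants
    our_scores = [4, 3, 4, 5, 4, 4, 3, 4]             # our survey responses

    our_mean = mean(our_scores)
    pct = 100 * sum(m < our_mean for m in peer_means) / len(peer_means)
    print(f"plant mean {our_mean:.2f} -> {pct:.0f}th percentile of peers")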


Link to slide presentation