On February 10th, we posted about a report covering lessons
for safety culture (SC) that can be gleaned from the social science
literature. The report's authors judged that high reliability
organization (HRO) literature provided a solid basis for linking
individual and organizational assumptions with traits and practices
that can affect safety performance. This post explores HRO
characteristics and how they can influence SC.
Our source is Managing the Unexpected: Resilient Performance in an Age of Uncertainty* by Karl Weick and Kathleen Sutcliffe. Weick is a leading contemporary HRO scholar. The book is clearly written and full of pithy comments, so we quote it liberally below to present the authors' views in their own words.
What makes an HRO different?
Many organizations work with risky technologies where the consequences of problems or errors can be catastrophic, use complex management systems, and exist in demanding environments. But successful HROs
approach their work with a different attitude and practices, an
“ongoing mindfulness embedded in practices that enact alertness,
broaden attention, reduce distractions, and forestall misleading
simplifications.” (p. 3)
Mindfulness
An underlying assumption of HROs is “that gradual . . . development of
unexpected events sends weak signals . . . along the way” (p. 63)
so constant attention is required. Mindfulness means that “when
people act, they are aware of context, of ways in which details
differ . . . and of deviations from their expectations.” (p. 32)
HROs “maintain continuing alertness to the unexpected in the face
of pressure to take cognitive shortcuts.” (p. 19) Mindful
organizations “notice the unexpected in the making, halt it or
contain it, and restore system functioning.” (p. 21)
It takes a lot of energy to maintain mindfulness. As the authors warn
us, “mindful processes unravel pretty fast.” (p. 106)
Complacency and hubris are two omnipresent dangers. “Success
narrows perceptions, . . . breeds overconfidence . . . and reduces
acceptance of opposing points of view. . . . [If] people assume that
success demonstrates competence, they are more likely to drift into
complacency, . . .” (p. 52) Pressure in the task environment is
another potential problem. “As pressure increases, people are more
likely to search for confirming information and to ignore information
that is inconsistent with their expectations.” (p. 26) The
opposite of mindfulness is mindlessness. “Instances of mindlessness occur when people confront weak stimuli, powerful expectations, and strong
desires to see what they expect to see.” (p. 88)
Mindfulness can lead to insight and knowledge. “In that brief
interval between surprise and successful normalizing lies one of your
few opportunities to discover what you don't know.” (p. 31)**
Five principles
HROs follow five principles. The first three cover anticipation of
problems and the remaining two cover containment of problems that do
arise.
Preoccupation with failure
HROs “treat any lapse as a symptom that something may be wrong with the
system, something that could have severe consequences if several
separate small errors happened to coincide. . . . they are wary of
the potential liabilities of success, including complacency, the
temptation to reduce margins of safety, and the drift into automatic
processing.” (p. 9)
Managers usually think surprises are bad news, evidence of poor planning. However,
“Feelings of surprise are diagnostic because they are a solid cue
that one's model of the world is flawed.” (p. 104) HROs “Interpret
a near miss as danger in the guise of safety rather than safety in
the guise of danger. . . . No news is bad news. All news is good
news, because it means that the system is responding.” (p. 152)
People in HROs “have a good sense of what needs to go right and a clearer
understanding of the factors that might signal that things are
unraveling.” (p. 86)
Reluctance to simplify
HROs “welcome diverse experience, skepticism toward received wisdom, and
negotiating tactics that reconcile differences of opinion without
destroying the nuances that diverse people detect. . . . [They worry
that] superficial similarities between the present and the past mask
deeper differences that could prove fatal.” (p. 10) “Skepticism
thus counteracts complacency . . . .” (p. 155) “Unfortunately,
diverse views tend to be disproportionately distributed toward the
bottom of the organization, . . .” (p. 95)
The language people use at work can be a catalyst for simplification. A person may initially perceive something different in the environment, but communicating the experience in familiar or standard terms raises the risk of losing the early warning the person perceived.
Sensitivity to operations
HROs “are attentive to the front line, . . . Anomalies are noticed while
they are still tractable and can still be isolated . . . . People who
refuse to speak up out of fear undermine the system, which knows less
than it needs to know to work effectively.” (pp. 12-13) “Being
sensitive to operations is a unique way to correct failures of
foresight.” (p. 97)
In our experience, nuclear plants are generally good in this regard;
most include a focus on operations among their critical success
factors.
Commitment to resilience
“HROs develop capabilities to detect, contain, and bounce back from those
inevitable errors that are part of an indeterminate world.” (p. 14)
“. . . environments that HROs face are typically more complex than
the HRO systems themselves. Reliability and resilience lie in
practices that reduce . . . environmental complexity or increase
system complexity.” (p. 113) Because it's difficult or impossible to reduce environmental complexity, the organization needs to make its systems more complex.*** This requires clear thinking and insightful analysis. Unfortunately, actual organizational responses to disturbances can fall short: “. . . systems often respond to a disturbance with new rules and new prohibitions designed to prevent the same disruption from happening in the future. This response reduces flexibility to deal with subsequent unpredictable changes.” (p. 72)
Deference to expertise
“Decisions are made on the front line, and authority migrates to the people with
the most expertise, regardless of their rank.” (p. 15) Application
of expertise “emerges from a collective, cultural belief that the
necessary capabilities lie somewhere in the system and that migrating
problems [down or up] will find them.” (p. 80) “When tasks are
highly interdependent and time is compressed, decisions migrate down
. . . Decisions migrate up when events are unique, have potential for
very serious consequences, or have political or career ramifications
. . .” (p. 100)
This is another ideal that can fail in practice. We've all seen decisions made by the highest-ranking person rather than the most qualified
one. In other words, “who is right” can trump
“what is right.”
Relationship to safety culture
Much of the chapter on culture is based on the ideas of Schein and Reason, so we'll focus on key points emphasized by Weick and Sutcliffe. In
their view, “culture is something an organization has
[practices and controls] that eventually becomes something an
organization is [beliefs, attitudes, values].” (p.
114, emphasis added)
“Culture consists of characteristic ways of knowing and sensemaking. . . .
Culture is about practices—practices of expecting, managing
disconfirmations, sensemaking, learning, and recovering.” (pp.
119-120) A single organization can have different types of culture: an integrative culture that everyone shares, differentiated cultures that are particular to sub-groups, and fragmented cultures that describe individuals who don't fit into the first two types.
Multiple cultures support the development of more varied responses to
nascent problems.
A complete culture strives to be mindful, safe, and informed, with an emphasis on wariness. As HRO principles become ingrained in an
organization, they become part of the culture. The goal is a strong
SC that reinforces concern about the unexpected, is open to questions
and reporting of failures, views close calls as a failure, is fearful
of complacency, resists simplifications, values diversity of opinions, and focuses on imperfections in operations.
What else is in the book?
One chapter contains a series of audits (presented as survey questions)
to assess an organization's mindfulness and appreciation of the five
principles. The audits can show an organization's attitudes and
capabilities relative to HROs and relative to its own self-image and
goals.
The final chapter describes possible “small wins” a change agent
(often an individual) can attempt to achieve in an effort to move his
organization more in line with HRO practices, viz., mindfulness and
the five principles. For example, “take your team to the actual
site where an unexpected event was handled either well or poorly,
walk everyone through the decision making that was involved, and
reflect on how to handle that event more mindfully.” (p. 144)
The book's case studies include an aircraft carrier, a nuclear power
plant,**** a pediatric surgery center and wildland firefighting.
Our perspective
Weick and Sutcliffe draw on the work of many other scholars, including
Constance Perin, Charles Perrow, James Reason and Diane Vaughan, all
of whom we have discussed in this blog. The book makes many good
points. For example, the prescription for mindfulness and the five
principles can contribute to an effective context for decision making, although they do not comprise a complete management system. The authors recognize that reliability does not mean a complete lack of performance variation; instead, reliability follows from practices that recognize and contain emerging problems. Finally, there is
evidence of a systems view, which we espouse, when the authors say
“It is this network of relationships taken together—not
necessarily any one individual or organization in the group—that
can also maintain the big picture of operations . . .” (p. 142)
The authors would have us focus on nascent problems in operations, which is obviously necessary. But another important question is: what are the faint signals that the SC itself is developing problems? What are the
precursors to the obvious signs, like increasing backlogs of
safety-related work? Could that “human error” that recently occurred be a sign of an SC that has grown more forgiving of organizational mindlessness?
Bottom line: Safetymatters says check out Managing the Unexpected
and consider adding it to your library.
* K.E. Weick and K.M. Sutcliffe, Managing the Unexpected: Resilient
Performance in an Age of Uncertainty, 2d ed. (San Francisco, CA:
Jossey-Bass, 2007). Also, Wikipedia has a very readable summary of HRO history and characteristics.
** More on normalization and rationalization: “On the actual day of
battle naked truths may be picked up for the asking. But by the
following morning they have already begun to get into their
uniforms.” E.A. Cohen and J. Gooch, Military Misfortunes: The
Anatomy of Failure in War (New York: Vintage Books, 1990), p. 44,
quoted in Managing the Unexpected, p. 31.
*** The prescription to increase system complexity to match the environment is based on the system design principle of requisite variety, which means “if you want to cope successfully with a wide
variety of inputs, you need a wide variety of responses.” (p. 113)
**** I don't think the authors performed any original research on nuclear
plants. But the studies they reviewed led them to conclude that “The
primary threat to operations in nuclear plants is the engineering
culture, which places a higher value on knowledge that is
quantitative, measurable, hard, objective, and formal . . . HROs
refuse to draw a hard line between knowledge that is quantitative and
knowledge that is qualitative.” (p. 60)