This
book was mentioned in a nuclear safety discussion forum, so we figured this was a good time to revisit Dörner's 1989 tome.* Below we
provide a summary of the book followed by our assessment of how it
fits into our interest in decision making and the use of simulations
in training.
Dörner's
work focuses on why people fail to make good decisions when faced
with problems and challenges. In particular, he is interested in the
psychological needs and coping mechanisms people exhibit. His
primary research method is observing test subjects interact with
simulation models of physical sub-worlds, e.g., a malfunctioning
refrigeration unit, an African tribe of subsistence farmers and
herdsmen, or a small English manufacturing city. He applies his
lessons learned to real situations, e.g., the Chernobyl nuclear plant
accident.
He
proposes a multi-step process for improving decision making in
complicated situations, then describes each step in detail and the problems people can create for themselves while executing it.
These problems generally consist of tactics people adopt to
preserve their sense of competence and control at the expense of
successfully achieving overall objectives. Although the steps
are discussed in series, he recognizes that, at any point, one may
have to loop back through a previous step.
Goal setting
Goals
should be concrete and specific to guide future steps. The
relationships between and among goals should be specified, including
dependencies, conflicts and relative importance. When people don't do this, they can become distracted by obvious or unimportant
(although potentially achievable) goals, or peripheral issues they
know how to address rather than important issues that should be
resolved. Facing performance failure, they may attempt to turn
failure into success with doublespeak or blame unseen forces.
Formulate models and gather information
Good
decision-making requires an adequate mental model of the system being
studied—the variables that comprise the system and the functional
relationships among them, which may include positive and negative
feedback loops. The model's level of detail should be sufficient to
understand the interrelationships among the variables the decision
maker wants to influence. Unsuccessful test subjects were inclined to use a “reductive hypothesis,” which unreasonably reduces the model to a single key variable, or to overgeneralize.
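As a concrete (and purely illustrative) example of the kind of mental model Dörner has in mind, the short Python sketch below simulates two coupled variables linked by a positive and a negative feedback loop. The variable names and coefficients are our own assumptions, not taken from any of Dörner's simulations.

```python
# A minimal sketch of a "mental model": two coupled variables with a
# positive loop (the stock grows in proportion to itself) and a negative
# loop (rising demand drains the stock). Names and coefficients are
# invented for illustration only.

def step(stock, demand, growth=0.05, coupling=0.02, relief=0.1):
    """Advance the toy system by one time period."""
    new_stock = stock + growth * stock - coupling * demand    # positive loop on stock, drained by demand
    new_demand = demand + coupling * stock - relief * demand  # demand rises with stock, decays on its own
    return new_stock, new_demand

stock, demand = 100.0, 10.0
for t in range(20):
    stock, demand = step(stock, demand)
    print(f"t={t:2d}  stock={stock:7.1f}  demand={demand:6.1f}")
```

Even in a toy system this small, the interplay of the two loops is hard to predict by intuition alone, which is exactly Dörner's point about needing an adequate model of the functional relationships.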
Information
gathered is almost always incomplete and the decision maker has to
decide when he has enough to proceed. The more successful test
subjects asked more questions and made fewer decisions (than the less successful subjects) in the early time periods of the sim.
Predict and extrapolate
Once a
model is formulated, the decision maker must attempt to determine how
the values of variables will change over time in response to his
decisions or internal system dynamics. One problem is predicting
that outputs will change in a linear fashion, even as the evidence grows for a non-linear (e.g., exponential) function. An exponential
variable may suddenly grow dramatically then equally suddenly reverse
course when the limits on growth (resources) are reached. Internal
time delays mean that the effects of a decision are not visible until
some time in the future. Faced with poor results, unsuccessful test
subjects implement or exhibit “massive countermeasures, ad hoc
hypotheses that ignore the actual data, underestimations of growth
processes, panic reactions, and ineffectual frenetic activity.” (p.
152) Successful subjects made an effort to understand the system's
dynamics, kept notes (history) on system performance and tried to
anticipate what would happen in the future.
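The extrapolation trap is easy to reproduce in a few lines of code. The Python sketch below is our own toy example, not one of Dörner's models: a quantity grows near-exponentially while an invented resource lasts, then reverses sharply once the limit is hit, and a naive linear extrapolation from the first few periods misses both the growth and the collapse.

```python
# A hedged sketch of the trap described above: a variable grows roughly
# exponentially while resources last, then collapses once the limit is
# reached. A decision maker extrapolating linearly from the first few
# periods underestimates the growth and never sees the reversal coming.
# All parameters are invented for illustration.

resource = 1000.0
population = 2.0
history = []

for t in range(25):
    history.append(population)
    consumed = min(resource, population)   # each unit consumes one unit of resource per period
    resource -= consumed
    if resource > 0:
        population *= 1.6                  # near-exponential growth while resources remain
    else:
        population *= 0.3                  # sudden reversal once the limit is reached

# Naive linear extrapolation from the first five periods
slope = (history[4] - history[0]) / 4
for t, actual in enumerate(history):
    linear_guess = history[0] + slope * t
    print(f"t={t:2d}  actual={actual:8.1f}  linear guess={linear_guess:6.1f}")
```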
Plan and execute actions, check results and adjust strategy
“The
essence of planning is to think through the consequences of certain
actions and see whether those actions will bring us closer to our
desired goal.” (p. 153) Easier said than done in an environment of
too many alternative courses of action and too little time. In
rapidly evolving situations, it may be best to create rough plans and
delegate as many implementing decisions as possible to subordinates.
A major risk is thinking that planning has been so complete that the
unexpected cannot occur. A related risk is the reflexive use of
historically successful strategies. “As at Chernobyl, certain
actions carried out frequently in the past, yielding only the
positive consequences of time and effort saved and incurring no
negative consequences, acquire the status of an (automatically
applied) ritual and can contribute to catastrophe.” (p. 172)
In the
sims, unsuccessful test subjects often exhibited “ballistic”
behavior—they implemented decisions but paid no attention to, i.e.,
did not learn from, the results. Successful subjects watched for the
effects of their decisions, made adjustments and learned from their
mistakes.
Dörner
identified several characteristics of people who tended to end up in
a failure situation. They failed to formulate their goals, didn't
recognize goal conflict or set priorities, and didn't correct their
errors. (p. 185) Their ignorance of interrelationships among system
variables and the longer-term repercussions of current decisions set
the stage for ultimate failure.
Assessment
Dörner's
insights and models have informed our thinking about human
decision-making behavior in demanding, complicated situations. His
use and promotion of simulation models as learning tools was one
starting point for Bob Cudlin's work in developing a nuclear
management training simulation program. Like Dörner, we see
simulation as a powerful tool to “observe and record the background
of planning, decision making, and evaluation processes that are
usually hidden.” (pp. 9-10)
However,
this book does not cover the entire scope of our interests. Dörner
is a psychologist interested in individuals; group behavior is beyond his range. He alludes to normalization of deviance, but his references appear limited to the flouting of safety rules rather
than a more pervasive process of slippage. More importantly, he does
not address behavior that arises from the system itself, in
particular adaptive behavior as an open system reacts to and
interacts with its environment.
In our view, Dörner's suggestions may help the individual decision
maker avoid common pitfalls and achieve locally optimum
answers. On the downside, following Dörner's prescription might
lead the decision maker to an unjustified confidence in his overall
system management abilities. In a truly complex system, no one knows
how the entire assemblage works. It's sobering to note that even in
Dörner's closed,** relatively simple models many test subjects still
had a hard time developing a reasonable mental model, and some failed
completely.
This
book is easy to read and Dörner's insights into the psychological
traps that limit human decision making effectiveness remain useful.
* D.
Dörner, The Logic of Failure: Recognizing and Avoiding Error in
Complex Situations, trans. R. and R. Kimber (Reading, MA: Perseus
Books, 1998). Originally published in German in 1989.
**
One simulation model had an external input.