
Tuesday, June 9, 2015

Training....Yet Again

U.S. Navy SEALs in Training
We have been beating the drum for improved and innovative training techniques to raise safety management performance for some time, really since the inception of this blog, where our paper “Practicing Nuclear Safety Management”* presented one of the seminal perspectives we wanted to bring to our readers.  We continue to encounter knowledgeable sources that advocate practice-based approaches, so we continue to bring them to our readers’ attention.  The latest is an article from the Harvard Business Review that calls attention to, and distinguishes, “training” as an essential dimension of organizational learning.  The article is “How the Navy SEALs Train for Leadership Excellence.”**  The author, Michael Schrage,*** a research fellow at MIT, reached out to Brandon Webb, a former SEAL who transformed SEAL training.  Schrage contends that training, as opposed to just education or knowledge, is necessary to promote deep understanding of a business, market or process.  Training in this sense means actually performing and practicing the necessary skills.  It is the key to achieving high levels of performance in complex environments.

One of Webb’s themes that really struck a chord was: “successful training must be dynamic, open and innovative…. ‘It’s every teacher’s job to be rigorous about constantly being open to new ideas and innovation’, Webb asserts.”  It is very hard to see much of the nuclear industry’s training on safety culture and related issues as meeting these criteria.  Even the auto industry, in the wake of the ignition switch-related accidents, has recently stepped up to require decision simulations to verify the effectiveness of corrective actions (see our May 22, 2014 post).

In particular, the reluctance of the nuclear industry and its regulator to address the presence and impact of goal conflicts on safety continues to perplex us and, we hope, many others in the industry.  It was on the mind of Carlo Rusconi more than a year ago when he observed: “Some of these conflicts originate high in the organization and are not really amenable to training per se” (see our Jan. 9, 2014 post).  However, a certain type of training could be very effective in neutralizing such conflicts: practicing safety decision making against realistic, fact-based scenarios.  As we have advocated on many occasions, this process would actualize safety culture principles in the context of real operational situations.  For the reasons cited by Rusconi, it builds teamwork and develops shared viewpoints.  If, as we have also advocated, both operational managers and senior managers participated in such training, senior management would be on the record for its assessment of the scenarios, including how it weighed, incorporated and assessed conflicting goals in its decisions.  This could have the salutary effect of empowering lower-level managers to make tough calls where assuring safety has real impacts on other organizational priorities.

Perhaps senior management would prefer to simply preach goals and principles, and leave the tough balancing necessary to implement those goals to its management chain.  If decisions become shaded in the “wrong” direction but there are no bad outcomes, senior management looks good.  But if there is a bad outcome, lower-level managers can be blamed, more “training” prescribed, and senior management can reiterate its “safety is the first priority” mantra.


*  In the paper we quote a Wall Street Journal article (Oct. 22, 2005, p. 10) on Dietrich Dörner’s book The Logic of Failure: “Most experts made things worse.  Those managers who did well gathered information before acting, thought in terms of complex-systems interactions instead of simple linear cause and effect, reviewed their progress, looked for unanticipated consequences, and corrected course often. Those who did badly relied on a fixed theoretical approach, did not correct course and blamed others when things went wrong.”  For a comprehensive review of the practice of nuclear safety, see our paper “Practicing Nuclear Safety Management,” March 2008.

**  M. Schrage, "How the Navy SEALs Train for Leadership Excellence," Harvard Business Review (May 28, 2015).

***  Michael Schrage, a research fellow at MIT Sloan School’s Center for Digital Business, is the author of the book Serious Play, among others.  Serious Play refers to experiments with models, prototypes, and simulations.

Thursday, December 20, 2012

The Logic of Failure by Dietrich Dörner

This book was mentioned in a nuclear safety discussion forum so we figured this is a good time to revisit Dörner's 1989 tome.* Below we provide a summary of the book followed by our assessment of how it fits into our interest in decision making and the use of simulations in training.

Dörner's work focuses on why people fail to make good decisions when faced with problems and challenges. In particular, he is interested in the psychological needs and coping mechanisms people exhibit. His primary research method is observing test subjects interact with simulation models of physical sub-worlds, e.g., a malfunctioning refrigeration unit, an African tribe of subsistence farmers and herdsmen, or a small English manufacturing city. He applies his lessons learned to real situations, e.g., the Chernobyl nuclear plant accident.

He proposes a multi-step process for improving decision making in complicated situations, then describes each step in detail and the problems people can create for themselves while executing it. These problems generally consist of tactics people adopt to preserve their sense of competence and control at the expense of successfully achieving overall objectives. Although the steps are discussed in series, he recognizes that, at any point, one may have to loop back through a previous step.

Goal setting

Goals should be concrete and specific to guide future steps. The relationships between and among goals should be specified, including dependencies, conflicts and relative importance. When people don't do this, they can become distracted by obvious or unimportant (although potentially achievable) goals, or by peripheral issues they know how to address rather than important issues that should be resolved. Facing performance failure, they may attempt to turn failure into success with doublespeak or blame unseen forces.

Formulate models and gather information

Good decision-making requires an adequate mental model of the system being studied—the variables that comprise the system and the functional relationships among them, which may include positive and negative feedback loops. The model's level of detail should be sufficient to understand the interrelationships among the variables the decision maker wants to influence. Unsuccessful test subjects were inclined to overgeneralize or to rely on a “reductive hypothesis,” which unreasonably reduces the model to a single key variable.
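To make the contrast between an adequate model and a “reductive hypothesis” concrete, here is a minimal Python sketch of our own devising (not one of Dörner's actual simulations; the herd-and-pasture variables and all rates are hypothetical assumptions). The two-variable model contains a negative feedback loop, since grazing depletes the pasture that sustains herd growth, while the reductive model tracks the herd alone.

```python
# Illustrative sketch only, not Dörner's models; all names and rates are hypothetical.

def full_model(herd, pasture, steps=60):
    """Two-variable mental model with a negative feedback loop:
    herd growth depends on pasture, and grazing depletes pasture."""
    history = []
    for _ in range(steps):
        grazing = 0.1 * herd
        herd = herd + 0.05 * herd * (pasture / 100.0)   # growth scales with available pasture
        pasture = max(0.0, pasture + 5.0 - grazing)     # regeneration minus grazing pressure
        history.append((round(herd, 1), round(pasture, 1)))
    return history

def reductive_model(herd, steps=60):
    """The 'reductive hypothesis': herd size depends only on itself."""
    return [round(herd * 1.05 ** t, 1) for t in range(1, steps + 1)]

if __name__ == "__main__":
    print(full_model(50, 100)[-1])   # growth levels off as the pasture is grazed down
    print(reductive_model(50)[-1])   # growth continues unchecked
```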

Information gathered is almost always incomplete and the decision maker has to decide when he has enough to proceed. The more successful test subjects asked more questions and made fewer decisions (than the less successful subjects) in the early time periods of the sim.

Predict and extrapolate

Once a model is formulated, the decision maker must attempt to determine how the values of variables will change over time in response to his decisions or internal system dynamics. One problem is predicting that outputs will change in a linear fashion, even as the evidence grows for a non-linear, e.g., exponential, function. An exponential variable may suddenly grow dramatically then equally suddenly reverse course when the limits on growth (resources) are reached. Internal time delays mean that the effects of a decision are not visible until some time in the future. Faced with poor results, unsuccessful test subjects implemented or exhibited “massive countermeasures, ad hoc hypotheses that ignore the actual data, underestimations of growth processes, panic reactions, and ineffectual frenetic activity.” (p. 152) Successful subjects made an effort to understand the system's dynamics, kept notes (history) on system performance and tried to anticipate what would happen in the future.
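A small numeric example makes the extrapolation trap concrete. The sketch below is our illustration, not taken from the book, and its growth rate, carrying capacity and other parameters are arbitrary assumptions; it compares a logistic curve, which looks exponential until it nears a resource limit, with a naive linear extrapolation from the first few observations.

```python
# Illustrative sketch; parameter values are arbitrary assumptions.

def logistic_growth(x0=1.0, rate=0.5, limit=1000.0, steps=30):
    """Growth that looks exponential early on, then saturates at a resource limit."""
    values, x = [], x0
    for _ in range(steps):
        x = x + rate * x * (1.0 - x / limit)
        values.append(x)
    return values

def linear_extrapolation(values, n_early=3, steps=30):
    """What a subject predicts by fitting a constant slope to the first few points."""
    slope = (values[n_early - 1] - values[0]) / (n_early - 1)
    return [values[0] + slope * t for t in range(steps)]

if __name__ == "__main__":
    actual = logistic_growth()
    predicted = linear_extrapolation(actual)
    for t in (5, 10, 20, 29):
        print(f"t={t:2d}  actual={actual[t]:7.1f}  linear prediction={predicted[t]:6.1f}")
```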

Plan and execute actions, check results and adjust strategy

“The essence of planning is to think through the consequences of certain actions and see whether those actions will bring us closer to our desired goal.” (p. 153) Easier said than done in an environment of too many alternative courses of action and too little time. In rapidly evolving situations, it may be best to create rough plans and delegate as many implementing decisions as possible to subordinates. A major risk is thinking that planning has been so complete that the unexpected cannot occur. A related risk is the reflexive use of historically successful strategies. “As at Chernobyl, certain actions carried out frequently in the past, yielding only the positive consequences of time and effort saved and incurring no negative consequences, acquire the status of an (automatically applied) ritual and can contribute to catastrophe.” (p. 172)

In the sims, unsuccessful test subjects often exhibited “ballistic” behavior—they implemented decisions but paid no attention to, i.e., did not learn from, the results. Successful subjects watched for the effects of their decisions, made adjustments and learned from their mistakes.
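The difference between “ballistic” and feedback-driven behavior can be shown with a toy control problem. This is our own hypothetical sketch, not Dörner's refrigeration simulation: the process responds to control actions only after a delay, an open-loop controller decides once and never looks at the results, and a feedback controller adjusts to the observed error.

```python
# Illustrative sketch; the process dynamics, delay and gains are hypothetical.

from collections import deque

def run(controller, setpoint=5.0, start=20.0, delay=3, steps=40):
    """Simulate a process whose response to each control action arrives 'delay' steps later."""
    temp = start
    pending = deque([0.0] * delay)      # control actions still working through the pipeline
    history = []
    for _ in range(steps):
        temp += pending.popleft()       # the delayed effect of an earlier action arrives now
        pending.append(controller(temp, setpoint))
        history.append(round(temp, 1))
    return history

def ballistic(temp, setpoint):
    """Decide once and ignore the results: apply a fixed cooling rate forever."""
    return -2.0

def feedback(temp, setpoint):
    """Watch the results: cool in proportion to the observed error."""
    return -0.3 * (temp - setpoint)

if __name__ == "__main__":
    print(run(ballistic)[-5:])   # keeps falling far below the setpoint
    print(run(feedback)[-5:])    # oscillates, then settles near the setpoint
```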

Dörner identified several characteristics of people who tended to end up in a failure situation. They failed to formulate their goals, didn't recognize goal conflict or set priorities, and didn't correct their errors. (p. 185) Their ignorance of interrelationships among system variables and the longer-term repercussions of current decisions set the stage for ultimate failure.

Assessment

Dörner's insights and models have informed our thinking about human decision-making behavior in demanding, complicated situations. His use and promotion of simulation models as learning tools was one starting point for Bob Cudlin's work in developing a nuclear management training simulation program. Like Dörner, we see simulation as a powerful tool to “observe and record the background of planning, decision making, and evaluation processes that are usually hidden.” (pp. 9-10)

However, this book does not cover the entire scope of our interests. Dörner is a psychologist interested in individuals; group behavior is beyond his range. He alludes to normalization of deviance, but his references appear limited to the flouting of safety rules rather than a more pervasive process of slippage. More importantly, he does not address behavior that arises from the system itself, in particular adaptive behavior as an open system reacts to and interacts with its environment.

In our view, Dörner's suggestions may help the individual decision maker avoid common pitfalls and achieve locally optimal answers. On the downside, following Dörner's prescription might lead the decision maker to an unjustified confidence in his overall system management abilities. In a truly complex system, no one knows how the entire assemblage works. It's sobering to note that even in Dörner's closed,** relatively simple models, many test subjects still had a hard time developing a reasonable mental model, and some failed completely.

This book is easy to read and Dörner's insights into the psychological traps that limit human decision making effectiveness remain useful.


* D. Dörner, The Logic of Failure: Recognizing and Avoiding Error in Complex Situations, trans. R. and R. Kimber (Reading, MA: Perseus Books, 1998). Originally published in German in 1989.

** One simulation model had an external input.