Most of us
associate psychologist James Reason with the “Swiss Cheese Model” of defense in
depth or possibly the notion of a “just culture.” But his career has been more than two ideas: he has spent his professional life studying errors, their causes and
contexts. A Life In Error* is an academic memoir, recounting his study of
errors, starting with the individual and ending with the organization (the “system”), including its safety culture (SC).
This post summarizes relevant portions of the book and provides our
perspective. It is going to read like a subtitled movie on fast-forward, but there are a lot of particulars packed into this short (124 pp.) book.
Slips and Mistakes
People make plans and take action; consequences follow. Errors occur when the intended goals are not achieved. The plan may be adequate but the execution faulty because of slips (absent-mindedness) or trips (clumsy actions). A plan that was inadequate to begin with is a mistake, which is usually more subtle than a slip and may go undetected for long periods if no obviously bad consequences occur. (pp. 10-12) A mistake is the product of higher-level mental activity than a slip. Both slips and mistakes can take “strong but wrong” forms, where schema** that were effective in prior situations are selected even though they are not appropriate in the current situation.
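To keep the taxonomy straight, here is a minimal sketch in Python (our illustration, not Reason’s notation; the function and argument names are ours) of the slip/mistake distinction described above.

```python
def classify_error(goal_achieved: bool, plan_adequate: bool, execution_faulty: bool) -> str:
    """Illustrative restatement of the slip/mistake distinction (pp. 10-12);
    the categories are Reason's, the encoding is ours."""
    if goal_achieved:
        return "no error"
    if not plan_adequate:
        return "mistake"        # the plan itself was wrong; may go undetected for a long time
    if execution_faulty:
        return "slip or trip"   # adequate plan, faulty execution (absent-mindedness, clumsiness)
    return "unclassified"       # the goal was defeated by factors outside this simple dichotomy

print(classify_error(goal_achieved=False, plan_adequate=True, execution_faulty=True))  # slip or trip
```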
Absent-minded slips can occur from misapplied competence where a planned routine is sidetracked into an unplanned one. Such diversions can occur, for instance, when one’s attention is unexpectedly diverted just as one reaches a decision point and multiple schema are both available and actively vying to be chosen. (pp. 21-25) Reason’s recipe for absent-minded errors is one part cognitive under-specification, e.g., insufficient knowledge, and one part the existence of an inappropriate response primed by prior, recent use and the situational conditions. (p. 49)
Planning Biases
The planning activity is subject to multiple biases. An individual planner’s database may be incomplete or shaped by past experiences rather than future uncertainties, with greater emphasis on past successes than failures. Planners can underestimate the influence of chance, overweight data that is emotionally charged, be overly influenced by their theories, misinterpret sample data or miss covariations, suffer hindsight bias or be overconfident.*** Once a plan is prepared, planners may focus only on confirmatory data and are usually resistant to changing the plan. Planning in a group is subject to “groupthink” problems including overconfidence, rationalization, self-censorship and an illusion of unanimity. (pp. 56-62)
Errors and Violations
Violations are deliberate acts to break rules or procedures, although bad outcomes are not generally intended. Violations arise from various motivational factors, including the organizational culture. Types of violations include corner-cutting to avoid clumsy procedures, necessary violations to get the job done because the procedures are unworkable, adjustments to satisfy conflicting goals and one-off actions (such as turning off a safety system) when faced with exceptional circumstances. Violators perform a type of cost-benefit analysis biased by the fact that benefits are likely immediate while costs, if they occur, are usually somewhere in the future. In Reason’s view, the proper course for the organization is to increase the perceived benefits of compliance, not to increase the costs (penalties) for violations. (There is a hint of the “just culture” here.)
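As a rough illustration of that biased calculus, here is a minimal sketch (the framing and all numbers are our hypothetical, not Reason’s arithmetic) in which a delayed, uncertain cost is discounted against an immediate benefit.

```python
# A minimal sketch: the violator weighs an immediate, near-certain benefit against
# a delayed, uncertain cost (framing and numbers are ours, not Reason's).
def perceived_net_benefit(benefit_now, cost_if_caught, prob_of_cost, delay_years, discount_rate=0.2):
    discounted_cost = prob_of_cost * cost_if_caught / ((1 + discount_rate) ** delay_years)
    return benefit_now - discounted_cost

# Hypothetical corner-cut: saves 10 units of effort today; the possible penalty is
# large (500) but unlikely (2%) and years away.
print(perceived_net_benefit(benefit_now=10, cost_if_caught=500, prob_of_cost=0.02, delay_years=3))
```

The result comes out positive (about 4.2), which is one way to see why Reason argues for raising the perceived benefits of compliance rather than piling more weight on penalties the violator has already discounted.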
Organizational Accidents
Major accidents (TMI, Chernobyl, Challenger) have three common characteristics: contributing factors that were latent in the system, multiple levels of defense, and an unforeseen combination of latent factors and active failures (errors and/or violations) that defeated the defenses. This is the well-known Swiss Cheese Model with the active failures opening short-lived holes and latent failures creating longer-lasting but unperceived holes.
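The layered logic can be made concrete with a toy simulation (our illustration, not Reason’s formulation; the layer count and probabilities are assumed): an accident trajectory gets through only when a hole is open in every layer at the same time.

```python
import random

def accident_occurs(latent_hole_probs, active_failure_prob, rng=random.random):
    """One trial of a toy Swiss Cheese Model: each defensive layer has a long-lived
    latent hole probability plus a chance of a short-lived hole opened by an active
    failure; the trajectory gets through only if every layer is breached."""
    for latent_p in latent_hole_probs:      # one entry per defensive layer
        hole_open = (rng() < latent_p) or (rng() < active_failure_prob)
        if not hole_open:                   # this layer blocks the trajectory
            return False
    return True

# Three layers with modest hole probabilities -> accidents are rare but not impossible.
trials = 100_000
hits = sum(accident_occurs([0.05, 0.05, 0.05], 0.02) for _ in range(trials))
print(f"accident frequency: {hits / trials:.5f}")
```

In this toy picture, an inadequate safety culture would raise both the latent and active hole probabilities in every layer at once, which is the system-wide vulnerability Reason turns to below.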
Organizational accidents are low frequency, high severity events with causes that may date back years. In contrast, individual accidents are more frequent but have limited consequences; they arise from slips, trips and lapses. This is why organizations can have a good industrial accident record while they are on the road to a large-scale disaster, e.g., BP at Texas City.
Organizational Culture
Certain industries, including nuclear power, have defenses in depth distributed throughout the system but are vulnerable to something that is equally widespread. According to Reason, “The most likely candidate is safety culture. It can affect all elements in a system for good or ill.” (p. 81) An inadequate SC can undermine the Swiss Cheese Model: there will be more active failures at the “sharp end”; more latent conditions created and sustained by management actions and policies, e.g., poor maintenance, inadequate equipment or downgraded training; and the organization will be reluctant to deal proactively with known problems. (pp. 82-83)
Reason describes a “cluster of organizational pathologies” that make an adverse system event more likely: blaming sharp-end operators, denying the existence of systemic inadequacies, and a narrow pursuit of production and financial goals. He goes on to list some of the drivers of blame and denial. The list includes: accepting human error as the root cause of an event; the hindsight bias; evaluating prior decisions based on their outcomes; shooting the messenger; belief in a bad apple but not a bad barrel (the system); failure to learn; a climate of silence; workarounds that compensate for systemic inadequacies; and normalization of deviance. (pp. 86-92) Whew!
Our Perspective
Reason teaches us that the essence of understanding errors is nuance. At one end of the spectrum, some errors are totally under the purview of the individual; at the other end, they reside in the realm of the system. The biases and issues described by Reason are familiar to Safetymatters readers and echo in the work of Dekker, Hollnagel, Kahneman and others. We have been pounding the drum for a long time on the inadequacies of safety analyses that ignore systemic issues and corrective actions that are limited to individuals (e.g., more training and oversight, improved procedures and clearer expectations).
The book is densely packed with the work of a career. One could easily use the contents to develop an SC assessment or self-assessment. We did not report on the chapters covering research into absent-mindedness, Freud and medical errors (Reason’s current interest), but they are certainly worth reading.
Reason says this book is his valedictory: “I have nothing new to say and I’m well past my prime.” (p. 122) We hope not.
* J. Reason, A Life In Error: From Little Slips to Big Disasters (Burlington, VT: Ashgate, 2013).
** Knowledge structures in long-term memory. (p. 24)
*** This will ring familiar to readers of Daniel Kahneman. See our Dec. 18, 2013 post on Kahneman’s Thinking, Fast and Slow.