Thursday, January 3, 2013

The ETTO Principle: Efficiency-Thoroughness Trade-Off by Erik Hollnagel

This book* was suggested by a regular blog visitor. Below we provide a summary of the book followed by our assessment of how it comports with our understanding of decision making, system dynamics and safety culture.

Hollnagel describes a general principle, the efficiency-thoroughness trade-off (ETTO), that he believes almost all decision makers use. ETTO means that people and organizations routinely make choices between being efficient and being thorough. For example, if demand for production is high, thoroughness (time and other resources spent on planning and implementing an activity) is reduced until production goals are met. Alternatively, if demand for safety is high, efficiency (resources spent on production) is reduced until safety goals are met. (pp. 15, 28) Greater thoroughness is associated with increased safety.

ETTO is used for many reasons, including resource limitations, the need to maintain resource reserves, and social and organizational pressure. (p. 17) In practice, people use shortcuts, heuristics and rationalizations to make their decision-making more efficient. At the individual level, there are many ETTO rules, e.g., “It will be checked later by someone else,” “It has been checked earlier by someone else,” and “It looks like a Y, so it probably is a Y.” At the organizational level, ETTO rules include negative reporting (where the absence of reporting implies that everything is OK), cost reduction imperatives (which increase efficiency at the cost of thoroughness), and double-binds (where the explicit policy is “safety first” but the implicit policy is “production takes precedence when goal conflicts arise”). The use of any of these rules can lead to a compromise of safety. (pp. 35-36, 38-39) As decision makers ETTO, individual and organizational performance varies. Most of the time, things work out all right but sometimes failures occur. 
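
A rough way to see why things "mostly work out but sometimes fail" is to treat each ETTO shortcut as a small, usually harmless gamble that compounds across the steps of a task. The toy simulation below is not from the book; every parameter is invented purely for illustration. The only point is that modest per-step shortcut rates can still add up to an appreciable overall failure rate.

import random

# Toy illustration only (not Hollnagel's model): a task has several steps,
# each done either thoroughly or expediently. A shortcut usually works,
# but occasionally leaves an uncaught error. All numbers are invented.
STEPS = 6           # checks / hand-offs in one task
P_SHORTCUT = 0.7    # chance a given step is done the "efficient" way
P_SLIP = 0.02       # chance a shortcut leaves a latent, uncaught error
TRIALS = 100_000

def run_task(rng):
    """Return True if the task completes without an uncaught error."""
    for _ in range(STEPS):
        if rng.random() < P_SHORTCUT and rng.random() < P_SLIP:
            return False    # an expedient step produced a latent failure
    return True

rng = random.Random(0)
failures = sum(not run_task(rng) for _ in range(TRIALS))
print(f"Overall failure rate: {failures / TRIALS:.1%}")
# Analytically 1 - (1 - 0.7 * 0.02)**6, i.e. roughly 8%: each step almost
# always works out, yet failures across the whole task are not rare.

Raising thoroughness (lowering P_SHORTCUT or P_SLIP) drives the failure rate down, which is the efficiency-thoroughness trade-off in miniature.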

How do failures occur? 

Failures can happen when people, going about their work activities in a normal manner, create a series of ETTOs that ultimately result in unacceptable performance. These situations are more likely to occur the more complex and closely coupled the work system is. The best example (greatly simplified in the following) involves an accident victim who arrived at an ER just before shift change on a Friday night. Doctor A examined her, ordered a head scan and X-rays, and communicated with the surgery, ICU and radiology residents and with her relief, Doctor B; Doctor B transferred the patient to the ICU, with care to be provided by the ICU and surgery residents; these residents and other doctors and staff provided care over the weekend. The major error was that everyone thought somebody else would read the patient's X-rays and make the correct diagnosis or, in the case of the radiology doctors, did not carefully review the X-rays. On Monday, the rad tech who had taken the X-rays on Friday (and noticed an injury) asked the orthopedics resident about the patient; this resident had not heard of the case. Subsequent examination revealed that the patient had, along with her other injuries, a dislocated hip. (pp. 110-113) The book is populated with many other examples.
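
The hand-off chain also shows how the ETTO rule "it will be checked later by someone else" can backfire. The back-of-envelope numbers below are invented, not taken from the case; they merely illustrate that adding more people can raise, rather than lower, the chance that nobody reads the films once everyone assumes the review is someone else's job.

# Invented numbers, for illustration only: compare one clearly accountable
# reader of the X-rays with several readers who each assume someone else
# will do the careful review.
clinicians = 5
p_review_sole_owner = 0.95   # assumed diligence when the films are clearly "yours"
p_review_diffused = 0.30     # assumed diligence once responsibility is shared

p_missed_sole = 1 - p_review_sole_owner
p_missed_diffused = (1 - p_review_diffused) ** clinicians

print(f"One accountable reader, films go unread: {p_missed_sole:.0%}")              # 5%
print(f"{clinicians} deferring readers, films go unread: {p_missed_diffused:.0%}")  # ~17%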

Relation to other theorists 

Hollnagel refers to sociologist Charles Perrow, who believes some errors or accidents are unavoidable in complex, closely-coupled socio-technical organizations.** While Perrow used the term “interactiveness” (familiar vs unfamiliar) to grade complexity, Hollnagel updates it with “tractability” (knowable vs unknowable) to reflect his belief that in contemporary complex socio-technical systems, some of the relationships among internal variables and between variables and outputs are not simply “not yet specified” but “not specifiable.”

Both Hollnagel and Sidney Dekker identify with a type of organizational analysis called Resilience Engineering, which holds that complex organizations must be designed to safely adapt to environmental pressure and recover from inevitable performance excursions outside the zone of tolerance. Both authors reject the linear, deconstructionist approach of fault-finding after incidents or accidents, i.e., the search for human error or the broken part.

Assessment 

Hollnagel is a psychologist so he starts with the individual and then extends the ETTO principle to consider group or organizational behavior, finally extending it to the complex socio-technical system. He notes that such a system interacts with, attempts to control, and adapts to its environment, ETTOing all the while. System evolution is a strength but also makes the system more intractable, i.e., less knowable, and more likely to experience unpredictable performance variations. He builds on Perrow in this area but neither is a systems guy and, quite frankly, I'm not convinced either understands how complex systems actually work.

I feel ambivalent about Hollnagel's thesis. Has he provided a new insight into decision making as practiced by real people, or has he merely updated terminology from earlier work (most notably, Herbert Simon's “satisficing”) that revealed that the “rational man” of classical economic theory really doesn't exist? At best, Hollnagel has given a name to a practice we've all seen and used, and that naming is of some value in itself.

It's clear ETTO (or something else) can lead to failures in a professional bureaucracy, such as a hospital. ETTO is probably less obvious in a nuclear operating organization where “work to the procedure” is the rule and, if a work procedure is wrong, there's an administrative procedure to correct the work procedure. Work coordination and hand-offs between departments exhibit at least nominal thoroughness. But there is still plenty of room for decision-making shortcuts, e.g., biases based on individual experience, groupthink and, yes, culture. Does a strong nuclear safety culture allow or tolerate ETTO? Of course. Otherwise, work, especially managerial or professional work, would not get done. But a strong safety culture paints brighter, tighter lines around performance expectations so decision makers are more likely to be aware when their expedient approaches may be using up safety margin.

Finally, Hollnagel's writing occasionally uses strained logic to “prove” specific points, the book needs a better copy editor, and my deepest suspicion is he is really a peripatetic academic trying to build a career on a relatively shallow intellectual construct.


* E. Hollnagel, The ETTO Principle: Efficiency-Thoroughness Trade-Off (Burlington, VT: Ashgate, 2009).

** C. Perrow, Normal Accidents: Living with High-Risk Technologies (New York: Basic Books, 1984).

5 comments:

  1. OK, I'm 0 for 3 on favorite theorists. Who do you suggest I read on decision making?

  2. You're not 0 for 3, all these writers have something to offer to the soup of knowledge; it's your responsibility to identify the bits that make sense and integrate them into your worldview. You might check out Henry Mintzberg, a more traditional management theorist, who talks about the art, craft and science of decision making. He has an overview video on YouTube (http://www.youtube.com/watch?v=DyvXu3lSSG0). For more info, look for his paper "Decision Making: It’s Not What You Think."

  3. Lew,

    I haven't read Henry Mintzberg since the '90s; back then he made a vivid and positive impression (trust those Canadians to be down to earth).

    As for Hollnagel (and his friends Woods and Leveson: cf. "Joint Cognitive Systems" (2005) and "Resilience Engineering" (2006)), I think your comment on "peripatetic academic" may be a little harsh but not entirely faulty.

    My difficulty with these Hollnagel offerings is that we have a psychologist (presumably knowledgeable about the neural-network learning character of mammal brains) trying to make peace with the typical peripatetic engineer's (meaning PhDs in academic settings and in professional society leadership) demand for linear-sequential simplicity (i.e., Dekker's Newtonian/Cartesian analytic paradigm).

    Hollnagel should know that the marvelous linear reductionism of the slide rule is useful only in the least complex decision-making challenges. It seems unlikely he's read "Drift into Failure."

    The more I think about Hollnagel's stance, the more I'm aware of a paradox: on any neuroscience basis, decision-making is a process going on continually in every mobile sentient species, from slime molds to Super Bowl winners - so why does it get so much attention?

    What is unique with humans is the added capacity we have for Sense-Making. Its character as an imperfect process seems, from the complexity perspective, to be an integral feature. Without the diversity of views and the contingent (i.e., provisional) character of Sense-Making, we'd seemingly lack self-awareness and the capacity for directed agency.

    One way of describing the Slide-Rule Simplification Theorem of the Engineering profession is that it provides the basis for a presumption: that well-validated standards, found in books and gained by training and experience, obviate the need to spend much time on Sense-Making.

    Up to some degree of complication, the progression from recognizable problem statement, to handbook, to solution does allow the practitioner to focus on outcomes. We can see that minimal original thought need be given to the "as found" condition. And most of engineering practice would seem to be safely conducted within these boundary conditions.

    But the TMIs, Chernobyls, Challengers, Columbias, Deepwater Horizons and Fukushimas demonstrate that the Slide-Rule Simplification Theorem is not fully scalable over the range of circumstances we humans routinely encounter. From that realization, the very notion of "resilience engineering" is painfully inept.

    Acquiring added resilience through scenario modeling and response rehearsal I readily accept. The notion that the practice of engineering governs such scenario building is as false as it is ubiquitous. It is neurological nonsense.

    This observation is not, of course, to underestimate the breadth and virulence of assertions to the contrary. But I think John Kay's observation on the first page of "Obliquity" ought to be more widely sobering:

    "The answer I realized was that our customers didn't really use [our] models for their decision making either. They used them internally or externally to justify decisions that they had already made."

    I would make one quibble with Kay - using models as an aid to Sense-Making does support understanding of what our nefarious mind has settled upon as "my answer." Even when we "decided with confidence," there is typically lots of detail work to be done. And only in the nuclear industry is such detail scripted to the Nth degree!

    Still, if we were to focus more on Simon's "satisficing" as the measure of enough sense-making in advance of testing our hypothesis, we might be less hung up about trying to standardize "decision-making" to conform with the Slide-Rule Simplification Theorem.

  4. Let me weigh in as a practitioner. First, the quote from Kay reflects my experience of twelve years as "the human performance guy," trying to get managers to understand their role and adjust their own behaviors to counter drift away from safety. I had only limited success - a few maintenance managers and one engineering boss - as the reflexive response to any new data, theory, technique, role-play, etc. was to try to apply the 'learning' to other, not-manager people.

    Second, regarding Hollnagel and the 'bits that make sense,' it makes compelling sense to me that one probably can identify a lexicon of typical responses to the pulsing, unruly work environment. Hollnagel calls it "local optimization," and it is as good as any map that I have seen of the ground-level catalog of responses to mismatches among goals and resources within the context of the leader-constructed culture (Schein's underlying assumptions). Recall the mapping of the ETTO model onto the well-documented Herald of Free Enterprise event. Knowledge of these 'typical' behavioral responses, plus grounding in the effects of a negative reporting culture, I found to be like a waterhole in the desert of practical role modeling for nuclear managers. Hollnagel's suggestion to apply thoroughness mandates at key points in processes as a remedy is another gift.

    Extending the model to organizational behavior is more tenuous, granted. But as a short list of traits for senior leaders to avoid, his ain't bad.

    Finally, I personally don't see a conflict with Dekker, especially with his evocation of control theory. One might serve to actualize the other at the line level. I humbly subscribe to the idea that there is often something to be gleaned from a theorist who has not found the Sorcerer's Stone, and time is well spent taking it up and using it.

  5. Sounds a lot like satisficing to me.

