Kahneman is a Nobel Prize winner in economics. His focus is on personal decision making, especially the biases and heuristics used by the unconscious mind as it forms intuitive opinions. Biases lead to regular (systematic) errors in decision making. Kahneman and Amos Tversky developed prospect theory, a model of choice that helps explain why real people make decisions that are different from those of the rational man of economics.
Kahneman is a psychologist so his work focuses on the individual; many of his observations are not immediately linkable to safety culture (a group characteristic). But even in a nominal group setting, individuals are often very important. Think about the lawyers, inspectors, consultants and corporate types who show up after a plant incident. What kind of biases do they bring to the table when they are evaluating your organization's performance leading up to the incident?
The book* has five parts, described below. Kahneman reports on his own research and then adds the work of many other scholars. Many of the experiments appear quite simple but provide insights into unconscious and conscious decision making. There is a lot of content, so this is a high-level summary, punctuated by explicative or simply humorous quotes.
Part 1 describes two methods we use to make decisions: System 1 and System 2. System 1 is impulsive, intuitive, fast and often unconscious; System 2 is more analytic, cautious, slow and controlled. (p. 48) We often defer to System 1 because of its ease of use; we simply don't have the time, energy or desire to pore over every decision facing us. Lack of desire is another term for lazy.
System 1 often operates below consciousness, utilizing associative memory to link a current stimulus to ideas or concepts stored in memory. (p. 51) System 1's impressions become beliefs when accepted by System 2 and a mental model of the world takes shape. System 1 forms impressions of familiarity and rapid, precise intuitions, and then passes them on to System 2 to accept or reject. (pp. 58-62)
System 2 activities take effort and require attention, which is a finite resource. If we exceed the attention budget or become distracted then System 2 will fail to obtain correct answers. System 2 is also responsible for self-control of thoughts and behaviors, another drain on mental resources. (pp. 41-42)
Biases include a readiness to infer causality, even where none exists; a willingness to believe and confirm in the absence of solid evidence; succumbing to the halo effect, where we project a coherent whole based on an initial impression; and problems caused by WYSIATI,** including basing conclusions on limited evidence, overconfidence, framing effects (where decisions differ depending on how information and questions are presented) and base-rate neglect (where we ignore widely known data about a decision situation). (pp. 76-88)
Heuristics include substituting easier questions for the more difficult ones that have been asked, letting current mood affect answers on general happiness and allowing emotions to trump facts. (pp. 97-103)
Part 2 explores decision heuristics in greater detail, with research and examples of how we think associatively, metaphorically and causally. A major topic throughout this section is the errors people tend to make when handling questions that have a statistical dimension. Such errors occur because statistics requires us to think of many things at once, which System 1 is not designed to do, and a lazy or busy System 2, which could handle this analysis, is prone to accept System 1's proposed answer. Other errors occur because:
We make incorrect inferences from small samples and are prone to ascribe causality to chance events; the short simulation following this list illustrates the point. "We are far too willing to reject the belief that much of what we see in life is random." (p. 117) We are prone to attach "a causal interpretation to the inevitable fluctuations of a random process." (p. 176) "There is more luck in the outcomes of small samples." (p. 194)
We fall for the anchoring effect, where we see a particular value for an unknown quantity (e.g., the asking price for a used car) before we develop our own value. Even random anchors, which provide no relevant information, can influence decision making.
People search for relevant information when asked questions. Information availability and ease of retrieval is a System 1 heuristic but only System 2 can judge the quality and relevance of retrieved content. People are more strongly affected by ease of retrieval and go with their intuition when they are, for example, mentally busy or in a good mood. (p. 135) However, “intuitive predictions tend to be overconfident and overly extreme.” (p. 192)
Unless we know the subject matter well, and have some statistical training, we have difficulty dealing with situations that require statistical reasoning. One research finding “illustrates a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight—nothing in between.” (p. 143) “There is one thing you can do when you have doubts about the quality of the evidence: let your judgments of probability stay close to the base rate.” (p. 153) “. . . whenever the correlation between two scores is imperfect, there will be regression to the mean. . . . [a process that] has an explanation but does not have a cause.” (pp. 181-82)
Finally, and the PC folks may not appreciate this, "neglecting valid stereotypes inevitably results in suboptimal judgments." (p. 169)
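The small-sample point lends itself to a quick demonstration. The Python sketch below is ours, not the book's; the sample sizes, trial count and the 70 percent "extreme" threshold are arbitrary choices, but the pattern, that smaller samples produce extreme-looking results far more often, is exactly the luck Kahneman describes.

```python
import random

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of fair-coin samples whose observed heads rate looks
    'extreme' (at or above threshold, or at or below 1 - threshold)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate >= threshold or rate <= 1 - threshold:
            extreme += 1
    return extreme / trials

for n in (10, 50, 250):
    # The coin never changes; only the sample size does.
    print(f"n = {n:3d}: 'extreme' result in {extreme_rate(n):.1%} of samples")
```

With n = 10 an "extreme" split shows up in roughly a third of the samples; with n = 250 it essentially never does. The same random process looks both unusually good and unusually bad when we only observe small slices of it.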
Part 3 focuses on specific shortcomings of our thought processes: overconfidence in what we think we know, fed by the illusory certainty of hindsight, and underappreciation of the role of chance in events.
“Subjective confidence in a judgment is not a reasoned evaluation of the probability that this judgment is correct. Confidence is a feeling.” (p. 212) Hindsight bias “leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. . . . a clear outcome bias.” (p. 203) “. . . the optimistic bias may well be the most significant of the cognitive biases.” (p. 255) “The optimistic style involves taking credit for success but little blame for failure.” (p. 263)
"The sense-making machinery of System 1 makes us see the world as more tidy, predictable, and coherent than it really is." (p. 204) ". . . reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable results." (p. 220) "An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want. . . . Acting on pretended knowledge is often the preferred solution." (p. 263)
And the best quote in the book: “Professional controversies bring out the worst in academics.” (p. 234)
Part 4 contrasts the rational people of economics with the more complex people of psychology, in other words, the Econs vs. the Humans. Kahneman shows how prospect theory opened a door between the two disciplines and contributed to the start of the field of behavioral economics.
Economists adopted expected utility theory to prescribe how decisions should be made and describe how Econs make choices. In contrast, prospect theory has three cognitive features: evaluation of choices is relative to a reference point, outcomes above that point are gains, below that point are losses; diminishing sensitivity to changes; and loss aversion, where losses loom larger than gains. (p. 282) In practice, loss aversion leads to risk-averse choices when both gains and losses are possible, and diminishing sensitivity leads to risk taking when sure losses are compared to a possible larger loss. “Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative.” (p. 368)
"The fundamental ideas of prospect theory are that reference points exist, and that losses loom larger than corresponding gains." (p. 297) "A reference point is sometimes the status quo, but it can also be a goal in the future; not achieving the goal is a loss, exceeding the goal is a gain." (p. 303) "Loss aversion is a powerful conservative force." (p. 305)
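For readers who like to see the numbers, here is a minimal sketch of a prospect-theory-style value function. The functional form and the parameter estimates (alpha = beta = 0.88, lambda = 2.25) come from Tversky and Kahneman's later (1992) cumulative prospect theory work, not from this book; treat it as an illustration of loss aversion and diminishing sensitivity, not a precise model of anyone's choices.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of an outcome x measured from the reference point (x = 0).
    Gains and losses are both dampened (diminishing sensitivity), and
    losses are scaled by lam, so they loom larger than equal gains."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

print(prospect_value(100))    # roughly  57: the felt value of a $100 gain
print(prospect_value(-100))   # roughly -129: the same-size loss hurts about twice as much

# Risk seeking in losses: a sure $900 loss feels worse than a 90% chance
# of losing $1,000 (probability weighting ignored for simplicity).
print(prospect_value(-900))          # roughly -895
print(0.9 * prospect_value(-1000))   # roughly -884, so the gamble is preferred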
When people do consider very rare events, e.g., a nuclear accident, they will almost certainly overweight the probability in their decision making. ". . . people are almost completely insensitive to variations of risk among small probabilities." (p. 316) ". . . low-probability events are much more heavily weighted when described in terms of relative frequencies (how many) than when stated in more abstract terms of . . . "probability" (how likely)." (p. 329) Framing of questions evokes emotions, e.g., "losses evokes stronger negative feelings than costs." (p. 364) But "[r]eframing is effortful and System 2 is normally lazy." (p. 367) As an exercise, think about how anti-nuclear activists and NEI would frame the same question about the probability and consequences of a major nuclear accident.
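The overweighting of rare events also has a standard, if approximate, mathematical form. The one-parameter weighting function and the gamma = 0.61 estimate below are again from the 1992 cumulative prospect theory paper rather than this book; the sketch only shows how heavily a small stated probability can be inflated when it enters a decision.

```python
def decision_weight(p, gamma=0.61):
    """Probability weighting function (Tversky & Kahneman, 1992):
    small stated probabilities get inflated decision weights,
    large ones get deflated."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.001, 0.01, 0.10, 0.50, 0.90):
    print(f"stated probability {p:>6}: decision weight ~ {decision_weight(p):.3f}")
```

Under these estimates a one-in-a-thousand risk gets a decision weight closer to one-in-seventy, while a 90 percent chance is treated more like 71 percent. That is one lens on why a hypothetical severe accident can dominate a decision despite its low stated probability.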
There are some things an organization can do to improve its decision making. It can use local centers of over-optimism (Sales dept.) and loss aversion (Finance dept.) to offset each other. In addition, an organization's decision making practices can require the use of an outside view (i.e., a look at the probabilities of similar events in the larger world) and a formal risk policy to mitigate known decision biases. (p. 340)
Part 5 covers two different selves that exist in every human, the experiencing self and the remembering self. The former lives through an experience and the latter creates a memory of it (for possible later recovery) using specific heuristics. Our tendency to remember events as a sample or summary of actual experience is a factor that biases current and future decisions. We end up favoring (fearing) a short period of intense joy (pain) over a long period of moderate happiness (pain). (p. 409)
Our memory has evolved to represent past events in terms of peak pain/pleasure during the events and our feelings when the event is over. Event duration does not impact our ultimate memory of an event. For example, we choose future vacations based on our final evaluations of past vacations even if many of our experiences during the past vacations were poor. (p. 389)
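Kahneman describes the remembering self's scoring as roughly the average of the worst moment and the final moment of an episode, with duration largely ignored. The sketch below is a crude illustration of that rule using made-up pain scores; it is the arithmetic behind his cold-water experiments, not a formula taken from them.

```python
def remembered_pain(pain_by_minute):
    """Peak-end sketch: remembered pain is approximated by the average of
    the worst moment and the final moment; total duration is ignored."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

short_intense = [8, 8]             # two minutes, ends at the peak
long_with_taper = [8, 8, 5, 3, 1]  # same peak, five minutes, milder ending

print(sum(short_intense), remembered_pain(short_intense))      # 16 total, remembered as 8.0
print(sum(long_with_taper), remembered_pain(long_with_taper))  # 25 total, remembered as 4.5
# More total pain, but the tapered episode leaves the milder memory, so it
# is the one people would choose to repeat.
```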
In a possibly more significant area, the life satisfaction score you assign to yourself is based on a small sample of highly available ideas or memories. (p. 400) Ponder that the next time you take or review responses from a safety culture survey.
Our Perspective
This is an important book. Although not explicitly stated, the great explanatory themes of cause (mechanical), choice (intentional) and chance (statistical) run through it. It is filled with nuggets that apply to the individual (psychological) and also to the aggregate if the group shares similar beliefs. Many System 1 characteristics, if unchecked and shared by a group, have cultural implications.***
We have discussed Kahneman's work before on this blog, e.g., his view that an organization is a factory for producing decisions and his suggestion to use a "premortem" as a partial antidote for overconfidence. (A premortem is an exercise the group undertakes before committing to an important decision: Imagine being a year in the future; the decision was implemented and the outcome is a disaster. What happened?) For more on these points, see our Nov. 4, 2011 post.
We have also discussed some of the topics he raises, e.g., the hindsight bias. Hindsight is 20/20 and it supposedly shows what decision makers could (and should) have known and done instead of their actual decisions that led to an unfavorable outcome, incident, accident or worse. We now know that when the past was the present, things may not have been so clear-cut.
Kahneman's observation that the ability to control attention predicts on-the-job performance (p. 37) is certainly consistent with our reports on the characteristics of high reliability organizations (HROs).
“The premise of this book is that it is easier to recognize other people's mistakes than our own.” (p. 28) Having observers at important, stressful decision making meetings is useful; they are less cognitively involved than the main actors and more likely to see any problems in the answers being proposed.
Critics' major knock on Kahneman's research is that it doesn't reflect real world conditions. His model is "overly concerned with failures and driven by artificial experiments rather than by the study of real people doing things that matter." (p. 235) He takes this on by collaborating with a critic in an investigation of intuitive decision making, specifically seeking to answer: "When can you trust a self-confident professional who claims to have an intuition?" (p. 239) The answer is when the expert acquired the skill in a predictable environment and had sufficient practice with immediate, high-quality feedback. For example, anesthesiologists are in a good position to develop predictive expertise; psychotherapists, on the other hand, are not, primarily because a lot of time and external events can pass between their prognosis for a patient and the ultimate results. However, "System 1 takes over in emergencies . . ." (p. 35) Because people tend to do what they've been trained to do in emergencies, training that leads to (correct) responses is vital.
Another problem is that most of Kahneman's research uses university students, both undergraduate and graduate, as subjects. It's fair to say professionals have more training and life experience, and have probably made some hasty decisions they later regretted and (maybe) learned from. On the other hand, we often see people who make sub-optimal, or just plain bad decisions even though they should know better.
There are lessons here for managers and other would-be culture shapers. System 1's search for answers is mostly constrained to information consistent with existing beliefs (p. 103), which is an entry point for culture. We have seen how group members can have their internal biases influenced by the dominant culture. But to the extent System 1 dominates employees' decision making, decision quality may suffer.
Not all appeals can be made to the rational man of System 2. A customary, if tacit, assumption of managers is that they and their employees are rational and always operating consciously, so new experiences will lead to expected new values and beliefs, new decisions and an improved safety culture. But it may not be this straightforward. System 1 may intervene, and managers should be alert to evidence of System 1 type thinking and adjust their interventions accordingly. Kahneman suggests encouraging "a culture in which people look out for one another as they approach minefields." (p. 418)
We should note Systems 1 and 2 are constructs and “do not really exist in the brain or anywhere else.” (p. 415) System 1 is not Dr. Morbius' Id monster.**** System 1 can be trained to behave differently, but it is always ready to provide convenient answers for a lazy System 2.
The book is long, with small print, but the chapters are short so it's easy to invest 15-20 min. at a time. One has to be on constant alert for useful nuggets that can pop up anywhere—which I guess promotes reader mindfulness. It is better than Blink, which simply overwhelmed this reader with a cloudburst of data showing the informational value of thin slices and unintentionally over-promoted the value of intuition. (see pp. 235-36) And it is much deeper than The Power of Habit, which we reviewed last February.
(Common sense is nothing more than a deposit of prejudices laid down by the mind before you reach eighteen. Attributed to Albert Einstein)
* D. Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).
** WYSIATI – What You See Is All There Is. Information that is not retrieved from memory, or otherwise ignored, may as well not exist. (pp. 85-88) WYSIATI means we base decisions on the limited information that we are able or willing to retrieve before a decision is due.
*** A few of these characteristics are mentioned in this report, e.g., impressions morphing into beliefs, a bias to believe and confirm, and WYSIATI errors. Others include links of cognitive ease to illusions of truth and reduced vigilance (complacency), and narrow framing where decision problems are isolated from one another. (p. 105)
**** Dr. Edward Morbius is a character in the 1956 sci-fi movie Forbidden Planet.