Tuesday, August 25, 2020
How to Consider Unknown Unknowns: Hints from McKinsey
The authors* begin by noting that executives’ decision-making processes often coalesce around “managing the probable,” i.e., attempting to fit a current decision into a model that has worked before. The questions they ask and the data they seek tend to narrow, not expand, the decision and its context. This is an efficient way to approach many everyday decisions, but excessively simple models are not appropriate for complicated decisions like how to approach a changing market or define a market that does not yet exist. All models constrain the eventual solution, and simple models constrain it the most, perhaps leading to a totally wrong answer.
Decision situations that are dramatically different, complex, and uncertain require a more open-ended approach, which the authors call “leading the possible.” In such situations, decision makers should acknowledge that they don’t know how uncertain environmental conditions will unfold or how complex systems will evolve. The authors propose three non-traditional mental habits for identifying and exploring the possibilities.
Ask different questions
Ask questions that open up possibilities rather than narrowing the discussion and constraining the solution. Sample questions include: What do I expect not to find? How could I adjust to the unexpected? What might I be discounting or explaining away too quickly? What would happen if I changed one or more of my core assumptions? We would add: Is fear of missing out prodding me to move too rashly, or is complacency allowing me not to move at all?
As Hans Rosling said: “Beware of simple ideas and simple solutions. . . . Welcome complexity.” (see our Dec. 3, 2018 post)
Take multiple perspectives
Decision makers, especially senior managers, need to escape the echo chamber of the sycophants who surround them. They should consider how people who are very different from themselves might view the same decision situation. They can consult people who are knowledgeable but frustrating or irritating, people outside their usual internal circle such as junior staff, or even dissatisfied customers. Such perspectives can be insightful and surprising.
Other thought leaders have suggested similar approaches. For example, Ray Dalio proposes thoughtful disagreement, where decision makers seek out brilliant people who disagree with them to gain a deeper understanding of decision situations (see our April 17, 2018 post), and Charlan Nemeth describes the usefulness of authentic dissent in decision situations (see our June 29, 2020 post).
Recognize systems
The authors’ appreciation for systems thinking mirrors what we’ve been saying for years. (For example, see our Jan. 6, 2017 post.) Decision makers should be looking at the evolution of the forest, not examining individual trees. We need to acknowledge and accept that “Elements in a system can be connected in ways that are not immediately apparent.” The widest view is the most powerful, but people have “been trained to follow our natural inclination to examine the component parts. We assume a straightforward and linear connection between cause and effect. Finally, we look for root causes at the center of problems. In doing these things, we often fail to perceive the broader forces at work.”
The authors realize that leaders who can apply the new habits may have different attributes than earlier senior managers. Traditional leaders are clear, confident, and decisive. However, their preference for managing the probable leaves them more open to being blindsided. In contrast, new leaders need to exhibit “humility, a keen sense of their own limitations, an insatiable curiosity, and an orientation to learning and development.”
Our Perspective
This article promotes more expansive mental models for decision making in formal organizations, models that deemphasize reliance on reductionism and linear, cause-effect thinking. We applaud the authors’ intent.
McKinsey is pretty good at publishing small-bite “news you can use” articles. However, they do not contain any of the secret sauce for which McKinsey charges its clients top dollar.
Bottom line: Some of you don’t want to read 300-page books on management, so here’s an 8-page article with a few good points.
* Z. Achi and J.G. Berger, “Delighting in the Possible,” McKinsey Quarterly (March 2015).
Friday, July 31, 2020
Culture in Healthcare: Lessons from When We Do Harm by Danielle Ofri, MD
In her book*, Dr. Ofri takes a hard look at the prevalence of medical errors in the healthcare system. She reports some familiar statistics** and fixes, but also includes highly detailed case studies where errors large and small cascaded over time and the patients died. This post summarizes her main observations. She does not provide a tight summary of a less error-prone healthcare culture but she drops enough crumbs that we can infer its desirable attributes.
Healthcare is provided by a system
The system includes the providers, the supporting infrastructure, and factors in the external environment. Ofri observes that medical care is exceedingly complicated and some errors are inevitable. Because errors are inevitable, the system should emphasize error recognition and faster recovery with a goal of harm reduction.
She shares our view that the system permits errors to occur so fixes should focus on the system and not on the individual who made an error.*** System failures will eventually trap the most conscientious provider. She opines that most medical errors are the result of a cascade of actions that compound one another; we would say the system is tightly coupled.
System “improvements” intended to increase efficiency can actually reduce effectiveness. For example, electronic medical records can end up dictating providers’ practices, fragmenting thoughts and interfering with the flow of information between doctor and patient.**** Data field defaults and copy and paste shortcuts can create new kinds of errors. Diagnosis codes driven by insurance company billing requirements can distort the diagnostic process. In short, patient care becomes subservient to documentation.
Other changes can have unforeseen consequences. For example, scheduling fewer working hours for interns leads to fewer diagnostic and medication errors but also results in more patient handoffs (where half of adverse medical events are rooted).
Aviation-inspired checklists have limited applicability
Checklists have reduced error rates for certain procedures but can lead to unintended consequences, e.g., mindless check-off of items (to achieve 100% completion in the limited time available) and focusing on the checklist while ignoring other things that are going on, including emergent issues.
Ofri thinks the parallels between healthcare and aviation are limited because of the complexity of human physiology. While checklists may be helpful for procedures, doctors ascribe limited value to process checklists that guide their thinking.
Malpractice suits do not meaningfully reduce the medical error rate
Doctors fear malpractice suits, so they practice defensive medicine, prescribing extra tests and treatments that have their own risks of injury and false positives, and lead to extra cost. Medical equipment manufacturers also fear lawsuits, so they design machines that sound alarms for all matters great and small; alarms are so numerous they are often simply ignored by the staff.
Hospital management culture is concerned about protecting the hospital’s financial interests against threats, including lawsuits. A Cone of Silence is dropped over anything that could be considered an error, and no information is released to the public, not even to family members of the injured or dead patient. As a consequence, it is estimated that fewer than 10% of medical errors ever come to light. There is no national incident reporting system because of the resistance of providers, hospitals, and trial lawyers.
The reality is that a malpractice suit is not practical in the vast majority of cases of possible medical error. The bar is very high: your doctor must have provided sub-standard care that caused your injury or death and resulted in quantifiable damages. Cases are very expensive and time-consuming to prepare, and the legal system, like the medical system, is guided by money, so an acceptable risk-reward ratio has to be there for the lawyers.*****
Desirable cultural attributes for reducing medical errors
In Ofri’s view, culture includes hierarchy, communications skill, training traditions, work ethic, egos, socialization, and professional ideals. The primary cultural attribute for reducing errors is individuals’ willingness to assume ownership and get the necessary things done amid a diffusion of responsibility. This must be taught by example, and individuals must demand comparable behavior from their colleagues.
Providing medical care is a team business
Effective collaboration among team members is key, as is the ability (or even the duty) of lower-status members to point out problems and errors without fear of retribution. Leaders must encourage criticism, forbid scapegoating, and not allow hierarchy and egos to overrule what is right and true. Where practical, training should be performed in groups that actually work together, to build communication skills.
Doctors and nurses need time and space to think
Doctors need time to develop a differential diagnosis, to ask and answer “What else could it be?” The provider’s thought process is the source of most diagnostic error, and it is subject to explicit and implicit biases, emotions, and distraction. However, stopping to think can cause delays, which can be reported as shortcomings by the tracking system. The culture must acknowledge uncertainty (fueled by false positives and negatives), address overconfidence, and promote feedback, especially from patients.
Errors and near misses need to be reported without liability or shame
The culture should regard reporting an adverse event as a routine and ordinary task. This is a big lift for people steeped in the hierarchy of healthcare and the impunity of its highest-ranked members. Another factor to be overcome is doctors’ reluctance to report errors because of their feelings of personal and professional shame.
Ofri speaks favorably of a “just culture,” one that recognizes that unintentional error is possible, that risky behavior like taking shortcuts requires (system) intervention, and that negligence should be disciplined. In addition, there should not be any bias, e.g., based on status, in how penalties are handed out.
In sum, Ofri says healthcare will always be an imperfect system. Ultimately, what patients want is acknowledgement of errors and apology for them from doctors.
Our Perspective
Ofri’s major contribution is her review of the evidence showing how pervasive medical errors are and how the healthcare industry works overtime to deny and avoid responsibility for them.
Her suggestions for a safer healthcare culture echo what we’ve been saying for years about the attributes of a strong safety culture. Reducing error rates will be hard for many reasons. For example, Ofri observes that medical training forges a lifelong personal identity and reverence for tradition; in our view, it also builds in resistance to change. The biases in decision making that she mentions are not trivial. For one discussion of such biases, see our Dec. 18, 2013 review of Daniel Kahneman’s work.
Bottom line: After you read this, you will be clutching your rosary a little tighter if you have to go to a hospital for a major injury or illness. You are more responsible for your own care than you think.
** For example, she cites a study reporting that almost 4% of hospitalizations resulted in medical injury, of which 14% were fatal, and an estimate that doctors’ diagnostic accuracy is in the range of 90%.
*** It has been suggested that the term “error” be replaced with “adverse medical event” to reduce the implicit focus on individuals.
**** Ofri believes genuine conversation with a patient is the doctor’s single most important diagnostic tool.
***** As an example of the power of money, when Medicare started fining hospitals for shortcomings, the hospitals started cleaning up their problems.