Wednesday, October 9, 2019

More on Mental Models in Healthcare

Our August 6, 2019 post discussed the appalling incidence of preventable harm in healthcare settings.  We suggested that a better mental model of healthcare delivery could contribute to reducing the incidence of preventable harm.  It will come as no surprise to Safetymatters readers that we are referring to a systems-oriented model.

We’ll use a 2014 article* by Nancy Leveson and Sidney Dekker to describe how a systems approach can lead to better understanding of why accidents and other negative outcomes occur.  The authors begin by noting that 70-90% of industrial accidents are blamed on individual workers.**  As a consequence, proposed fixes focus on disciplining, firing, or retraining individuals or, for groups, specifying their work practices in ever greater detail (the authors call this “rigidifying” work).  This is the Safety I mental model in a nutshell, limiting its view to the “what” and “who” of incidents.   

In contrast, systems thinking posits that the behavior of individuals can only be understood by examining the context in which it occurs.  The context includes management decision-making and priorities, regulatory requirements and deficiencies, and of course, organizational culture, especially safety culture.  Fixes that don’t consider the overall process almost guarantee that similar problems will arise in the future.  “. . . human error is a symptom of a system that needs to be redesigned.”  Systems thinking adds the “why” to incident analysis.

Every system has a designer, although they may not be identified as such and may not even be aware they’re “designing” when they specify work steps or flows, or define support processes, e.g., procurement or quality control.  Importantly, designers deal with an ideal system, not with the actual constructed system.  The actual system may differ from the designer's original specification because of inherent process variances, the need to address unforeseen conditions, or evolution over time.  Official procedures may be incomplete, e.g., they may omit unlikely but possible conditions or assume that certain conditions cannot occur.  However, the people doing the work must deal with the constructed system, however imperfect, and the conditions that actually occur.

The official procedures present a double-edged threat to employees.  If they adapt procedures in the face of unanticipated conditions, and the adaptation turns out to be ineffective or leads to negative outcomes, employees can be blamed for not following the procedures.  On the other hand, if they stick to the procedures when conditions suggest they should be adapted and negative outcomes occur, the employees can be blamed for too rigidly following them.

Personal blame is a major problem in Safety I.  “Blame is the enemy of safety . . . it creates a culture where people are afraid to report mistakes . . . A safety culture that focuses on blame will never be very effective in preventing accidents.”

Our Perspective

How does the above relate to reducing preventable harm in healthcare?  We believe that structural and cultural factors impede the application of systems thinking in the healthcare field.  These factors keep the field stuck in a Safety I worldview no matter how much it pretends otherwise.

The hospital as formal bureaucracy

When we say “healthcare” we are referring to large organizations that provide medical care; the hospital is our smallest unit of analysis.  A hospital is literally a textbook example of what organizational theorists call a formal bureaucracy.  It has specialized departments with an official division of authority among them—silos are deliberately created and maintained.  An administrative hierarchy mediates among the silos and attempts to guide them toward overall goals.  The organization is deliberately impersonal to avoid favoritism, and behavior is prescribed, proscribed, and guided by formal rules and procedures.  It appears hospitals were deliberately designed to promote Safety I thinking and its inherent bias for blaming the individual for negative outcomes.

Employees have two major strategies for avoiding blame: strong occupational associations and plausible deniability. 

Powerful guilds and unions 


Medical personnel are protected by their silo and tribe.  Department heads defend their employees (and their turf) from outsiders.  The doctors effectively belong to a guild that jealously guards their professional authority; the nurses and other technical fields have their unions.  These unofficial and official organizations exist to protect their members and promote their interests.  They do not exist to protect patients, although they certainly tout such interest when pushing for increased employee headcounts.  A key cultural value is that members do not rat on other members of their tribe, so problems may be observed but go unreported.

Hiding behind the procedures

In this environment, the actual primary goal is to conform to the rules, not to serve clients.  The safest course for the individual employee is to follow the rules and procedures, independent of the effect this may have on a patient.  The culture espouses a value of patient safety, but what gets a higher value is plausible deniability: the ability to avoid personal responsibility, i.e., blame, by hiding behind established practices and rules when negative outcomes occur.

An enabling environment 


The environment surrounding healthcare allows providers to continue delivering a level of service that literally kills patients.  Data opacity means it’s very difficult to get reliable information on patient outcomes.  Hospitals with high failure rates simply claim they are stuck with or choose to serve the sickest patients.  Weak malpractice laws are promoted by the doctors’ guild and maintained by the politicians they support.  Society in general is overly tolerant of bad medical outcomes.  Some families may make a fuss when a relative dies from inadequate care, but settlements are paid, non-disclosure agreements are signed, and the enterprise moves on.

Bottom line: It will take powerful forces to get the healthcare industry to adopt true systems-oriented thinking and identify the real reasons why preventable harm occurs and what corrective actions could be effective.  The industry claims to promote evidence-based medicine; it needs to add evidence-based harm reduction strategies.  Industry-wide adoption of the aviation industry’s confidential reporting system for errors would be a big step forward.


*  N. Leveson and S. Dekker, “Get To The Root Of Accidents,” ChemicalProcessing.com (Feb 27, 2014).  Retrieved Oct. 7, 2019.  Leveson is an MIT professor and long-standing champion of systems thinking; Dekker has written extensively on Just Culture and Safety II concepts.  Click on their respective labels to pull up our other posts on their work.

**  The article is tailored for the process industry but the same thinking can be applied to service industries.
