Friday, November 9, 2018

Nuclear Safety Culture: Lessons from Turn the Ship Around! by L. David Marquet

Turn the Ship Around!* was written by a U.S. Navy officer who was assigned to command a submarine with a poor performance history.  He adopted a management approach that was radically different from the traditional top-down, leader-follower, “I say, you do” Navy model for controlling people.  The new captain’s primary method was to push decision making down to the lowest practical organizational levels; he supported his crew’s new authorities (and maintained control of the overall situation) with strategies to increase their competence and provide clarity on the organization’s purpose and goals.

Specific management practices were implemented or enhanced to support the overall approach.  For example, decision making guidelines were developed and disseminated.  Attaining goals was stressed over mindlessly following procedures.  Crew members were instructed to “think out loud” before initiating action; this practice communicated intent and increased organizational resilience because it created opportunities for others to identify potential errors before they could occur and propagate.  Pre-job briefs were changed from the supervisor reciting the procedure to asking participants questions about their roles and preparation.

As a result, several organizational characteristics that we have long promoted became more evident, including deferring to expertise (getting the most informed, capable people involved with a decision), increased trust, and a shared mental model of vision, purpose and organizational functioning.

As you can surmise, his approach worked.  (If it hadn’t, Marquet’s career would have been cut short and there would be no book.)  All significant operational and personnel metrics improved under his command.  His subordinates and other crew members became highly promotable.  Importantly, the boat’s performance continued at a high level after he completed his tour; in other words, he established a system for success that could live on without his personal involvement.

Our Perspective 


This book provides a sharp contrast to nuclear industry folklore that promotes strong, omniscient leadership as the answer to every problem situation.  Marquet did not act out the role of the lone hero; instead, he built a management system that created superior performance while he was in command and after he moved on.  There can be valuable lessons here for nuclear managers, but one has to appreciate the particular requirements for undertaking this type of approach.

The manager’s attitude

You have to be willing to share some (maybe a lot) of your authority with your subordinates, their subordinates and so on down the line, while still being held to account by your bosses for your unit’s performance.  Not everyone can do this.  It requires faith in the new system and your people, and a certain detachment from short-term concerns about your own career.  You also need sufficient self-awareness to learn from mistakes as you move forward and to recognize when you are failing to walk the talk with your subordinates.

In Marquet’s case, there were two important precursors to his grand experiment.  First, he had seen on previous assignments how demoralizing top-down micromanagement could be vs. how liberating and motivating it was for him (as a subordinate officer) to actually be allowed to make decisions.  Second, he had been training for a year to command a sub of a design different from the boat to which he was eventually assigned; he couldn’t go in and micromanage everyone from the get-go because he didn’t have sufficient technical knowledge.

The work environment

Marquet had one tremendous advantage: from a social perspective, a submarine is largely a self-contained world.  He did not have to worry about what people in the department next door were doing; he only had to get his remote boss to go along with his plan.  If you’re a nuclear plant department head and you want to adopt this approach but the rest of the organization runs top-down, it may be rough sledding unless you do lots of prep work to educate your superiors and get them to support you, perhaps for a pilot or trial project.

The book is easy reading, with short chapters, lots of illustrative examples (including some interesting information on how the Navy and nuclear submarines work), useful how-to lists, and discussion questions at the end of chapters.  Marquet did not invent his approach or techniques out of thin air.  For example, some of his ideas and prescriptions, including rejecting the traditional Navy top-down leadership model, setting clear goals, providing principles for guiding decision making, enforcing reflection after making mistakes, giving people tools and advantages but holding them accountable, and culling people who can’t get with the program,** are similar to points in Ray Dalio’s Principles, which we reviewed on April 17, 2018.  This is not surprising.  Effective, self-aware leaders should share some common managerial insights.

Bottom line: Read this book to see a real-world example of how authentic employee empowerment can work.


*  L.D. Marquet, Turn the Ship Around! (New York: Penguin, 2012).  This book was recommended to us by a Safetymatters reader.  Please contact us if you have any material you would like us to review.

**  People have different levels of appetite for empowerment or other forms of participatory management.  Not everyone wants to be fully empowered, highly self-motivated or expected to show lots of initiative.  You may end up with employees who never buy into your new program and, in the worst case, you won’t be allowed to get rid of them.

Monday, October 29, 2018

Safety Culture: What are the Contributors to “Bad” Outcomes Versus “Good” Outcomes and Why Don’t Some Interventions Lead to Improved Safety Performance?

Sidney Dekker recently revisited* some interesting research he led at a large health care authority.  The authority’s track record was not atypical for health care: 1 out of 13 (7%) patients was hurt in the process of receiving care.  The authority investigated the problem cases and identified a familiar cluster of negative factors, including workarounds, shortcuts, violations, guidelines not followed, errors and miscalculations—the list goes on.  The interventions will also be familiar to you—identify who did what wrong, add more rules, try harder and get rid of bad apples—but they were not reducing the adverse event rate.

Dekker’s team took a different perspective and looked at the 93% of patients who were not harmed.  What was going on in their cases?  To their surprise, the team found the same factors: workarounds, shortcuts, violations, guidelines not followed, errors and miscalculations, etc.** 

Dekker uses this research to highlight a key difference between the traditional view of safety management, Safety I, and the more contemporary view, Safety II.  At its heart, Safety I believes the source of problems lies with the individual, so interventions focus on ways to make the individual’s work behavior more reliable, i.e., less likely to deviate from the idealized form specified by work designers.  Safety I ignores the fact that the same imperfections exist in work with both successful and problematic outcomes.

In contrast, Safety II sees the source of problems in the system, the dynamic combination of technology, environmental factors, organizational aspects, and individual cognition and choices.  Referencing the work of Diane Vaughan, Dekker says “the interior life of organizations is always messy, only partially well-coordinated and full of adaptations, nuances, sacrifices and work that is done in ways that is quite different from any idealized image of it.”

Revisiting the data revealed that the work with good outcomes was different.  This work had more positive characteristics, including diversity of professional opinion and the possibility to voice dissent, keeping the discussion on risk alive and not taking past success as a guarantee for safety, deference to proven expertise, widely held authority to say “stop,” and pride of workmanship.  As you know, these are important characteristics of a strong safety culture.

Our Perspective

Dekker’s essay is a good introduction to the differences between Safety I and Safety II thinking, most importantly their differing mental models of the way work is actually performed in organizations.  In Safety I, the root cause of imperfect results is the individual, and constant efforts (e.g., training, monitoring, leadership, discipline) are necessary to create and maintain the individual’s compliance with work as designed.  In Safety II, normal system functioning leads to mostly good and occasionally bad results.  The focus of Safety II interventions should be on activities that increase individual capacity to affect system performance and/or increase system robustness, i.e., error tolerance and an increased chance of recovery when errors occur.

If one applies Safety I thinking to a “bad” outcome, then the most likely result of an effective intervention is that the exact same problem will not happen again.  This thinking sustains a robust cottage industry in root-cause analysis: because no changes are made to the system itself, new problems will always arise.

We like Dekker’s (and Vaughan’s) work and have reported on it several times in Safetymatters (click on the Dekker and Vaughan labels to bring up related posts).  We have been emphasizing some of the same points, especially the need for a systems view, since we started Safetymatters almost ten years ago.

Individual Exercise: Again drawing on Vaughan, Dekker says “there is often no discernable difference between the organization that is about to have an accident or adverse event, and the one that won’t, or the one that just had one.”  Look around your organization and review your career experience; is that true?


*  S. Dekker, “Why Do Things Go Right?,” SafetyDifferently website (Sept. 28, 2018).  Retrieved Oct. 25, 2018.

**  This is actually rational.  People operate on feedback, and if the shortcuts, workarounds and disregarded guidelines did not lead to acceptable (or at least tolerable) results most of the time, folks would stop using them.