
Friday, July 29, 2022

A Lesson from the Accounting Profession: Don’t Cheat on the Ethics Test

SEC Order

Accounting, like many professions, requires practitioners to regularly demonstrate competence and familiarity with relevant knowledge and practices.  One requirement for Certified Public Accountants (CPAs) is to take an on-line, multiple-choice test covering professional ethics.  Sounds easy but the passing grade is relatively high so it’s not a slam dunk.  Some Ernst & Young (EY) audit accountants found it was easier to pass if they cheated by using answer keys and sharing the keys with their colleagues.  They were eventually caught and got into big trouble with the U.S. Securities and Exchange Commission (SEC).  Following is a summary of the scandal as it evolved over time per the SEC order* and our view on what the incident says about EY’s culture.

During 2012-15, some EY employees were exploiting weaknesses in the company’s test software to pass tests despite not having a sufficient number of correct answers.  EY learned about this problem in 2014.  In 2016, EY learned that professionals in one office improperly shared answer keys.  EY repeatedly warned personnel that cheating on tests was a violation of the firm’s code of ethics but did not implement any additional controls to detect this misconduct.  The cheating continued into 2021.

In 2019 the SEC discovered cheating at another accounting firm and fined them $50 million.  As part of the SEC’s 2019 investigation, the agency asked EY if they had any problems with cheating.  In their response, EY said they had uncovered instances in the past but implied they had no current problems.  In fact, EY management had recently received a tip about cheating and initiated what turned out to be an extensive investigation that by late 2019 “confirmed that audit professionals in multiple offices cheated on CPA ethics exams.” (p. 6)  However, EY never updated their response to the SEC.  Eventually EY told the Public Company Accounting Oversight Board (PCAOB)** about the problems, and the PCAOB informed the SEC – 9 months after the SEC’s original request for information from EY.

In the U.S., the relationship between government regulators and regulated entities is based on the expectation that communications from the regulated entities will be complete, truthful, and updated on a timely basis if new information is discovered or developed.  Lying to or misleading the government, either through commission or omission, is a serious matter.

Because of EY’s violation of a PCAOB rule and EY’s misleading behavior with the SEC, the company was censured, fined $100 million, and required to implement a host of corrective actions, summarized below.

Review of Policies and Procedures

“EY shall evaluate . . . the sufficiency and adequacy of its quality controls, policies, and procedures relevant to ethics and integrity and to responding to Information Requests” (p. 9)  In particular, EY will evaluate “whether EY’s culture [emphasis added] is supportive of ethical and compliant conduct and maintaining integrity, including strong, explicit, and visible support and commitment by the firm’s management” (p. 10)

Independent Review of EY’s Policies and Procedures

“EY shall require that the Policies and Procedures IC [Independent Consultant] conduct a review of EY’s Policies and Procedures to determine whether they are designed and being implemented in a manner that provides reasonable assurance of compliance with all professional standards . . . . EY shall adopt, as soon as practicable, all recommendations of the Policies and Procedures IC in its report. . . . EY’s Principal Executive Officer must certify to the Commission staff in writing that (i) EY has adopted and has implemented or will implement all recommendations of the Policies and Procedures IC in its report . . .” (pp. 10-12)

Independent Review of EY’s Disclosure Failures

“EY’s Special Review Committee shall require that the Remedial IC conduct a review . . . of EY’s conduct relating to the Commission staff’s June 2019 Information Request, including whether any member of EY’s executive team, General Counsel’s Office, compliance staff, or other EY employees contributed to the firm’s failure to correct its misleading submission.” (p. 12)  Like the Policies and Procedures review, EY must adopt the recommendations in the Remedial IC Report and EY’s Principal Executive Officer must certify their adoption to the SEC.

Notice to Audit Clients, Training, and Certifications

“Within 10 business days after entry of this Order, EY shall provide all of its issuer audit clients and SEC-registered broker-dealer audit clients a copy of this Order. . . . all audit professionals and all EY partners and employees who, at any time prior to March 3, 2020, were aware (i) of the Division of Enforcement’s June 19, 2019 request, (ii) of EY’s June 20, 2019 response, and (iii) that an employee had made a tip on June 19, 2019 concerning cheating shall complete a minimum of 6 hours every 6 months of ethics and integrity training by an independent training provider . . . . EY’s Principal Executive Officer shall also certify that the training requirements . . . have been completed.” (pp. 14-15)

Our Perspective

A company’s culture includes the values and assumptions that underlie daily work life and influence decision making.  What can we infer about EY’s culture from the behavior described above?

First, what managers did after they discovered the cheating – issuing memos and waving their arms – did not work.  Even if EY terminated some employees, perhaps the worst offenders or maybe the least productive ones, EY did not make their testing process more robust or secure.

Second, senior leadership has not suffered from this scandal.  There is no indication any senior managers have been disciplined or terminated because of the misconduct.  The head of EY’s U.S. operations left at the end of her 4-year term, but her departure was apparently due to a disagreement with her boss, EY’s global chief executive. 

Third, there has been no apparent change in the employees’ task environments, e.g., their workload expectations and compensation program.

Conclusion: EY management tolerated the cheating because their more important priorities were elsewhere.  It’s safe to assume that EY, like other professional service firms, primarily values and rewards technical competence and maximizing billable hours.

We see two drivers for possible changes: the $100 million fine and the mandated review by “Independent Consultants.”  (EY’s self-review will likely be no more useful than their previous memos and posturing.)

What needs to be done? 

To begin, senior leadership has to say that fixing the cheating problem is vitally important, and walk the talk by adjusting company practices to reinforce the task’s importance.  Leadership has to commit to a corrective action program that recognizes, analyzes, and permanently fixes all significant company problems as they arise – not only after the regulator rubs the company’s nose in them.

In addition, there have to be visible changes in the audit professionals’ task environment.  The employees need to get work time, in the form of unbilled overhead hours, to prepare for tests.  The compensation scheme needs to add a component to recognize and reward ethical behavior – with clients and internally.  The administration of ethics tests needs to be made more secure, on a par with the accounting exams the employees take.

*  Securities and Exchange Commission, Other Release No.: 34-95167 Re: Ernst & Young LLP (June 28, 2022).  All quotes in our post are from the SEC order.  There is also an associated SEC press release.

**  The Public Company Accounting Oversight Board establishes auditing and professional practice standards for registered public accounting firms, such as EY, to follow in the preparation of audit reports for public companies.  PCAOB members are appointed by the SEC.

Friday, December 10, 2021

Prepping for Threats: Lessons from Risk: A User’s Guide by Gen. Stanley McChrystal.

Gen. McChrystal was a U.S. commander in Afghanistan; you may remember he was fired by President Obama for making, and allowing subordinates to make, disparaging comments about then-Vice President Biden.  However, McChrystal was widely respected as a soldier and leader, and his recent book* on strengthening an organization’s “risk immune system” caught our attention.  This post summarizes its key points, focusing on items relevant to formal civilian organizations.

McChrystal describes a system that can detect, assess, respond to, and learn from risks.**  His mental model consists of two major components: (1) ten Risk Control Factors, interrelated dimensions for dealing with risks and (2) eleven Solutions, strategies that can be used to identify and address weaknesses in the different factors.  His overall objective is to create a resilient organization that can successfully respond to challenges and threats. 

Risk Control Factors

These are things under the control of an organization and its leadership, including physical assets, processes, practices, policies, and culture.

Communication – The organization must have the physical ability and willingness to exchange clear, complete, and intelligible information, and identify and deal with propaganda or misinformation.

Narrative – An articulated organizational purpose and mission.  It describes Who we are, What we do, and Why we do it.  The narrative drives (and we’d say is informed by) values, beliefs, and action.

Structure – Organizational design defines decision spaces and communication networks, implies power (both actual and perceived authority), suggests responsibilities, and influences culture.

Technology – This is both the hardware/software and how the organization applies it.  It includes an awareness of how much authority is being transferred to machines, our level of dependence on them, our vulnerability to interruptions, and the unintended consequences of new technologies.

Diversity – Leaders must actively leverage different perspectives and abilities, inoculate the organization against groupthink, i.e., norms of consensus, and encourage productive conflict and a norm of skepticism.  (See our June 29, 2020 post on A Culture that Supports Dissent: Lessons from In Defense of Troublemakers by Charlan Nemeth.)

Bias – Biases are assumptions about the world that affect our outlook and decision making, and cause us to ignore or discount many risks.  In McChrystal’s view, “[B]ias is an invisible hand driven by self-interest.”  (See our July 1, 2021 and Dec. 18, 2013 posts on Daniel Kahneman’s work on identifying and handling biases.)

Action – Leaders have to proactively overcome organizational inertia, i.e., a bias against starting something new or changing course.  Inertia manifests in organizational norms that favor the status quo and tolerate internal resistance to change.

Timing – Getting the “when” of action right.  Leaders have to initiate action at the right time with the right speed to yield optimum impact.

Adaptability – Organizations have to respond to changing risks and environments.  Leaders need to develop their organization’s willingness and ability to change.

Leadership – Leaders have to direct and inspire the overall system, and stimulate and coordinate the other Risk Control Factors.  Leaders must communicate the vision and personify the narrative.  In practice, they need to focus on asking the right questions and sense the context of a given situation, embracing the new before necessity is evident. (See our Nov. 9, 2018 post for an example of effective leadership.)


Solutions

The Solutions are strategies or methods to identify weaknesses in and strengthen the risk control factors.  In McChrystal’s view, each Solution is particularly applicable to certain factors, as shown in Table 1.

Assumptions check – Assessment of the reasonableness and relative importance of assumptions that underlie decisions.  It’s the qualitative and quantitative analyses of strengths and weaknesses of supporting arguments, modified by the judgment of thoughtful people.

Risk review – Assessment of when hazards may arrive and the adequacy of the organization’s preparations.

Risk alignment check – Leaders should recognize that different perspectives on risks exist and should ensure those perspectives are considered in the overall response.

Gap analysis – Identify the space between current actions and desired goals.

Snap assessment – Short-term, limited scope analyses of immediate hazards.  What’s happening?  How well are we responding?

Communications check – Ensure processes and physical systems are in place and working.

Tabletop exercise – A limited duration simulation that tests specific aspects of the organization’s risk response.

War game (functional exercise) – A pressure test in real time to show how the organization comprehensively reacts to a competitor’s action or unforeseen event.

Red teaming – Exercises involving third parties to identify organizational vulnerabilities and blind spots.

Pre-mortem – A discussion focusing on the things most likely to go wrong during the execution of a plan.

After-action review – A self-assessment that identifies things that went well and areas for improvement.


Table 1  Created by Safetymatters


Our Perspective

McChrystal did not invent any of his Risk Control Factors and we have discussed many of these topics over the years.***  His value-add is organizing them as a system and recognizing their interrelatedness.  The entire system has to perform to identify, prepare for, and respond to risks, i.e., threats that can jeopardize the organization’s mission success.

This review emphasizes McChrystal’s overall risk management model.  The book also includes many examples of risks confronted, ignored, or misunderstood in the military, government, and commercial arenas.  Some, like Blockbuster’s failure to acquire Netflix when it had the opportunity, had poor outcomes; others, like the Cuban missile crisis or Apollo 13, worked out better.

The book appears aimed at senior leaders but all managers from department heads on up can benefit from thinking more systematically about how their organizations respond to threats from, or changes in, the external environment. 

There are hundreds of endnotes to document the text but the references are more Psychology Today than the primary sources we favor.

Bottom line: This is an easy to read example of the “management cookbook” genre.  It has a lot of familiar information in one place.


*  S. McChrystal and A. Butrico, Risk: A User’s Guide (New York: Portfolio) 2021.  Butrico is McChrystal’s speechwriter.

**  Risk to McChrystal is a combination of a threat and one’s vulnerability to the threat.  Threats are usually external to the organization while vulnerabilities exist because of internal aspects.

***  For example, click on the Management or Decision Making labels to pull up posts in related areas.

Tuesday, February 2, 2021

Organizational Change and System Dynamics Insights from The Tipping Point by Malcolm Gladwell

The Tipping Point* is a 2002 book by Malcolm Gladwell (who also wrote Blink) that uses the metaphor of a viral epidemic to explain how some phenomenon, e.g., a product**, an idea, or a social norm, can suddenly reach a critical mass and propagate rapidly through society.  Following is a summary of his key concepts.  Some of his ideas can inform strategies for implementing organizational change, especially cultural change, and reflect attributes of system dynamics that we have promoted on Safetymatters.

In brief, epidemics spread when they have the right sort of people to transmit the infectious agent, the agent itself has an attribute of stickiness, and the environment supports the agent and facilitates transmission. 


The Law of the Few

An epidemic thrives on three different types of people: people who connect with lots of other people, people who learn about a new product or idea and are driven to tell others, and persuasive people who sell the idea to others.  All these messengers drive contagiousness although all three types are not required for every kind of epidemic.


The Stickiness Factor

A virus needs to attach itself to a host; a new product promotion needs to be memorable, i.e., stick in people’s minds and spur them to action, for example Wendy’s “Where’s the beef?” campaign or the old “Winston tastes good . . .” jingle.  Information about the new product or idea needs to be packaged in a way that makes it attractive and difficult to resist.


The Power of Context

General and specific environmental characteristics can encourage or discourage the spread of a phenomenon.  For a general example in the social environment consider the Broken Windows theory which holds that intolerance of the smallest infractions can lead to overall reductions in crime rates.

At the more specific level, humans possess a set of tendencies that can be affected by the particular circumstances of their immediate environment.  For example, we are more likely to comply with someone in a uniform (a doctor, say, or a police officer) than a scruffy person in jeans.  If people believe there are many witnesses to a crime, it’s less likely that anyone will try to stop or report the criminal activity; individual responsibility is diffused to the point of inaction.      

Our Perspective

We will expand some of Gladwell’s notions to emphasize how they can be applied to create organizational changes, including cultural change.  In addition, we’ll discuss how the dynamics he describes square with some aspects of system dynamics we have promoted on Safetymatters.

Organizational change

Small close-knit groups have the potential to magnify the epidemic potential of a message or idea.  “Close-knit” means people know each other well and even store information with specific individuals (the subject matter experts) to create a kind of overall group memory.  These bonds of memory and peer pressure can facilitate the movement of new ideas into and around the group, affecting the group’s shared mental models of everything from the immediate task environment to the larger outside world.  Many small movements can create larger movements that manifest as new or modified group norms.

In a product market, diffusion moves from innovators to early adopters to the majority and finally the laggards.  A similar model of diffusion can be applied in a formal organization.  Organizational managers trying to implement cultural changes should consider this diffusion model when they are deciding who to appoint to initiate, promote, and promulgate new or different cultural values or behaviors.  Ideally, they should start with well-connected, respected people who buy into the new attributes, can explain them to others, and influence others to try the new behaviors.
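Gladwell doesn’t give a formal model, but the adopter categories above echo the classic Bass diffusion model from marketing science.  The following Python sketch is our illustration, not the book’s; the parameter values are typical textbook defaults, chosen only to show the characteristic S-curve in which adoption starts slowly with innovators, accelerates as imitators join, then saturates.

```python
# Illustrative Bass diffusion model (our sketch, not from The Tipping Point).
# Adoption per period comes from innovators (acting independently) plus
# imitators (influenced by how many people have already adopted).
def bass_diffusion(p=0.03, q=0.38, m=1000, steps=30):
    """Return cumulative adopters over time.
    p: innovation coefficient (adoption independent of others)
    q: imitation coefficient (adoption driven by prior adopters)
    m: total population that could eventually adopt
    """
    adopters = [0.0]
    for _ in range(steps):
        n = adopters[-1]
        # innovators + imitators this period, limited by remaining non-adopters
        new = (p + q * n / m) * (m - n)
        adopters.append(n + new)
    return adopters

curve = bass_diffusion()
# Adoption is slow at first, peaks in the middle periods as imitators
# pile on, and tapers off as the population saturates.
```

The imitation term `q * n / m` is what produces the nonlinear takeoff: each adopter makes further adoption more likely, which is the same dynamic a manager hopes to trigger by seeding change with well-connected, respected people.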

System dynamics

This whole book is about how intrusions can disrupt an existing social system, for good or bad, and result in epidemic, i.e., nonlinear effects.  This nonlinearity helps explain why systems can be operating more or less normally, then suddenly veer into failure.  Active management deliberately tries to create such changes to veer into success.  Just think about how social media has upset the loci of power in our society: elected leaders and experts now have larger megaphones but so does the mob.

That said, Gladwell presents a linear, cause-and-effect model for change.  He does not consider more complex system features such as feedback loops or deliberate attempts to modify, deflect, co-opt or counteract the novel input.  For example, a manager can try to establish new behaviors by creating a reinforcing loop of rewards and recognition in a small group, and then recreating it on an ever-larger scale.
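As a hypothetical sketch of that reinforcing loop (our construction, with made-up parameter values), consider a simple stock-and-flow model: recognition of the new behavior recruits more adopters, adoption fades without reinforcement, and the size of the group acts as a balancing limit.

```python
# Minimal system-dynamics sketch of a reinforcing loop (our illustration).
# Recognition of a behavior breeds more of it; growth is bounded by the
# group size (a balancing limit) and eroded by decay without reinforcement.
def reinforcing_loop(group_size=50, seed_adopters=2, reinforcement=0.3,
                     decay=0.05, steps=40):
    """Return the number of people exhibiting the behavior over time."""
    level = float(seed_adopters)
    history = [level]
    for _ in range(steps):
        # Reinforcing flow: current adopters recruit others, limited by
        # how many non-adopters remain in the group.
        gain = reinforcement * level * (1 - level / group_size)
        loss = decay * level  # behavior fades without reinforcement
        level = max(0.0, level + gain - loss)
        history.append(level)
    return history

history = reinforcing_loop()
# Starting from a few seed adopters, the behavior spreads through most of
# the group and settles at an equilibrium below full adoption.
```

Unlike Gladwell’s linear cause-and-effect account, even this toy model exhibits feedback: the outcome at each step feeds the next step, which is why small early interventions (a few seed adopters, a slightly higher reinforcement rate) can produce disproportionately large end states.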

Bottom line: This is easy reading with lots of interesting case studies and quotes from talking head PhDs.  The book comes across as a long magazine article. 


*  M. Gladwell, The Tipping Point (New York: Back Bay Books/Little, Brown and Co.) 2000 and 2002.

**  “Product” is used in its broadest sense; it can mean something physical like a washing machine, a political campaign, a celebrity wannabe, etc.

Monday, April 1, 2019

Culture Insights from The Speed of Trust by Stephen M.R. Covey

In The Speed of Trust,* Stephen M.R. Covey posits that trust is the key competency that allows individuals (especially leaders), groups, organizations, and societies to work at optimum speed and cost.  In his view, “Leadership is getting results in a way that inspires trust.” (p. 40)  We saw the book mentioned in an NRC personnel development memo** and figured it was worth a look. 

Covey presents a model of trust made up of a framework, language to describe the framework’s components, and a set of recommended behaviors.  The framework consists of self trust, relationship trust and stakeholder trust.  Self trust is about building personal credibility; relationship trust is built on one’s behavior with others; and stakeholder trust is built within organizations, in markets (i.e., with customers), and over the larger society.  His model is not overly complicated but it has a lot of parts, as shown in the following figure.

Figure by Safetymatters

4 Cores of credibility 

Covey begins by describing how the individual can learn to trust him or herself.  This is basically an internal process of developing the 4 Cores of credibility: character attributes (integrity and intent) and competence attributes (capabilities and results).  Improvement in these areas increases self-confidence and one’s ability to project a trust-inspiring strength of character.  Integrity includes clarifying values and following them.  Intent includes a transparent, as opposed to hidden, agenda that drives one’s behavior.  Capabilities include the talents, skills, and knowledge, coupled with continuous improvement, that enable excellent performance.  Results, e.g., achieving goals and keeping commitments, are sine qua non for establishing and maintaining credibility and trust.

13 Behaviors  

The next step is learning how to trust and be trusted by others.  This is a social process, i.e., it is created through individual behavior and interaction with others.  Covey details 13 types of behavior to which the individual must attend.  Some types flow primarily, but not exclusively, from character, others from competence, and still others from a combination of the two.  He notes that “. . . the quickest way to decrease trust is to violate a behavior of character, while the quickest way to increase trust is to demonstrate a behavior of competence.” (p. 133)  Covey provides examples of each desired behavior, its opposite, and its “counterfeit” version, i.e., where people are espousing the desired behavior but actually avoiding doing it.  He describes the problems associated with underdoing and overdoing each behavior (an illustration of the Goldilocks Principle).  Behavioral change is possible if the individual has a compelling sense of purpose.  Each behavior type is guided by a set of principles, different for each behavior, as shown in the following figure.

Figure by Safetymatters

Organizational alignment

The third step is establishing trust throughout an organization.  The primary mechanism for accomplishing this is alignment of the organization’s visible symbols, underlying structures, and systems with the ideals expressed in the 4 Cores and 13 Behaviors, e.g., making and keeping commitments and accounting for results.  He describes the “taxes” associated with a low-trust organization and the “dividends” associated with a high-trust organization.  Beyond that, there is nothing new in this section.

Market and societal trust

We’ll briefly address the final topics.  Market trust is about an entity’s brand or reputation in the outside world.  Building a strong brand involves using the 4 Cores to establish, maintain or strengthen one’s reputation.  Societal trust is built on contribution, the value an entity creates in the world through ethical behavior, win-win business dealings, philanthropy and other forms of corporate social responsibility.     

Our Perspective 

Covey provides a comprehensive model of how trust is integral to relationships at every level of complexity, from the self to global relations.

The fundamental importance of trust is hardly news.  We have long said organization-wide trust is vital to a strong safety culture.  Trust is a lubricant for organizational friction which, like physical friction, slows down activities and makes them more expensive.  In our Safetysim*** management simulator, trust was an input variable that affected the speed and effectiveness of problem resolution and overall cost performance.

Covey’s treatment of culture is incomplete.  While he connects some of his behaviors or principles to organizational culture,**** he never actually defines culture.  It appears he thinks culture is something that “just is” or, perhaps, a consequence or artifact of performing the behaviors he prescribes.  It’s reasonable to assume Covey believes motivated individuals can behave their way to a better culture, saying “. . . behave your way into the person you want to be.” (pp. 87, 130)  His view is consistent with culture change theorists who believe people will eventually develop desired values if they model desired behavior long enough.  His recipe for cultural change boils down to “Just do it.”  We prefer a more explicit definition of culture, something along the spectrum from the straightforward notion of culture as an underlying set of values to the idea of culture as an emergent property of a complex socio-technical system. 

Trust is not the only candidate for the primary leadership or organizational competence.  The same or similar arguments could also be made about respect.  (Covey mentions respect but only as one of his 13 behaviors.)  Two-way respect is also essential for organizational success.  This leads to an interesting question: Could you respect a leader without trusting him/her?  How about some of the famous hard-ass bosses of management lore, like Harold Geneen?  Or General Patton? 

Covey is obviously a true believer in his message and his presentation has a fervor one normally associates with religious zeal.  He also includes many examples of family situations and describes how his prescriptions can be applied to families.  (Helpful if you want to manage your family like a little factory.)  Covey is a devout Mormon and his faith comes through in his writing. 

The book is an easy read.  Like many books written by successful consultants, it is interspersed with endorsements and quotes from business and political notables.  Covey includes a couple of useful self-assessment surveys.  He also offers a valuable observation: “. . . people tend to judge others based on behavior and judge themselves based on intent.” (p. 301)

Bottom line: This book is worth your time if lack of trust is a problem in your organization.

*  Stephen M. R. Covey, The Speed of Trust (New York: Free Press, 2016).  If the author’s name sounds familiar, it may be because his father, Stephen R. Covey, wrote The 7 Habits of Highly Effective People, a popular self-help book.

**  “Fiscal Year (FY) 2018 FEORP Plan Accomplishments and Successful/Promising Practices at the U.S. Nuclear Regulatory Commission (NRC),” Dec. 17, 2018.  ADAMS ML18351A243.  The agency uses The Speed of Trust concepts in manager and employee training. 

***  Safetysim is a management training simulation tool developed by Safetymatters’ Bob Cudlin.

****  For example, “A transparent culture of learning and growing will generally create credibility and trust, . . .” (p. 117)

Friday, November 9, 2018

Nuclear Safety Culture: Lessons from Turn the Ship Around! by L. David Marquet

Turn the Ship Around!* was written by a U.S. Navy officer who was assigned to command a submarine with a poor performance history.  He adopted a management approach that was radically different from the traditional top-down, leader-follower, “I say, you do” Navy model for controlling people.  The new captain’s primary method was to push decision making down to the lowest practical organizational levels; he supported his crew’s new authorities (and maintained control of the overall situation) with strategies to increase their competence and provide clarity on the organization’s purpose and goals.

Specific management practices were implemented or enhanced to support the overall approach.  For example, decision making guidelines were developed and disseminated.  Attaining goals was stressed over unconsciously following procedures.  Crew members were instructed to “think out loud” before initiating action; this practice communicated intent and increased organizational resilience because it created opportunities for others to identify potential errors before they could occur and propagate.  Pre-job briefs were changed from the supervisor reciting the procedure to asking participants questions about their roles and preparation.

As a result, several organizational characteristics that we have long promoted became more evident, including deferring to expertise (getting the most informed, capable people involved with a decision), increased trust, and a shared mental model of vision, purpose and organizational functioning.

As you can surmise, his approach worked.  (If it hadn’t, Marquet would have had a foreshortened career and there would be no book.)  All significant operational and personnel metrics improved under his command.  His subordinates and other crew members became highly promotable.  Importantly, the boat’s performance continued at a high level after he completed his tour; in other words, he established a system for success that could live on without his personal involvement.

Our Perspective 

This book provides a sharp contrast to nuclear industry folklore that promotes strong, omniscient leadership as the answer to every problem situation.  Marquet did not act out the role of the lone hero, instead he built a management system that created superior performance while he was in command and after he moved on.  There can be valuable lessons here for nuclear managers but one has to appreciate the particular requirements for undertaking this type of approach.

The manager’s attitude

You have to be willing to share some (maybe a lot) of your authority with your subordinates, their subordinates and so forth on down the line while still being held to account by your bosses for your unit’s performance.  Not everyone can do this.  It requires faith in the new system and your people and a certain detachment from short-term concerns about your own career.  You also need to have sufficient self-awareness to learn from mistakes as you move forward and recognize when you are failing to walk the talk with your subordinates.

In Marquet’s case, there were two important precursors to his grand experiment.  First, he had seen on previous assignments how demoralizing top-down micromanagement could be vs. how liberating and motivating it was for him (as a subordinate officer) to actually be allowed to make decisions.  Second, he had been training for a year on how to command a sub of a design different from the boat to which he was eventually assigned; he couldn’t go in and micromanage everyone from the get-go, he didn’t have sufficient technical knowledge.

The work environment

Marquet had one tremendous advantage: from a social perspective, a submarine is largely a self-contained world.  He did not have to worry about what people in the department next door were doing; he only had to get his remote boss to go along with his plan.  If you’re a nuclear plant department head and you want to adopt this approach but the rest of the organization runs top-down, it may be rough sledding unless you do lots of prep work to educate your superiors and get them to support you, perhaps for a pilot or trial project.

The book is easy reading, with short chapters, lots of illustrative examples (including some interesting information on how the Navy and nuclear submarines work), sufficient how-to lists, and discussion questions at the end of chapters.  Marquet did not invent his approach or techniques out of thin air.  As an example, some of his ideas and prescriptions, including rejecting the traditional Navy top-down leadership model, setting clear goals, providing principles for guiding decision making, enforcing reflection after making mistakes, giving people tools and advantages but holding them accountable, and culling people who can’t get with the program** are similar to points in Ray Dalio’s Principles, which we reviewed on April 17, 2018.  This is not surprising.  Effective, self-aware leaders should share some common managerial insights.

Bottom line: Read this book to see a real-world example of how authentic employee empowerment can work.

*  L.D. Marquet, Turn the Ship Around! (New York: Penguin, 2012).  This book was recommended to us by a Safetymatters reader.  Please contact us if you have any material you would like us to review.

**  People have different levels of appetite for empowerment or other forms of participatory management.  Not everyone wants to be fully empowered, highly self-motivated or expected to show lots of initiative.  You may end up with employees who never buy into your new program and, in the worst case, you won’t be allowed to get rid of them.

Tuesday, April 17, 2018

Nuclear Safety Culture: Insights from Principles by Ray Dalio

Ray Dalio is the billionaire founder/builder of Bridgewater Associates, an investment management firm.  Principles* catalogs his policies, practices and lessons-learned for understanding reality and making decisions for achieving goals in that reality.  The book appears to cover every possible aspect of managerial and organizational behavior.  Our plan is to focus on two topics near and dear to us—decision making and culture—for ideas that could help strengthen nuclear safety culture (NSC).  We will then briefly summarize some of Dalio’s other thoughts on management.  Key concepts are shown in italics.

Decision Making

We’ll begin with Dalio’s mental model of reality.  Reality is a system of universal cause-effect relationships that repeat and evolve like a perpetual motion machine.  The system dynamic is driven by evolution (“the single greatest force in the universe” (p. 142)) which is the process of adaptation.

Because many situations repeat themselves, principles (policies or rules) advance the goal of making decisions in a systematic, repeatable way.  Any decision situation has two major steps: learning (obtaining and synthesizing data about the current situation) and deciding what to do.  Logic, reason and common sense are the primary decision making mechanisms, supported by applicable existing principles and tools, e.g., expected value calculations or evidence-based decision making tools.  The lessons learned from each decision situation can be incorporated into existing or new principles.  Practicing the principles develops good habits, i.e., automatic, reflexive behavior in the specified situations.  Ultimately, the principles can be converted into algorithms that can be computerized and used to support the human decision makers.
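Dalio’s idea of converting principles into computable algorithms can be illustrated with an expected-value rule, one of the decision tools he mentions.  The sketch below is our own hypothetical illustration, not Bridgewater code; the function names, options and numbers are all assumed for the example.

```python
# Hypothetical sketch of a "principle as algorithm": pick the option with
# the highest expected value. All names and figures are illustrative.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs for one option."""
    return sum(p * payoff for p, payoff in outcomes)

def decide(options):
    """options: dict mapping option name -> list of (probability, payoff).
    Returns the name of the option with the highest expected value."""
    return max(options, key=lambda name: expected_value(options[name]))

# Example: an 80% chance of winning 100 vs. a guaranteed 70.
options = {
    "risky_bet": [(0.8, 100), (0.2, 0)],   # EV = 80
    "sure_thing": [(1.0, 70)],             # EV = 70
}
print(decide(options))  # -> risky_bet
```

Once a rule like this is written down, it can be applied the same way every time a similar situation recurs, which is the point of Dalio’s principles.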

Believability weighting can be applied during the decision making process to obtain data or opinions about solutions.  Believable people can be anyone in the organization but are limited to those “who 1) have repeatedly and successfully accomplished the thing in question, and 2) . . . can logically explain the cause-effect relationships behind their conclusions.” (p. 371)  Believability weighting supplements and challenges responsible decision makers but does not overrule them.  Decision makers can also make use of thoughtful disagreement where they seek out brilliant people who disagree with them to gain a deeper understanding of decision situations.
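As a rough illustration of how believability weighting could work in practice, here is a hypothetical sketch in which each opinion is weighted by the holder’s believability score.  The scoring scale and numbers are our own assumptions, not Dalio’s.

```python
# Hypothetical sketch of believability-weighted voting: each vote is
# weighted by the voter's believability score (track record plus ability
# to explain their reasoning). Scores here are illustrative only.

def weighted_vote(opinions):
    """opinions: list of (believability_weight, vote) pairs,
    where vote is 1 (yes) or 0 (no).
    Returns the believability-weighted share of 'yes' votes."""
    total = sum(weight for weight, _ in opinions)
    yes = sum(weight * vote for weight, vote in opinions)
    return yes / total

# Three people: a proven expert (weight 3) for, two novices (weight 1) against.
opinions = [(3, 1), (1, 0), (1, 0)]
print(weighted_vote(opinions))  # -> 0.6
```

Note the result supplements the responsible decision maker, consistent with Dalio’s caveat that believability weighting challenges but does not overrule the person accountable for the decision.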

The organization needs a process to get beyond disagreement.  After all discussion, the responsible party exercises his/her decision making authority.  Ultimately, those who disagree have to get on board (“get in sync”) and support the decision or leave the organization.

The two biggest barriers to good decision making are ego and blind spots.  Radical open-mindedness recognizes that the search for what’s true and the best answer is more important than the need for any specific person, no matter their position in the organization, to be right.


Culture

Organizations and the individuals who populate them should also be viewed as machines.  Both are imperfect but capable of improving.  The organization is a machine, made up of culture and people, that produces outcomes; those outcomes provide feedback from which learning can occur.  Mistakes are natural, but it is unacceptable not to learn from them.  Every problem is an opportunity to improve the machine.

People are generally imperfect machines.  People are more emotional than logical.   They suffer from ego (subconscious drivers of thoughts) and blind spots (failure to see weaknesses in themselves).  They have different character attributes.  In short, people are all “wired” differently.  A strong culture with clear principles is needed to get and keep everyone in sync with each other and in pursuit of the organization’s goals.

Mutual adjustment takes place when people interact with culture.  Because people are different and the potential to change their wiring is low,** it is imperative to select new employees who will embrace the existing culture.  If they can’t or won’t, or lack ability, they have to go.  Even with its stringent hiring practices, about a third of Bridgewater’s new hires are gone within eighteen months.

Human relations are built on meaningful relationships, radical truth and tough love.  Meaningful relationships means that people give more consideration to others than to themselves and exhibit genuine caring for each other.  Radical truth means you are “transparent with your thoughts and open-mindedly accepting the feedback of others.” (p. 268)  Tough love recognizes that criticism is essential for improvement towards excellence; everyone in the organization is free to criticize any other member, no matter their position in the hierarchy.  People have an obligation to speak up if they disagree.

“Great cultures bring problems and disagreements to the surface and solve them well . . .” (p. 299)  The culture should support a five-step management approach: Have clear goals, don’t tolerate problems, diagnose problems when they occur, design plans to correct the problems, and do what’s necessary to implement the plans, even if the decisions are unpopular.  The culture strives for excellence so it’s intolerant of folks who aren’t excellent and goal achievement is more important than pleasing others in the organization.

More on Management 

Dalio’s vision for Bridgewater is “an idea meritocracy in which meaningful work and meaningful relationships are the goals and radical truth and radical transparency are the ways of achieving them . . .” (p. 539)  An idea meritocracy is “a system that brings together smart, independent thinkers and has them productively disagree to come up with the best possible thinking and resolve their disagreements in a believability-weighted way . . .” (p. 308)  Radical truth means “not filtering one’s thoughts and one’s questions, especially the critical ones.” (ibid.)  Radical transparency means “giving most everyone the ability to see most everything.” (ibid.)

A person is a machine operating within a machine.  One must be one’s own machine designer and manager.  In managing people and oneself, take advantage of strengths and compensate for weaknesses via guardrails and by soliciting help from others.  An example of a guardrail is assigning a team member whose strengths balance another member’s weaknesses.  People must learn from their own bad decisions, so self-reflection after making a mistake is essential.  Managers must ascertain whether mistakes are evidence of a weakness, whether compensatory action is required or, if the weakness is intolerable, whether termination is warranted.  Because values, abilities and skills are the drivers of behavior, management should have a full profile for each employee.

Governance is the system of checks and balances in an organization.  No one is above the system, including the founder-owner.  In other words, senior managers like Dalio can be subject to the same criticism as any other employee.

Leadership in the traditional sense (“I say, you do”) is not so important in an idea meritocracy because the optimal decisions arise from a group process.  Managers are seen as decision makers, system designers and shapers who can visualize a better future and then build it.   Leaders “must be willing to recruit individuals who are willing to do the work that success requires.” (p. 520)

Our Perspective

We recognize international investment management is way different from nuclear power management, so some of Dalio’s principles can only be applied to the nuclear industry in a limited way, if at all.  One obvious example of a lack of fit is the area of risk management.  The investing environment is extremely competitive, with players evolving rapidly and searching for any edge.  Timely bets (investments) must be made under conditions where the risk of failure is many orders of magnitude greater than what is acceptable in the nuclear industry.  Other examples include the relentless, somewhat ruthless, pursuit of goals and a willingness to jettison people, both of which are foreign to the utility world.

But we shouldn’t throw the baby out with the bathwater.  While Dalio’s approach may be too extreme for wholesale application in your environment, it does provide a comparison (note we don’t say “standard”) for your organization’s performance.  Does your decision making process measure up to Dalio’s in terms of robustness, transparency and the pursuit of truth?  Does your culture really strive for excellence (and eliminate those who don’t share that vision) or is it an effort constrained by hierarchical, policy or political realities?

This is a long book but it’s easy to read and key points are repeated often.  Not all of it is novel; many of the principles are based on observations or techniques that have been around for a while and should be familiar to you.  For example, ideas about how human minds work are drawn, in part, from Daniel Kahneman; an integrated hierarchy of goals looks like Management by Objectives; and a culture that doesn’t automatically punish people for making mistakes or tolerable errors sounds like a “just culture,” albeit with some mandatory individual learning attached.

Bottom line: Give this book a quick look.  It can’t hurt and might help you get a clearer picture of how your own organization actually operates.

*  R. Dalio, Principles (New York: Simon & Schuster, 2017).  This book was recommended to us by a Safetymatters reader.  Please contact us if you have any material you would like us to review.

**  A person’s basic values and abilities are relatively fixed, although skills may be improved through training.

Friday, December 1, 2017

Nuclear Safety Culture: Focus on Decision Making

We have long held that decision making (DM) is a key artifact reflecting the state of a nuclear organization’s safety culture.

The McKinsey Quarterly (MQ) has packaged a trio of articles* on DM.  Their first purpose is identifying and countering the different biases that lead to sub-optimal, even disastrous decisions.  (When specific biases are widely spread in an organization, they are part of its culture.)  A second purpose is to describe the attributes of more fair, robust and effective DM processes.  The articles’ specific topics are (1) the behavioral science that underlies DM, (2) a method for categorizing and processing decisions and (3) a case study of a major utility that changed its decision culture. 

“The case for behavioral strategy” (MQ, March 2010)

This article covers the insights from psychology that can be used to fashion a robust DM process.  The authors make the case for process improvement by reporting survey research results showing that over 50 percent of the variability in decision results (i.e., performance) was determined by the quality of the DM process, while less than 10 percent was explained by the quality of the underlying analysis.

There are plenty of cognitive biases that can affect human DM.  The authors discuss several of them and strategies for counteracting them, as summarized in the table below.

Type of bias: False pattern recognition (e.g., saliency (overweighting recent or memorable events), confirmation, inaccurate analogies)
How to counteract: Require alternative explanations for the data in the analysis, articulate participants’ relevant experiences (which can reveal the basis for their biases), identify similar situations for comparative analysis.

Type of bias: Bias for action
How to counteract: Explicitly consider uncertainty in the input data and the possible outcomes.

Type of bias: Stability (anchoring to an initial value, loss aversion)
How to counteract: Establish stretch targets that can’t be achieved by business as usual.

Type of bias: Silo thinking
How to counteract: Involve a diverse group in the DM process and define specific decision criteria before discussions begin.

Type of bias: Social (conformance to group views)
How to counteract: Create genuine debate through a diverse set of decision makers, a climate of trust and depersonalized discussions.

The greatest problem arises from biases that create repeatable patterns, which become undesirable cultural traits.  DM process designers must identify the types of biases that arise in their organization’s DM, specify debiasing techniques that will work in their organization, and embed those techniques in formal DM procedures.

An attachment to the article identifies and defines 17 specific biases.  Much of the seminal research on DM biases was performed by Daniel Kahneman, who received a Nobel prize for his efforts.  We have reviewed Prof. Kahneman’s work on Safetymatters; see our Nov. 4, 2011 and Dec. 18, 2013 posts or click on the Kahneman label.

“Untangling your organization’s decision making” (MQ, June 2017)

While this article is aimed at complex, global organizations, there are lessons here for nuclear organizations (typically large bureaucracies) because all organizations have become victims of over-abundant communications, with too many meetings and low-value e-mail threads distracting members from making good decisions.

The authors posit four types of decisions an organization faces, plotted on a 2x2 matrix (the consultant’s best friend) with scope and impact (broad or narrow) on one axis and level of familiarity (infrequent or frequent) on the other.  A different DM approach is proposed for each quadrant. 
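The four quadrants can be sketched as a simple classifier; the function below is our own illustration of the article’s matrix, using the article’s category names.

```python
# Hypothetical sketch of the McKinsey 2x2 decision matrix: scope/impact
# (broad or narrow) on one axis, familiarity (infrequent or frequent) on
# the other. Category names come from the article; the function is ours.

def decision_type(broad_impact: bool, frequent: bool) -> str:
    if broad_impact and not frequent:
        return "big-bet"
    if broad_impact and frequent:
        return "cross-cutting"
    if not broad_impact and frequent:
        return "delegated"
    return "ad hoc"

# Example: a rare, company-wide decision lands in the big-bet quadrant.
print(decision_type(broad_impact=True, frequent=False))  # -> big-bet
```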

Big-bet decisions are infrequent and have broad impact.  Recommendations include (1) ensure there’s an executive sponsor, (2) break down the mega-decision into manageable parts for analysis (and reassemble them later), (3) use a standard DM approach for all the parts and (4) establish a mechanism to track effectiveness during decision implementation.

The authors observe that some decisions turn out to be “bet the company” ones without being recognized as such.  There are examples of this in the nuclear industry.  For details, see our June 18, 2013 post on Kewaunee (had only an 8-year PPA), Crystal River (tried to cut through the containment using in-house expertise) and SONGS (installed replacement steam generators with an unproven design).

Cross-cutting decisions are more frequent and have broad impact.  Some decisions at a nuclear power plant fall into this category.  They need the concurrence and support of the Big 3 stakeholders (Operations, Engineering and Maintenance).  Silo attitudes are an omnipresent threat to success in making these kinds of decisions.  The key is to get the stakeholders to agree on the main process steps and capture them in a plain-English procedure that specifies the calendar, handoffs and decisions.  Governing policy should establish the DM bodies and their authority, and define shared performance metrics to measure success.

Delegated decisions are frequent and low-risk.  They can be effectively handled by an individual or working team, with limited input from others.  The authors note “The role-modeling of senior leaders is invaluable, but they may be reluctant” to delegate.  We agree.  In our experience, many nuclear managers were hesitant to delegate as many decisions as they could have to subordinates.  Their fear of being held accountable for a screw-up was just too great.  However, their goal should have been to delegate all decisions except those for which they alone had the capabilities and accountability.  Subordinates need appropriate training and explicit authority to make their decisions and they need to be held accountable by higher-level managers.  The organization needs to establish a clear policy defining when and how a decision should be elevated to a more senior decision maker. 

Ad hoc decisions are infrequent and low-risk; they were deliberately omitted from the article. 

“A case study in combating bias” (MQ, May 2017)

This is an interview with a senior executive of a German utility that invested €10 billion in conventional power projects, investments that failed when the political-economic environment evolved in a direction opposite to their assumptions.  In their postmortem, they realized they had succumbed to several cognitive biases, including status quo, confirmation, champion and sunflower.  The sunflower bias (groups aligning with their leaders) stretched far down the organizational hierarchy so lower-level analysts didn’t dare to suggest contrary assumptions or outcomes.

The article describes how the utility changed its DM practices to promote awareness of biases and implement debiasing techniques; e.g., one key element is officially designated “devil’s advocates” in DM groups.  Importantly, training emphasizes that biases are not a personal defect but are simply “just there,” i.e., part of the culture.  The interviewee noted that the revised process is very time-intensive, so it is used only for the most important decisions facing each user group.

Our Perspective 

The McKinsey content describes executive level, strategic DM but many of the takeaways are equally applicable to decisions made at the individual, department and inter-department level, where a consistent approach is perhaps even more important in maintaining or improving organizational performance.

The McKinsey articles come in one of their Five Fifty packages, with a summary you can review in five minutes and the complete articles that may take fifty minutes total.  You should invest at least the smaller amount.

*  “Better Decisions,” McKinsey Quarterly Five Fifty.  Retrieved Nov. 28, 2017.