Tuesday, February 23, 2016

The Dark Side of Culture Management: Functional Stupidity

Culture is the collection of values and assumptions that underlie organizational decisions and other actions.  We have long encouraged organizations to develop strong safety cultures (SC).  The methods available to do this are widely known, including visible leadership and role models; safety-related policies, practices and procedures; supportive structures like an Employee Concerns Program; the reward and recognition system; training and oversight; and regulatory carrots and sticks.

Because safety performance alone does not pay the bills, organizations also need to achieve their intended economic goals (i.e., be effective) and operate efficiently.  Most of the methods that can be used to promote SC can also be used to promote the overall performance culture.

What happens when the organization goes too far in shaping its culture to optimize performance?  One possibility, according to a 2012 Journal of Management Studies article*, is a culture of Functional Stupidity.  The Functional part means the organization meets its goals and operates efficiently, and Stupidity “is an organizationally supported inability or unwillingness to mobilize one’s cognitive capacities.” (p. 1199)**

More specifically, to the extent management, through its power and/or leadership, willfully shapes an organization’s value structure to achieve greater functionality (conformity, focus, efficiency, etc.), it may be, consciously or unconsciously, creating an environment where employees ask fewer questions (and no hard ones), seek fewer justifications for the organization’s decisions or actions, focus their intelligence in the organization’s defined areas, do not reflect on their roles in the organization’s undertakings, and essentially go along with the program.  Strong leaders set the agenda and the true followers, well, they follow.

In the name of increased functionality, such actions can create a Weltanschauung that is narrowly focused and self-justifying.  It may result in soft biases, e.g., production over safety, or ignoring problematic aspects of a situation, e.g., the Davis-Besse test and inspection reports.

Fortunately, as the authors explain, a self-correcting dynamic may occur.  Initially, improved functionality contributes to a sense of certainty about the organization’s and individuals’ places in the world, thus creating positive feedback.  But eventually the organization’s view of the world may increasingly clash with reality, creating dissonance (a loss of certainty) for the organization and the individuals who inhabit it.  As the gap between perception and reality grows, the overall system becomes less stable.  When people realize that the organization’s description of itself and reality are far apart, the organization’s, i.e., management’s, legitimacy collapses.

However, in the worst case “increasingly yawning gaps between shared assumptions and reality may eventually produce accidents or disasters.” (p. 1213)  Fukushima anyone?

Our Perspective

Management is always under the gun to “do better” when things are going well or “do something” when problems occur.  In the latter case, one popular initiative is to “improve” the culture, especially if a regulator is involved.  Although management’s intentions may be beneficent, there is an opportunity for invidious elements to be introduced and/or unintended consequences to occur.

Environmental factors can encourage stupidity.  For example, quarterly financial reporting, an ever-shortening media cycle and the global reach of the Internet (especially its most intellectually challenged component, the Twitterverse) pressure executives to project command of their circumstances and certainty about their comprehension, even if they lack adequate (or any) relevant data.

The nuclear industry is not immune to functional stupidity.  Not to put too fine a point on it, but the industry's penchant for secrecy creates an ideal Petri dish for the cultivation of stupidity management.

The authors close by saying “we hope to prompt wider debate about why it is that smart organizations can be so stupid at times.” (p. 1216)  For a long time we have wondered about that ourselves.


*  M. Alvesson and A. Spicer, “A Stupidity-Based Theory of Organizations,” Journal of Management Studies 49:7 (Nov. 2012), pp. 1194-1220.  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Following are additional definitions, with italics added, “functional stupidity is inability and/or unwillingness to use cognitive and reflective capacities in anything other than narrow and circumspect ways.” (p. 1201)  “stupidity management . . . involves the management of consciousness, clues about how to understand and relate to the world, . . .” (p. 1204)  “stupidity self-management  [consists of] the individual putting aside doubts, critique, and other reflexive concerns and focusing on the more positive aspects of organizational life which are more clearly aligned with understandings and interpretations that are officially sanctioned and actively promoted.” (p. 1207)

Tuesday, February 16, 2016

DOE Inspector General Piles On: Bechtel CAP and DOE Oversight Deficient at the Vit Plant

The Department of Energy (DOE) Inspector General (IG) recently released an audit report* on deficiencies in Bechtel’s Corrective Action Program (CAP) at the Hanford Waste Treatment Plant (WTP aka the Vit Plant) where Bechtel is the prime contractor.  The report also described deficiencies in the DOE Office of River Protection’s (ORP) oversight of Bechtel.

With one exception, this IG report is not about safety culture (SC) per se, but it does discuss two key artifacts that reflect the strength of a SC: the effectiveness of the CAP and the size of backlogs.**

The audit found that the Bechtel CAP “was not fully effective in managing and resolving issues.”  Specifically, some required issues were not managed and tracked in the CAP, corrective actions were not implemented in a timely manner (Bechtel did not make any of its timeliness goals) and Bechtel failed to follow through on implementing or sustaining prior CAP improvement initiatives. (pp. 1-2, 5)

The findings were not news to the ORP.  In fact, they are consistent with ORP’s 2013 audit of Bechtel’s Quality Assurance program.  At that time ORP directed Bechtel to make CAP improvements but as of the current IG audit, such improvements had not been fully implemented. (p. 2) 

CAP backlogs are also a problem.  Backlogs of condition reports increased from 2013 to 2014, as did the age of corrective actions. (pp. 4-5)

The audit report does have one direct tie to SC, noting that Bechtel identified weaknesses in its SC in 2014, including concerns about management not valuing a rigorous CAP. (p. 6)

And the auditors didn’t let ORP off the hook, stating DOE “did not ensure that all technical issues and issues identified through self-assessments were entered into the [CAP].  Finally, [DOE] did not ensure that previous Bechtel initiatives to address [DOE] implementation problems were fully implemented or sustained.” (p. 6)

The report closed with three straightforward “fix-it” recommendations with which ORP management concurred.  In its concurrence letter, ORP reviewed the actions taken to date and concluded that “Bechtel has strengthened the WTP Project’s nuclear safety and quality culture.” (p. 11)

Our Perspective

The report does not inspire confidence that Bechtel can upgrade its CAP (while trying to move ahead with Vit Plant design and construction) or that ORP will ride herd on them to ensure it happens.  In fact, the report is consistent with a bevy of earlier assessments and evaluations, many of which we have reviewed on Safetymatters.  (Click on the Vit Plant label for more details.)  ORP’s assertion that Bechtel has strengthened its culture is possibly true, but Bechtel began from an unacceptably low starting point.

Early in my career I was hired as a Quality Control manager for a telecom manufacturer.  The company had major problems with its flagship product and I was soon named to a task force to investigate them.  On my way to our initial meeting, I met up with a more senior employee and told him how I looked forward to our task force identifying and fixing the product’s problems.  He turned to me and said “The first three didn’t.”  Welcome to the world.


*  U.S. DOE Inspector General, “Audit Report - Corrective Action Program at the Waste Treatment and Immobilization Plant,” OAI-M-16-06 (Feb. 2016).

**  As we have discussed elsewhere, two other key artifacts are decision-making and compensation.  From the WTP history we have reviewed for Safetymatters, it appears Bechtel (and by extension, DOE) decision-making does not effectively address either the tough technical challenges or programmatic issues at the WTP.  The Bechtel contract now includes some modest incentive compensation for SC performance.  We discussed that program on Dec. 29, 2014.

Wednesday, February 10, 2016

NEA’s Safety Culture Guidance for Nuclear Regulators

A recent Nuclear Energy Agency (NEA) publication* describes desirable safety culture (SC) characteristics for a nuclear regulator.  Its purpose is to provide a benchmark for both established and nascent regulatory bodies.

The document’s goal is to describe a “healthy” SC.  It starts with the SC definition in INSAG-4** then posits five principles for an effective nuclear regulator: Safety leadership is demonstrated at all levels; regulatory staff set the standard for safety; and the regulatory body facilitates co-operation and open communication, implements a holistic approach to safety, and encourages continuous improvement, learning and self-assessment.

The principle that caught our attention is the holistic (or systemic) approach to safety.  This approach is discussed multiple times in the document.  In the Introduction, the authors say the regulator “should actively scrutinise how its own safety culture impacts the licensees’ safety culture.  It should also reflect on its role within the wider system and on how its own culture is the result of its interactions with the licensees and all other stakeholders.” (p. 12)

A subsequent chapter contains a more expansive discussion of each principle and identifies relevant attributes.  The following excerpts illustrate the value of a holistic approach.  “A healthy safety culture is dependent on the regulatory body using a robust, holistic, multi-disciplinary approach to safety.  Regulators oversee and regulate complex socio-technical systems that, together with the regulatory body itself, form part of a larger system made up of many stakeholders, with competing as well as common interests.  All the participants in this system influence and react to each other, and there is a need for awareness and understanding of this mutual influence.” (p. 19)

“[T]he larger socio-technical system [is] influenced by technical, human and organisational, environmental, economic, political and societal factors [including national culture].  Regulators should strive to do more than simply establish standards; they should consider the performance of the entire system that ensures safety.” (p. 20)

And “Safety issues are complex and involve a number of inter-related factors, activities and groups, whose importance and effect on each other and on safety might not be immediately recognisable.” (ibid.)

The Conclusions include the following: “Regulatory decisions need to consider the performance and response of the entire system delivering safety, how the different parts of the system are coupled and the direction the system is taking.” (p. 28)

Our Perspective

Much of the material in this publication will be familiar to Safetymatters readers*** but the discussion of a holistic approach to regulation is more extensive than we’ve seen elsewhere.  For that reason alone, we think this document is worth your quick review.  We have been promoting a systems view of the nuclear industry, from individual power plants to the overall socio-technical-legal-political construct, for years.

The committee that developed the guidance consisted of almost thirty members from over a dozen countries, the International Atomic Energy Agency and the NEA itself.  It’s interesting that China was not represented on the committee although it has the world's largest nuclear power plant construction program**** and, one would hope, substantial interest in effective safety regulation and safety culture.  (Ooops!  China is not a member of the NEA.  Does that say something about China's perception of the NEA's value proposition?)


*  Nuclear Energy Agency, “The Safety Culture of an Effective Nuclear Regulatory Body” (2016).  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.  The NEA is an arm of the Organisation for Economic Co-operation and Development (OECD).

**  International Nuclear Safety Advisory Group, “Safety Culture,” Safety Series No. 75-INSAG-4, (Vienna: IAEA, 1991), p. 4.

***  For example, the list of challenges a regulator faces includes the usual suspects: maintain the focus on safety, avoid complacency, resist external pressures, avoid regulatory capture and maintain technical competence. (pp. 23-25)

****  “China has world's largest nuclear power capacity under construction,” China Daily (Dec. 30, 2015).

Tuesday, February 2, 2016

Ethics, Individual Misconduct and Organizational Culture

Ethics* are rules for conduct or behavior.  They are basically a social (as opposed to an individual or psychological) construct and part of group or organizational culture.  They can be very specific do’s and don’ts or more general guidelines for behavior.

The Ethics Resource Center (ERC) conducts and publishes regular employee surveys on the degree of non-compliance with workplace ethics in the United States.  The surveys focus on instances of misconduct observed by workers—what occurred, who did it, who reported it (if anyone) and what happened to the reporter. 

The survey uses a random sample of employees in the for-profit sector.  The 2013 survey** ended up with over 6,000 usable responses.  There is no indication how many respondents, if any, work in the commercial nuclear industry.

The overall findings are interesting.  On the positive side, “Observed misconduct*** is down for the third report in a row and is now at a historic low; the decline in misconduct is widespread; and the percentage of workers who said they felt pressure to compromise standards also fell substantially.” (p. 12)

But problems persist.  “Workers reported that 60 percent of misconduct involved someone with managerial authority from the supervisory level up to top management.  Nearly a quarter (24 percent) of observed misdeeds involved senior managers.  Perhaps equally troubling, workers said that 26 percent of misconduct is ongoing within their organization.  About 12 percent of wrongdoing was reported to take place company-wide.” (ibid.)

The reporting of misconduct problems has both good and bad news.  Lots of workers (63%) who observed misconduct reported it but 21% of those who reported misconduct said they experienced retaliation**** in return. (p. 13)

The report goes on to examine the details behind the summary results and attempts to assign some possible causes to explain observed trends.  For example, the authors believe it’s probable that positive trends are related to companies’ ethics and compliance programs that create new norms for worker conduct, i.e., a stronger culture. (p. 16)  And a stronger culture is desirable.  Returning to the survey, “In 2013, one in five workers (20 percent) reported seeing misconduct in companies where cultures are “strong” compared to 88 percent who witnessed wrongdoing in companies with the weakest cultures.” (p. 18)

The keys to building a stronger ethical culture are familiar to Safetymatters readers: top-level role models, and support by immediate supervisors and peers to do the right thing.  In terms of cultural artifacts, a stronger ethical culture is visible in an organization’s processes for training, personnel evaluation and application of employee discipline. 

The report goes on to analyze misconduct in depth—who is doing it, what are they doing and how long it has been going on.  The authors cover how and why employees report misconduct and suggest ways to increase the reporting rate.  They note that increased legal protection for whistleblowers has increased the likelihood that covered workers will report misconduct.

Our Perspective

This report is worth a read.  Quite frankly, more workers are willing to report misconduct than I would have predicted.  The percentage of reporters who perceive retaliation is disappointing but hardly surprising.

The survey results are more interesting than the explanatory analysis; a reader should keep in mind that this research was conducted by a group that has a vested interest in finding the “correct” answers.

Because specific firms and industries are not identified, it’s easy to blow off the results with a flip “Didn’t happen here and can’t happen here because we have a robust SCWE and ECP.”  I suggest such parochial reviewers keep in mind that “Pride goes before destruction, and a haughty spirit before a fall.”*****


*  Ethics and morals are often used interchangeably but it’s helpful to consider morals as an individual construct, a person’s inner principles of right and wrong.  See diffen.com for a more detailed comparison.

**  Ethics Resource Center, “2013 National Business Ethics Survey of the U.S. Workforce” (Arlington, VA: 2014).  Corporate sponsors include firms familiar to nuclear industry participants, e.g., Bechtel and Edison International.

***  The survey identified 28 specific types of misconduct.  Some of interest to the nuclear industry, listed in the order of frequency of occurrence in the survey responses, include abusive behavior or behavior that creates a hostile work environment, lying to employees, discriminating against employees, violations of health or safety regulations, lying to the public, retaliation against someone who has reported misconduct, abusing substances at work, sexual harassment, violation of environmental regulations and falsifying books and/or records. (pp. 41-42)

****  The survey also identified 13 specific types of retaliation experienced by whistleblowers including being ignored or treated differently by supervisors or other employees, being excluded from decisions, verbal abuse, not receiving promotions or raises, reduced hours or pay, relocation or reassignment, harassment at home or online and physical harm to one’s person or property. (p. 45)

*****  Proverbs 16:18, Bible (English Standard Version).