Tuesday, February 23, 2016

The Dark Side of Culture Management: Functional Stupidity

Culture is the collection of values and assumptions that underlie organizational decisions and other actions.  We have long encouraged organizations to develop strong safety cultures (SC).  The methods available to do this are widely known, including visible leadership and role models; safety-related policies, practices and procedures; supportive structures like an Employee Concerns Program; the reward and recognition system; training and oversight; and regulatory carrots and sticks.

Because safety performance alone does not pay the bills, organizations also need to achieve their intended economic goals (i.e., be effective) and operate efficiently.  Most of the methods that can be used to promote SC can also be used to promote the overall performance culture.

What happens when the organization goes too far in shaping its culture to optimize performance?  One possibility, according to a 2012 Journal of Management Studies article*, is a culture of Functional Stupidity.  The Functional part means the organization meets its goals and operates efficiently, and Stupidity “is an organizationally supported inability or unwillingness to mobilize one’s cognitive capacities.” (p. 1199)**

More specifically, to the extent that management, through its power and/or leadership, willfully shapes an organization’s value structure to achieve greater functionality (conformity, focus, efficiency, etc.), it may be, consciously or unconsciously, creating an environment where employees ask fewer questions (and no hard ones), seek fewer justifications for the organization’s decisions or actions, focus their intelligence in the organization’s defined areas, do not reflect on their roles in the organization’s undertakings, and essentially go along with the program.  Strong leaders set the agenda and the true followers, well, they follow.

In the name of increased functionality, such actions can create a Weltanschauung that is narrowly focused and self-justifying.  It may result in soft biases, e.g., production over safety, or ignoring problematic aspects of a situation, e.g., the Davis-Besse test and inspection reports.

Fortunately, as the authors explain, a self-correcting dynamic may occur.  Initially, improved functionality contributes to a sense of certainty about the organization’s and individuals’ places in the world, thus creating positive feedback.  But eventually the organization’s view of the world may increasingly clash with reality, creating dissonance (a loss of certainty) for the organization and the individuals who inhabit it.  As the gap between perception and reality grows, the overall system becomes less stable.  When people realize that description and reality are far apart, the organization’s, i.e., management’s, legitimacy collapses.

However, in the worst case “increasingly yawning gaps between shared assumptions and reality may eventually produce accidents or disasters.” (p. 1213)  Fukushima anyone?

Our Perspective

Management is always under the gun to “do better” when things are going well or “do something” when problems occur.  In the latter case, one popular initiative is to “improve” the culture, especially if a regulator is involved.  Although management’s intentions may be beneficent, there is an opportunity for invidious elements to be introduced and/or unintended consequences to occur.

Environmental factors can encourage stupidity.  For example, quarterly financial reporting, an ever-shortening media cycle and the global reach of the Internet (especially its most intellectually challenged component, the Twitterverse) pressure executives to project command of their circumstances and certainty about their comprehension, even if they lack adequate (or any) relevant data.

The nuclear industry is not immune to functional stupidity.  Not to put too fine a point on it, but the industry's penchant for secrecy creates an ideal Petri dish for the cultivation of stupidity management.

The authors close by saying “we hope to prompt wider debate about why it is that smart organizations can be so stupid at times.” (p. 1216)  For a long time we have wondered about that ourselves.


*  M. Alvesson and A. Spicer, “A Stupidity-Based Theory of Organizations,” Journal of Management Studies 49:7 (Nov. 2012), pp. 1194-1220.  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Following are additional definitions, with italics added, “functional stupidity is inability and/or unwillingness to use cognitive and reflective capacities in anything other than narrow and circumspect ways.” (p. 1201)  “stupidity management . . . involves the management of consciousness, clues about how to understand and relate to the world, . . .” (p. 1204)  “stupidity self-management  [consists of] the individual putting aside doubts, critique, and other reflexive concerns and focusing on the more positive aspects of organizational life which are more clearly aligned with understandings and interpretations that are officially sanctioned and actively promoted.” (p. 1207)

Tuesday, February 16, 2016

DOE Inspector General Piles On: Bechtel CAP and DOE Oversight Deficient at the Vit Plant

The Department of Energy (DOE) Inspector General (IG) recently released an audit report* on deficiencies in Bechtel’s Corrective Action Program (CAP) at the Hanford Waste Treatment Plant (WTP aka the Vit Plant) where Bechtel is the prime contractor.  The report also described deficiencies in the DOE Office of River Protection’s (ORP) oversight of Bechtel.

With one exception, this IG report is not about safety culture (SC) per se, but it does discuss two key artifacts that reflect the strength of a SC: the effectiveness of the CAP and the size of backlogs.**

The audit found that the Bechtel CAP “was not fully effective in managing and resolving issues.”  Specifically, some required issues were not managed and tracked in the CAP, corrective actions were not implemented in a timely manner (Bechtel did not meet any of its timeliness goals) and Bechtel failed to follow through on implementing or sustaining prior CAP improvement initiatives. (pp. 1-2, 5)

The findings were not news to the ORP.  In fact, they are consistent with ORP’s 2013 audit of Bechtel’s Quality Assurance program.  At that time ORP directed Bechtel to make CAP improvements, but as of the current IG audit, such improvements had not been fully implemented. (p. 2)

CAP backlogs are also a problem.  Backlogs of condition reports increased from 2013 to 2014, as did the age of corrective actions. (pp. 4-5)

The audit report does have one direct tie to SC, noting that Bechtel identified weaknesses in its SC in 2014, including concerns about management not valuing a rigorous CAP. (p. 6)

And the auditors didn’t let ORP off the hook, stating DOE “did not ensure that all technical issues and issues identified through self-assessments were entered into the [CAP].  Finally, [DOE] did not ensure that previous Bechtel initiatives to address [DOE] implementation problems were fully implemented or sustained.” (p. 6)

The report closed with three straightforward “fix-it” recommendations with which ORP management concurred.  In its concurrence letter, ORP reviewed the actions taken to date and concluded “Bechtel has strengthened the WTP Project’s nuclear safety and quality culture.” (p. 11)

Our Perspective

The report does not inspire confidence that Bechtel can upgrade its CAP (while trying to move ahead with Vit Plant design and construction) or that ORP will ride herd on Bechtel to ensure it happens.  In fact, the report is consistent with a bevy of earlier assessments and evaluations, many of which we have reviewed on Safetymatters.  (Click on the Vit Plant label for more details.)  ORP’s assertion that Bechtel has strengthened its culture is possibly true, but Bechtel began from an unacceptably low starting point.

Early in my career I was hired as a Quality Control manager for a telecom manufacturer.  The company had major problems with its flagship product and I was soon named to a task force to investigate them.  On my way to our initial meeting, I met up with a more senior employee and told him how I looked forward to our task force identifying and fixing the product’s problems.  He turned to me and said “The first three didn’t.”  Welcome to the world.


*  U.S. DOE Inspector General, “Audit Report - Corrective Action Program at the Waste Treatment and Immobilization Plant,” OAI-M-16-06 (Feb. 2016).

**  As we have discussed elsewhere, two other key artifacts are decision-making and compensation.  From the WTP history we have reviewed for Safetymatters, it appears Bechtel (and by extension, DOE) decision-making does not effectively address either the tough technical challenges or programmatic issues at the WTP.  The Bechtel contract now includes some modest incentive compensation for SC performance.  We discussed that program on Dec. 29, 2014.

Wednesday, February 10, 2016

NEA’s Safety Culture Guidance for Nuclear Regulators

A recent Nuclear Energy Agency (NEA) publication* describes desirable safety culture (SC) characteristics for a nuclear regulator.  Its purpose is to provide a benchmark for both established and nascent regulatory bodies.

The document’s goal is to describe a “healthy” SC.  It starts with the SC definition in INSAG-4** and then posits five principles for an effective nuclear regulator: safety leadership is demonstrated at all levels; regulatory staff set the standard for safety; the regulatory body facilitates co-operation and open communication; it implements a holistic approach to safety; and it encourages continuous improvement, learning and self-assessment.

The principle that caught our attention is the holistic (or systemic) approach to safety.  This approach is discussed multiple times in the document.  In the Introduction, the authors say the regulator “should actively scrutinise how its own safety culture impacts the licensees’ safety culture.  It should also reflect on its role within the wider system and on how its own culture is the result of its interactions with the licensees and all other stakeholders.” (p. 12)

A subsequent chapter contains a more expansive discussion of each principle and identifies relevant attributes.  The following excerpts illustrate the value of a holistic approach.  “A healthy safety culture is dependent on the regulatory body using a robust, holistic, multi-disciplinary approach to safety.  Regulators oversee and regulate complex socio-technical systems that, together with the regulatory body itself, form part of a larger system made up of many stakeholders, with competing as well as common interests.  All the participants in this system influence and react to each other, and there is a need for awareness and understanding of this mutual influence.” (p. 19)

“[T]he larger socio-technical system [is] influenced by technical, human and organisational, environmental, economic, political and societal factors [including national culture].  Regulators should strive to do more than simply establish standards; they should consider the performance of the entire system that ensures safety.” (p. 20)

And “Safety issues are complex and involve a number of inter-related factors, activities and groups, whose importance and effect on each other and on safety might not be immediately recognisable.” (ibid.)

The Conclusions include the following: “Regulatory decisions need to consider the performance and response of the entire system delivering safety, how the different parts of the system are coupled and the direction the system is taking.” (p. 28)

Our Perspective

Much of the material in this publication will be familiar to Safetymatters readers*** but the discussion of a holistic approach to regulation is more extensive than we’ve seen elsewhere.  For that reason alone, we think this document is worth your quick review.  We have been promoting a systems view of the nuclear industry, from individual power plants to the overall socio-technical-legal-political construct, for years.

The committee that developed the guidance consisted of almost thirty members from over a dozen countries, the International Atomic Energy Agency and NEA itself.  It’s interesting that China was not represented on the committee although it has the world's largest nuclear power plant construction program**** and, one would hope, substantial interest in effective safety regulation and safety culture.  (Ooops!  China is not a member of the NEA.  Does that say something about China's perception of the NEA's value proposition?)


*  Nuclear Energy Agency, “The Safety Culture of an Effective Nuclear Regulatory Body” (2016).  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.  The NEA is an arm of the Organisation for Economic Co-operation and Development (OECD).

**  International Nuclear Safety Advisory Group, “Safety Culture,” Safety Series No. 75-INSAG-4, (Vienna: IAEA, 1991), p. 4.

***  For example, the list of challenges a regulator faces includes the usual suspects: maintain the focus on safety, avoid complacency, resist external pressures, avoid regulatory capture and maintain technical competence. (pp. 23-25)

****  “China has world's largest nuclear power capacity under construction,” China Daily (Dec. 30, 2015).

Tuesday, February 2, 2016

Ethics, Individual Misconduct and Organizational Culture

Ethics* are rules for conduct or behavior.  They are basically a social (as opposed to an individual or psychological) construct and part of group or organizational culture.  They can be very specific do’s and don’ts or more general guidelines for behavior.

The Ethics Resource Center (ERC) conducts and publishes regular employee surveys on the degree of non-compliance with workplace ethics in the United States.  The surveys focus on instances of misconduct observed by workers—what occurred, who did it, who reported it (if anyone) and what happened to the reporter. 

The survey uses a random sample of employees in the for-profit sector.  The 2013 survey** ended up with over 6,000 useable responses.  There is no indication how many respondents, if any, work in the commercial nuclear industry.

The overall findings are interesting.  On the positive side, “Observed misconduct*** is down for the third report in a row and is now at a historic low; the decline in misconduct is widespread; and the percentage of workers who said they felt pressure to compromise standards also fell substantially.” (p. 12)

But problems persist.  “Workers reported that 60 percent of misconduct involved someone with managerial authority from the supervisory level up to top management.  Nearly a quarter (24 percent) of observed misdeeds involved senior managers.  Perhaps equally troubling, workers said that 26 percent of misconduct is ongoing within their organization.  About 12 percent of wrongdoing was reported to take place company-wide.” (ibid.)

The reporting of misconduct problems has both good and bad news.  Lots of workers (63%) who observed misconduct reported it but 21% of those who reported misconduct said they experienced retaliation**** in return. (p. 13)

The report goes on to examine the details behind the summary results and attempts to assign some possible causes to explain observed trends.  For example, the authors believe it’s probable that positive trends are related to companies’ ethics and compliance programs that create new norms for worker conduct, i.e., a stronger culture. (p. 16)  And a stronger culture is desirable.  Returning to the survey, “In 2013, one in five workers (20 percent) reported seeing misconduct in companies where cultures are “strong” compared to 88 percent who witnessed wrongdoing in companies with the weakest cultures.” (p. 18)

The keys to building a stronger ethical culture are familiar to Safetymatters readers: top-level role models, and support by immediate supervisors and peers to do the right thing.  In terms of cultural artifacts, a stronger ethical culture is visible in an organization’s processes for training, personnel evaluation and application of employee discipline. 

The report goes on to analyze misconduct in depth—who is doing it, what are they doing and how long it has been going on.  The authors cover how and why employees report misconduct and suggest ways to increase the reporting rate.  They note that increased legal protection for whistleblowers has increased the likelihood that covered workers will report misconduct.

Our Perspective

This report is worth a read.  Quite frankly, more workers are willing to report misconduct than I would have predicted.  The percentage of reporters who perceive retaliation is disappointing but hardly surprising.

The survey results are more interesting than the explanatory analysis; a reader should keep in mind that this research was conducted by a group that has a vested interest in finding the "correct" answers.

Because specific firms and industries are not identified, it’s easy to blow off the results with a flip “Didn’t happen here and can’t happen here because we have a robust SCWE and ECP.”  I suggest such parochial reviewers keep in mind that “Pride goes before destruction, and a haughty spirit before a fall.”*****


*  Ethics and morals are often used interchangeably but it’s helpful to consider morals as an individual construct, a person’s inner principles of right and wrong.  See diffen.com for a more detailed comparison.

**  Ethics Resource Center, “2013 National Business Ethics Survey of the U.S. Workforce” (Arlington, VA: 2014).  Corporate sponsors include firms familiar to nuclear industry participants, e.g., Bechtel and Edison International.

***  The survey identified 28 specific types of misconduct.  Some of interest to the nuclear industry, listed in the order of frequency of occurrence in the survey responses, include abusive behavior or behavior that creates a hostile work environment, lying to employees, discriminating against employees, violations of health or safety regulations, lying to the public, retaliation against someone who has reported misconduct, abusing substances at work, sexual harassment, violation of environmental regulations and falsifying books and/or records. (pp. 41-42)

****  The survey also identified 13 specific types of retaliation experienced by whistleblowers including being ignored or treated differently by supervisors or other employees, being excluded from decisions, verbal abuse, not receiving promotions or raises, reduced hours or pay, relocation or reassignment, harassment at home or online and physical harm to one’s person or property. (p. 45)

*****  Proverbs 16:18, Bible (English Standard Version).

Monday, January 25, 2016

IAEA Urges Stronger Nuclear Safety Culture in Japan

The International Atomic Energy Agency (IAEA) recently completed a peer review of Japan's Nuclear Regulation Authority (NRA), a regulatory agency established in the aftermath of the 2011 Fukushima disaster.  Highlights of the review were discussed at an IAEA press conference.*

The IAEA review team praised the NRA’s progress in various areas, such as demonstrating independence and transparency, and made suggestions and recommendations for further improvement, primarily in the area of NRA staff recruiting and development.

The IAEA team also mentioned safety culture (SC), recommending “the NRA and nuclear licensees ‘continue to strengthen the promotion of safety culture, including by fostering a questioning attitude’.”

Our Perspective

We look forward to the IAEA’s final report, which is due in about three months.  We are especially interested in seeing if there is comprehensive discussion and specific direction with respect to “fostering a questioning attitude.”  The Japanese nuclear industry in general and TEPCO (Fukushima’s owner) in particular certainly need to cultivate employees’ willingness to develop and consider open-ended questions such as “what if?” and “what can go wrong?”

More importantly, they also need to instill the necessary backbone to stand up in front of the bosses, ask tough questions and demand straight answers.  Lots of folks probably knew the Fukushima seawall wasn’t high enough and that the emergency equipment in the basement was subject to flooding, but everyone went along with the program.  That’s what has to change to create a stronger SC.


*  “IAEA praises reform of Japan's nuclear regulator,” World Nuclear News (Jan. 22, 2016).

Sunday, January 17, 2016

A Nuclear Safety Culture for Itinerant Workers

IAEA has published “Radiation Protection of Itinerant Workers,”* a report that describes management responsibilities and practices to protect and monitor itinerant workers who are exposed to ionizing radiation.  “Itinerant workers” are people who work at different locations “and are not employees of the management of the facility where they are working. The itinerant workers may be self-employed or employed by a contractor . . .” (p. 4)  In the real world, such employees have many different names including nuclear nomads, glow boys and jumpers.

The responsibility for itinerant workers’ safety and protection is shared among various organizations and the individual.  “The primary responsibility for the protection of workers lies with the management of the operating organization responsible for the facilities . . . however, the employer of the worker (as well as the worker) also bear certain responsibilities.” (p. 2)

Safety culture (SC) is specifically mentioned in the IAEA report.  One basic management responsibility is to promote and maintain a robust SC at all organizational levels. (p. 11)  Specific responsibilities include providing general training in SC behavior and expectations (p. 131) and, where observation or problems reveal specific needs, targeted individual (or small group) SC training. (p. 93)

Our Perspective

This publication is hardly a great victory for SC; the report provides only the most basic description of the SC imperative.  Its major contribution is that it recognizes that itinerant nuclear workers deserve the same safety and protection considerations as other workers at a nuclear facility. 

Back in the bad old days, I was around nuclear organizations where their own employees represented the highest social class, contractors were regarded as replaceable parts, and nomadic workers were not exactly expendable but were considered more responsible for managing their own safety and exposure than permanent personnel.

One can make some judgment about a society’s worth by observing how it treats its lowest status members—the poor, the homeless, the refugee, the migrant worker.  Nuclear itinerant workers deserve to be respected and treated like the other members of a facility’s team.


*  International Atomic Energy Agency, “Radiation protection of itinerant workers,” Safety reports series no. 84 (Vienna, 2015).

Sunday, January 10, 2016

Targeted Safety Culture Assessment at Columbia Generating Station

The Columbia Generating Station (CGS) got into trouble with the NRC when two members of the security department were found to have been willfully inattentive to their duties on multiple occasions over three years (2012-2014).  What they were doing was not disclosed because it was a security-related matter.  The situation was summarized in a recent newspaper article* and the relevant NRC documents** provide some additional details.

Energy Northwest (CGS’s owner) opted for the NRC’s Alternative Dispute Resolution (ADR) process.  The agreed-upon corrective actions and penances are typical for ADR settlements: conduct a common cause evaluation; install new cameras and increase supervision if the cameras aren’t working; revise and present training; prepare a statement on willful misconduct’s consequences and have personnel sign it; prepare a "lessons learned" presentation for plant personnel and an industry gathering (aka public atonement); revise procedures; and conduct a targeted nuclear safety culture (SC) assessment of the security organization at CGS.  Oh, and pay a $35K fine.

Our Perspective

The security SC assessment caught our eye because it is being conducted by a law firm, not a culture assessment specialist.  Maybe that’s because the subject is security-related, therefore sensitive, and this approach will ensure the report will never be made public.  It also ensures that the report will focus on the previously identified “bad apples” (who no longer work at the plant) and the agreed-upon ADR actions; the assessment will not raise any awkward management or systemic issues.


*  A. Cary, “Energy Northwest pays fine over Richland nuclear security,” Tri-City Herald (Jan. 5, 2015).

**  A. Vegel (NRC) to M. Reddermann (Energy NW), Columbia Generating Station – NRC Security Inspection Report 05000397/2015405 and NRC Investigation Report No. 4-2014-009 (June 25, 2015).  ADAMS ML15176A599.  M. Dapas (NRC) to M. Reddermann (Energy NW), Confirmatory Order - NRC Security Inspection Report 05000397/2015407 AND NRC Investigation Report 4-2014-009 Columbia Generating Station (Sept. 28, 2015).  ADAMS ML15271A078.

Monday, January 4, 2016

How Top Management Decisions Shape Culture

A brief article* in the December 2015 issue of The Atlantic asks “What was VW thinking?” and then reviews a few classic business cases to show how top management (often CEO) decisions can percolate down through an organization, sometimes with appalling results.  The author also describes a couple of mechanisms by which bad decision making can be institutionalized.  We’ll start with the cases.

Johnson & Johnson had a long-standing credo that outlined its responsibilities to those who used its products.  In 1979, the CEO reinforced the credo’s relevance to J&J’s operations.  When poisoned Tylenol showed up in stores, J&J did not hesitate to recall product, warn people against taking Tylenol and absorb a $100 million hit.  This is often cited as an example of a corporation doing the right thing. 

B. F. Goodrich promised an Air Force contractor an aircraft brake that was ultralight and ultracheap.  The only problem was it didn’t work; in fact, it melted.  Only by massively finagling the test procedures and falsifying test results did Goodrich get the brake qualified.  The Air Force discovered the truth when it reviewed the raw test data.  A Goodrich whistleblower announced his resignation over the incident but was quickly fired by the company.

Ford President Lee Iacocca wanted the Pinto to be light, inexpensive and available in 25 months.  The gas tank’s position made the vehicle susceptible to fire when the car was rear-ended, but repositioning the gas tank would have delayed the roll-out schedule.  Ford delayed addressing the problem, resulting in at least one costly lawsuit and bad publicity for the company.

With respect to institutional mechanisms, the author reviews Diane Vaughan’s normalization of deviance and how it led to the space shuttle Challenger disaster.  To promote efficiency, organizations adopt scripts that tell members how to handle various situations.  Scripts provide a rationale for decisions, which can sometimes be the wrong decisions.  In Vaughan’s view, scripts can “expand like an elastic waistband” to accommodate more and more deviation from standards or norms.  Scripts are important organizational culture artifacts.  We have often referred to Vaughan’s work on Safetymatters.

The author closes with a quote: “Culture starts at the top, . . . Employees will see through empty rhetoric and will emulate the nature of top-management decision making . . . ”  The speaker?  Andrew Fastow, Enron’s former CFO and former federal prison inmate.

Our Perspective

I used to use these cases when I was teaching ethics to business majors at a local university.  Students would say they would never do any of the bad stuff.  I said they probably would, especially once they had mortgages (today it’s student debt), families and career aspirations.  It’s hard to put up a fight when the organization has so thoroughly accepted the script that people actually believe they are doing the right thing.  And don’t even think about being a whistleblower unless you’ve got money set aside and a good lawyer lined up.

Bottom line: This is worth a quick read.  It illustrates the importance of senior management’s decisions as opposed to its sloganeering or other empty leadership behavior.


*  J. Useem, “What Was Volkswagen Thinking?  On the origins of corporate evil—and idiocy,” The Atlantic (Dec. 2015), pp. 26-28.