Friday, March 25, 2016

Nuclear Safety Culture Problem at TVA: NRC Issues Chilling Effect Letter to Watts Bar

Watts Bar (Source: Wikipedia)
The U.S. Nuclear Regulatory Commission (NRC) recently sent a “chilling effect letter”* (CEL) to the Tennessee Valley Authority (TVA) because the NRC believes reactor operators at TVA’s Watts Bar plant do not feel free to raise safety concerns: they fear retaliation and do not believe their concerns are being addressed.  The NRC questions whether the plant’s corrective action program (CAP) and Employee Concerns Program have been effective at identifying and resolving the operators’ concerns.  In addition, the NRC is concerned that plant management is exercising undue influence over operators’ activities, thereby compromising a safety-first environment in the control room.

TVA officials must respond to the NRC within 30 days with a plan describing how they will address the issues identified in the CEL.

What’s a Chilling Effect Letter?

“CELs are issued when the NRC has concluded that the work environment is ‘chilled,’ (i.e., workers perceive that the licensee is suppressing or discouraging the raising of safety concerns or is not addressing such concerns when they are raised).”**

Our Perspective

The absence of fear of retaliation is the principal attribute of an effective safety conscious work environment (SCWE), which in turn is an important component of a strong safety culture (SC).  Almost all commercial nuclear plants in the U.S. have figured out how to create and maintain an acceptable SCWE.

TVA appears to be an exception and a slow learner.  This is not a new situation for them.  As the CEL states, “a Confirmatory Order (EA-09-009, EA-09-203) remains in effect to confirm commitments made by TVA for all three [emphasis added] nuclear stations to address past SCWE issues.”

We have reported multiple times on long-standing SC problems at another TVA plant, Browns Ferry.  And, as we posted on Apr. 25, 2014, Browns Ferry management even made a presentation on their SC improvement actions at the 2014 NRC Regulatory Information Conference.

NRC raised questions about the Watts Bar CAP.  As we have long maintained, CAP effectiveness (promptly responding to identified issues, accurately characterizing them and permanently fixing them) is a key artifact of SC and a visible indicator of SC strength.

As regular readers know, we believe executive compensation is another indicator of SC.  The recipient of the CEL is TVA’s Chief Nuclear Officer (CNO).  According to TVA’s most recent SEC 10-K,*** the CNO made about $2.1 million in FY 2015.  Almost $1 million of the total was short-term (annual) and long-term incentive pay.  The components of the CNO’s annual incentive plan included capability factor, forced outage rate, equipment reliability and budget performance—safety is not mentioned.****  The long-term plan included the wholesale rate excluding fuel, load not served and external measures that included an undefined “nuclear performance index.”  To the surprise of no one who follows these things, the CNO is not being specifically incentivized to create a SCWE or a strong SC.
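
For readers who want to see the arithmetic, here is a minimal sketch of how a multi-measure incentive payout of this type is typically computed.  The measure names come from the 10-K; the target award, weights, scores and corporate multiplier below are invented placeholders because the 10-K does not disclose them.

```python
# Hypothetical illustration only: the measure names appear in TVA's 10-K, but the
# target award, weights, scores and corporate multiplier are invented placeholders.

target_incentive = 500_000  # hypothetical target annual incentive ($)

# measure: (hypothetical weight, hypothetical achievement score, where 1.0 = target)
annual_measures = {
    "capability factor":     (0.30, 1.10),
    "forced outage rate":    (0.25, 0.95),
    "equipment reliability": (0.25, 1.05),
    "budget performance":    (0.20, 1.00),
}

weighted_score = sum(w * s for w, s in annual_measures.values())

# The 10-K describes a corporate multiplier based on six corporate-wide measures;
# its value here is also a placeholder.
corporate_multiplier = 1.05

payout = target_incentive * weighted_score * corporate_multiplier
print(f"weighted score = {weighted_score:.3f}, payout = ${payout:,.0f}")

# Note what is absent: no term rewards (or penalizes) SCWE or safety culture performance.
```

The payout moves with capacity, reliability and budget numbers; nothing in the formula moves if SCWE or SC performance changes.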

Bottom line: This CEL is just another brick in the wall for TVA.   


*  C. Haney (NRC) to J.P. Grimes (TVA), “Chilled Work Environment for Raising and Addressing Safety Concerns at the Watts Bar Nuclear Plant” (Mar. 23, 2016) ADAMS ML16083A479.

**  D.J. Sieracki, “U.S. Nuclear Regulatory Commission Safety Culture Oversight,” IAEA  International Conference on Human and Organizational Aspects of Assuring Nuclear Safety (Feb. 24, 2016), p. 115 of “Programme and Abstracts.”

***  Tennessee Valley Authority SEC Form 10-K (annual report) for the fiscal year ended Sept. 30, 2015.  Executive compensation is discussed on pp. 152-77.

****  The calculation of the annual incentive plan payouts for named executives included a corporate multiplier based on six performance measures, one of which was safety performance based on the number of recordable injuries per hours worked, i.e., industrial safety.  The weights of the six components are not shown.

Thursday, March 17, 2016

IAEA Nuclear Safety Culture Conference

The International Atomic Energy Agency (IAEA) recently sponsored a week-long conference* to celebrate 30 years of interest and work in safety culture (SC).  By our reckoning, there were about 75 individual presentations in plenary sessions and smaller groups; dialog sessions with presenters and subject matter experts; speeches and panels; and over 30 posters.  It must have been quite a circus.

We cannot do justice to the entire conference in this space, but we can highlight material related to SC factors we’ve emphasized or people we’ve discussed on Safetymatters, plus interesting items that merit your consideration.

Topics We Care About

A Systems Viewpoint

Given that the IAEA has promoted a systemic approach to safety, and that it was a major conference topic, it’s no surprise that many participants addressed it.  But we were still pleased to see over 30 presentations, posters and dialogues that included mention of systems, system dynamics, and systemic and/or holistic viewpoints or analyses.  Specific topics covered a broad range including complexity, coupling, Fukushima, the Interaction between Human, Technical and Organizational Factors (HTOF), error/incident analysis, regulator-licensee relationships, SC assessment, situational adaptability and system dynamics.

Role of Leadership

Leadership and Management for Safety was another major conference topic.  Leadership in a substantive context was mentioned in about 20 presentations and posters, usually as one of multiple success factors in creating and maintaining a strong SC.  Topics included leader/leadership commitment, skills, specific competences, attributes, obligations and responsibilities; leadership’s general importance, relationship to performance and role in accidents; and the importance of leadership in nuclear regulatory agencies. 

Decision Making

This was mentioned about 10 times, with multiple discussions of decisions made during the early stages of the Fukushima disaster.  Other presenters described how specific techniques, such as Probabilistic Risk Assessment and Human Reliability Analysis, or general approaches, such as risk control and risk-informed thinking, can contribute to decision making, which was seen as an important component of SC.

Compensation and Rewards

We’ve always been clear: If SC and safety performance are important then people from top executives to individual workers should be rewarded (by which we mean paid money) for doing it well.  But, as usual, there was zero mention of compensation in the conference materials.  Rewards were mentioned a few times, mostly by regulators, but with no hint they were referring to monetary rewards.  Overall, a continuing disappointment.   

Participants Who Have Been Featured in Safetymatters

Over the years we have presented the work of many conference participants to Safetymatters readers.  Following are some familiar names that caught our eye.  Page numbers refer to the conference “Programme and Abstracts” document.

We have to begin with Edgar Schein, the architect of the cultural construct used by almost everyone in the SC space.  His discussion paper (p. 47) argued that the SC components in a nuclear plant depend on whether the executives actually create the climate of trust and openness that the other attributes hinge on.  We’ve referred to Schein so often he has his own label on Safetymatters.

Mats Alvesson’s presentation (p. 46) discussed “hyper culture,” the vague and idealistic terms executives often promote that look good in policy documents but seldom work well in practice.  This presentation is consistent with his article on Functional Stupidity, which we reviewed on Feb. 23, 2016.

Sonja Haber’s paper (p. 55) outlined a road map for the nuclear community to move forward in the way it thinks about SC.  Dr. Haber has conducted many SC assessments for the Department of Energy that we have reviewed on Safetymatters. 

Ken Koves of INPO led or participated in three dialogue sessions.  He was a principal researcher in a project that correlated SC survey data with safety performance measures which we reviewed on Oct. 22, 2010 and Oct. 5, 2014.

Najmedin Meshkati discussed (p. 60) how organizations react when their control systems start to run behind environmental demands, using Fukushima as an illustrative case.  His presentation draws on an article he coauthored comparing the cultures at TEPCO’s Fukushima Daiichi plant and Tohoku Electric’s Onagawa plant, which we reviewed on Mar. 19, 2014.

Jean-Marie Rousseau co-authored a paper (p. 139) on the transfer of lessons learned from accidents in one industry to another.  We reviewed his paper on the effects of competitive pressures on nuclear safety management issues on May 8, 2013.

Carlo Rusconi discussed (p. 167) how the over-specialization of knowledge required by decision makers can result in pools of knowledge rather than a stream accessible to all members of an organization.  A systemic approach to training can address this issue.  We reviewed Rusconi’s earlier papers on training on June 26, 2013 and Jan. 9, 2014.

Richard Taylor’s presentation (p. 68) covered major event precursors and organizations’ failure to learn from previous events.  We reviewed his keynote address at a previous IAEA conference where he discussed using system dynamics to model organizational archetypes on July 31, 2012.

Madalina Tronea talked about (p. 114) the active oversight of nuclear plant SC by the National Commission for Nuclear Activities Control (CNCAN), the Romanian regulatory authority.  CNCAN has developed its own model of organizational culture and uses multiple methods to collect information for SC assessment.  We reviewed her initial evaluation guidelines on Mar. 23, 2012.

Our Perspective

Many of the presentations were program descriptions or status reports related to the presenter’s employer, usually a utility or regulatory agency.  Fukushima was analyzed or mentioned in 40 different papers or posters.  Overall, there were relatively few efforts to promote new ideas, insights or information.  Having said that, following are some materials you should consider reviewing.

From the conference participants mentioned above, Haber’s abstract (p. 55) and Rusconi’s abstract (p. 167) are worth reading.  Taylor’s abstract (p. 68) and slides are also worth reviewing.  He advocates using system dynamics to analyze complicated issues like the effectiveness of organizational learning and how events can percolate through a supply chain.

Benoît Bernard described the Belgian regulator’s five years of experience assessing nuclear plant SC.  Note that lessons learned are described in his abstract (p. 113) but are somewhat buried in his presentation slides.

If you’re interested in a systems view of SC, check out Francisco de Lemos’ presentation (p. 63), which gives a concise depiction of a complex system plus a Systems-Theoretic Accident Model and Processes (STAMP) analysis.  His paper is based on Nancy Leveson’s work, which we reviewed on Nov. 11, 2013.

Diana Engström argued that nuclear personnel can put more faith in reported numbers than is justified by the underlying information, e.g., CAP trending data, and thus actually add risk to the overall system.  We’d call this practice an example of functional stupidity, although she doesn’t use that term in her provocative paper.  Both her abstract (p. 126) and slides are worth reviewing.

Jean Paries gave a talk on the need for resilience in the management of nuclear operations.  The abstract (p. 228) is clear and concise; there is additional information in his slides but they are a bit messy.

And that’s it for this installment.  Be safe.  Please don’t drink and text.



*  International Atomic Energy Agency, International Conference on Human and Organizational Aspects of Assuring Nuclear Safety: Exploring 30 years of Safety Culture (Feb. 22–26, 2016).  This page shows the published conference materials.  Thanks to Madalina Tronea for publicizing them.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group. 

Thursday, March 10, 2016

Leadership and Safety Culture

It’s an election year in America and voters are assessing candidates who all claim they can provide the leadership the country needs.  A recent article* in The New Yorker offers a primer on the nature of leadership.  The article is engaging because we talk a lot about leadership in the nuclear industry in areas ranging from general management to molding or influencing culture.**  Following are some highlights from the article.

For starters, leadership can mean different things to different people.  The article cites a professor who found more than 200 definitions in the modern leadership literature.  Of necessity, the author focused on a small subset of the literature, starting with sociologist Max Weber who distinguished between “charismatic” and “bureaucratic” leadership.

The charismatic model is alive and well; it’s reflected in the search for CEOs with certain traits, e.g., courage, decisiveness, intelligence or attractiveness, especially during periods of perceived crisis.  Unfortunately, the track record of such people is mixed; according to one researcher, “The most powerful factor determining a company’s performance is the condition of the market in which it operates.” (p. 67)

The bureaucratic model focuses on process, i.e., what a leader actually does.  Behaviors might include gathering information on technology and competitors, setting goals, assembling teams and tracking progress, in other words, the classic plan, organize, staff, direct and control paradigm.  But a CEO candidate’s actual process might not be visible or not what he says it is.  And, in our experience, if the CEO cannot bring strategic insight or a robust vision to the table, the “process” is a puerile exercise.

So how does one identify the right guy or gal?  Filtering is one method to reduce risk in the leader selection process.  Consider the nuclear industry’s long infatuation with admirals.  Why?  One reason is they’ve all jumped through the same hoops and tend to be more or less equally competent—a safe choice but one that might not yield out-of-the-ballpark results.  A genuine organizational crisis might call for an unfiltered leader, an outsider with a different world view and experience, who might deliver a resounding success (e.g., Abraham Lincoln).  Of course, the downside risk is the unfiltered leader may fail miserably.

If you believe leadership is learnable, you’re in luck; there is a large industry devoted to teaching would-be leaders how to empower and inspire their colleagues and subordinates, all the while evidencing a set of pious virtues.  However, one professor thinks this is a crock and what the leadership industry actually does is “obscure the degree to which companies are poorly and selfishly run for the benefit of the powerful people in charge.” (p. 68)

The author sees hope in approaches that seek to impart more philosophy or virtue to leaders.  He reviews at length the work of Elizabeth Samet, an English professor at the U.S. Military Academy (West Point).  She presents leadership through a wide-angle lens, from General Grant’s frank memoirs to a Virginia Woolf essay.  To gain insight into ambition, her students read “Macbeth.”  (Ooops!  I almost typed “MacTrump.”)    

Our Perspective

The New Yorker article is far from a complete discussion of leadership but it does spur one to think about the topic.  It’s worth a quick read and some of the author’s references are worth additional research.  If you want to skip all that, what you should know is “. . . leaders in formal organizations have the power and responsibility to set strategy and direction, align people and resources, motivate and inspire people, and ensure that problems are identified and solved in a timely manner.”***

At Safetymatters, we believe effective leadership is necessary, but not sufficient, to create a strong safety culture (SC).  Not all aspects of leadership are important in the quest for a strong SC.  Leaders need some skills, e.g., the ability to communicate their visions, influence others and create shared understanding.  But the critical aspects are decision-making and role modeling.

Every decision the leader makes must show respect for the importance of safety.  The people will be quick to spot any gap between words and decisions.  Everyone knows that production, schedule and budget are important—failure to perform eventually means jobs and careers go away—but safety must always be a conscious and visible consideration.

Being a role model is also important.  Again, the people will spot any disregard of, or indifference to, safety considerations, rules or practices.

There is no guarantee that even the most gifted leader can deliver a stronger SC.  Although the leader may create a vision for strong SC and attempt to direct behavior toward that vision, the dynamics of SC are complex and subject to multiple factors ranging from employees’ most basic values to major issues that compete for the organization’s attention and resources. 

To close on a more upbeat note, effective leadership is open to varying definitions and specifications but, to borrow former Supreme Court Justice Potter Stewart’s famous phrase, we know it when we see it.****


*  J. Rothman, “Shut Up and Sit Down,” The New Yorker (Feb. 29, 2016), pp. 64-69.

**  For INPO, leadership is the sine qua non of an effective nuclear organization.

***  This quote is not from The New Yorker article.  It is from a review of SC-related social science literature that we posted about on Feb. 10, 2013.

****  Justice Stewart was talking about pornography but the same sort of Kantian knowing can be applied to many topics not amenable to perfect definition.

Thursday, March 3, 2016

2016 NEA Report on Fukushima Lessons Learned

Five years after the Fukushima disaster, the Nuclear Energy Agency (NEA) has released an updated report* on Fukushima lessons learned.  It summarizes NEA and member country safety improvements and corrective actions, including “efforts to understand and characterise the importance of strong nuclear safety cultures . . .” (p. 3)

Keep in mind that the NEA is made up of countries (not plant operators), so the safety culture (SC) discussion centers on government, i.e., regulatory, activities.  Selected SC-related excerpts from the report follow:

“Several NEA member countries have adopted a broad consideration of safety culture characteristics, including human and organisational factors, which include specific safety culture programmes that focus on attitudes towards safety, organisational capability, decision-making processes [including during emergencies] and the commitment to learn from experience.” (p. 11)

“Some [countries] have adopted a systematic consideration of safety culture characteristics in inspection and oversight processes. . . . These include periodic internal and external safety culture assessments.” (p. 29)

Desirable SC characteristics for a regulator (as opposed to a licensee) are discussed on pp. 40-42.  That may seem substantial but it’s all pulled from a different 2016 NEA publication, “The Safety Culture of an Effective Nuclear Regulatory Body,” which we reviewed on Feb. 10, 2016.  That publication had one point worth repeating here, viz., the regulator, in its efforts to promote and ensure safety, should think holistically about the overall regulator-licensee-socio-technical-legal-political system in terms of causes and effects, feedback loops and overall system performance.

Our Perspective

This report may be a decent high-level summary of activities undertaken around the world but it is not sufficiently detailed to provide guidance and it certainly contains no original analysis.  The report does include a respectable list of Fukushima-related references.

Many of the actions, initiatives and activities described in the report are cited multiple times, creating the impression of more content than actually exists.  For example, the quote above from p. 11 is repeated, in whole or in part, in at least four other places.

If the NEA were a person, we’d characterize it as an “empty suit.”  While the summaries of and excerpts from the references, meetings, etc. are satisfactory, the NEA-authored top-level observations are often pro-nuclear cheerleading or just plain blather, e.g., “NEA member countries have continued to take appropriate actions to maintain and enhance the level of safety at their nuclear facilities, and thus nuclear power plants are safer now because of actions taken since the accident.  Ensuring safety is a continual process, . . .” (p. 11)**


*  Nuclear Energy Agency, “Five Years after the Fukushima Daiichi Accident: Nuclear Safety Improvements and Lessons Learnt,” NEA No. 7284 (2016).  The NEA is an arm of the Organisation for Economic Co-operation and Development (OECD).  This report builds on a 2013 report, “The Fukushima Daiichi Nuclear Power Plant Accident: OECD/NEA Nuclear Safety Response and Lessons Learnt.”

**  As a catty aside, the reputation of the NEA’s relatively new Director-General doesn’t exactly contribute to the agency’s respectability, his having been called “a treacherous, miserable liar,” “first-class rat” and “a tool of the nuclear industry” by an influential U.S. Senator during a 2012 Huffington Post interview.  At that time, the Director-General was a U.S. Nuclear Regulatory Commissioner.

Tuesday, February 23, 2016

The Dark Side of Culture Management: Functional Stupidity

Culture is the collection of values and assumptions that underlie organizational decisions and other actions.  We have long encouraged organizations to develop strong safety cultures (SC).  The methods available to do this are widely known, including visible leadership and role models; safety-related policies, practices and procedures; supportive structures like an Employee Concerns Program; the reward and recognition system; training and oversight; and regulatory carrots and sticks.

Because safety performance alone does not pay the bills, organizations also need to achieve their intended economic goals (i.e., be effective) and operate efficiently.  Most of the methods that can be used to promote SC can also be used to promote the overall performance culture.

What happens when the organization goes too far in shaping its culture to optimize performance?  One possibility, according to a 2012 Journal of Management Studies article*, is a culture of Functional Stupidity.  The Functional part means the organization meets its goals and operates efficiently and Stupidity “is an organizationally supported inability or unwillingness to mobilize one’s cognitive capacities.” (p. 1199)**

More specifically, to the extent that management, through its power and/or leadership, willfully shapes an organization’s value structure to achieve greater functionality (conformity, focus, efficiency, etc.), it may be, consciously or unconsciously, creating an environment where employees ask fewer questions (and no hard ones), seek fewer justifications for the organization’s decisions or actions, focus their intelligence only on the organization’s defined areas, do not reflect on their roles in the organization’s undertakings, and essentially go along with the program.  Strong leaders set the agenda and the true followers, well, they follow.

In the name of increased functionality, such actions can create a Weltanschauung that is narrowly focused and self-justifying.  It may result in soft biases, e.g., production over safety, or ignoring problematic aspects of a situation, e.g., Davis-Besse test and inspection reports.

Fortunately, as the authors explain, a self-correcting dynamic may occur.  Initially, improved functionality contributes to a sense of certainty about the organization’s and individuals’ places in the world, thus creating positive feedback.  But eventually the organization’s view of the world may increasingly clash with reality, creating dissonance (a loss of certainty) for the organization and the individuals who inhabit it.  As the gap between perception and reality grows, the overall system becomes less stable.  When people realize that description and reality are far apart, the organization’s, i.e., management’s, legitimacy collapses.

However, in the worst case “increasingly yawning gaps between shared assumptions and reality may eventually produce accidents or disasters.” (p. 1213)  Fukushima anyone?
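
The loop the authors describe (functionality breeds certainty, certainty lets the shared worldview drift from reality, and the accumulated gap eventually forces a reckoning) lends itself to the kind of simple system dynamics sketch we often advocate.  Below is a toy illustration, not the authors’ model; every rate and threshold is invented.

```python
# Toy system-dynamics sketch of the dynamic described above. All rates and
# thresholds are invented; this is an illustration, not the authors' model.

def simulate(steps=60, reinforcement=0.08, drift=0.04, collapse_threshold=1.5):
    certainty = 1.0  # the organization's self-reinforcing sense of certainty
    gap = 0.0        # accumulated gap between shared assumptions and reality
    trace = []
    for t in range(steps):
        # Reinforcing loop: functionality breeds certainty, certainty breeds more certainty.
        certainty *= (1.0 + reinforcement)
        # The more certain (and less questioning) the organization, the faster
        # its worldview drifts away from reality.
        gap += drift * certainty
        # When the gap can no longer be ignored, legitimacy (certainty) collapses
        # and the organization is forced back toward reality.
        if gap > collapse_threshold:
            certainty, gap = 1.0, 0.0
        trace.append((t, round(certainty, 2), round(gap, 2)))
    return trace

if __name__ == "__main__":
    for t, certainty, gap in simulate():
        print(f"t={t:02d}  certainty={certainty:5.2f}  gap={gap:4.2f}")
```

Run it and you get the sawtooth the article implies: certainty climbs steadily for a while, then resets abruptly each time the accumulated gap becomes too large to ignore.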

Our Perspective

Management is always under the gun to “do better” when things are going well or “do something” when problems occur.  In the latter case, one popular initiative is to “improve” the culture, especially if a regulator is involved.  Although management’s intentions may be beneficent, there is an opportunity for invidious elements to be introduced and/or unintended consequences to occur.

Environmental factors can encourage stupidity.  For example, quarterly financial reporting, an ever-shortening media cycle and the global reach of the Internet (especially its most intellectually challenged component, the Twitterverse) pressure executives to project command of their circumstances and certainty about their comprehension, even if they lack adequate (or any) relevant data.

The nuclear industry is not immune to functional stupidity.  Not to put too fine a point on it, but the industry's penchant for secrecy creates an ideal Petri dish for the cultivation of stupidity management.

The authors close by saying “we hope to prompt wider debate about why it is that smart organizations can be so stupid at times.” (p. 1216)  For a long time we have wondered about that ourselves.


*  M. Alvesson and A. Spicer, “A Stupidity-Based Theory of Organizations,” Journal of Management Studies 49:7 (Nov. 2012), pp. 1194-1220.  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Following are additional definitions, with italics added, “functional stupidity is inability and/or unwillingness to use cognitive and reflective capacities in anything other than narrow and circumspect ways.” (p. 1201)  “stupidity management . . . involves the management of consciousness, clues about how to understand and relate to the world, . . .” (p. 1204)  “stupidity self-management  [consists of] the individual putting aside doubts, critique, and other reflexive concerns and focusing on the more positive aspects of organizational life which are more clearly aligned with understandings and interpretations that are officially sanctioned and actively promoted.” (p. 1207)

Tuesday, February 16, 2016

DOE Inspector General Piles On: Bechtel CAP and DOE Oversight Deficient at the Vit Plant

The Department of Energy (DOE) Inspector General (IG) recently released an audit report* on deficiencies in Bechtel’s Corrective Action Program (CAP) at the Hanford Waste Treatment Plant (WTP aka the Vit Plant) where Bechtel is the prime contractor.  The report also described deficiencies in the DOE Office of River Protection’s (ORP) oversight of Bechtel.

With one exception, this IG report is not about safety culture (SC) per se, but it does discuss two key artifacts that reflect the strength of a SC: the effectiveness of the CAP and the size of backlogs.**

The audit found that the Bechtel CAP “was not fully effective in managing and resolving issues.”  Specifically, some required issues were not managed and tracked in the CAP, corrective actions were not implemented in a timely manner (Bechtel did not make any of its timeliness goals) and Bechtel failed to follow through on implementing or sustaining prior CAP improvement initiatives. (pp. 1-2, 5)

The findings were not news to the ORP.  In fact, they are consistent with ORP’s 2013 audit of Bechtel’s Quality Assurance program.  At that time ORP directed Bechtel to make CAP improvements but as of the current IG audit, such improvements had not been fully implemented. (p. 2) 

CAP backlogs are also a problem.  Backlogs of condition reports increased from 2013 to 2014, as did the age of corrective actions. (pp. 4-5)

The audit report does have one direct tie to SC, noting that Bechtel identified weaknesses in its SC in 2014, including concerns about management not valuing a rigorous CAP. (p. 6)

And the auditors didn’t let ORP off the hook, stating DOE “did not ensure that all technical issues and issues identified through self-assessments were entered into the [CAP].  Finally, [DOE] did not ensure that previous Bechtel initiatives to address [DOE] implementation problems were fully implemented or sustained.” (p. 6)

The report closed with three straightforward “fix-it” recommendations with which ORP management concurred.  In its concurrence letter, ORP reviewed the actions taken to date and concluded that “Bechtel has strengthened the WTP Project’s nuclear safety and quality culture.” (p. 11)

Our Perspective

The report does not inspire confidence that Bechtel can upgrade its CAP (while trying to move ahead with Vit Plant design and construction) or that ORP will ride herd on them to ensure it happens.  In fact, the report is consistent with a bevy of earlier assessments and evaluations, many of which we have reviewed on Safetymatters.  (Click on the Vit Plant label for more details.)  ORP’s assertion that Bechtel has strengthened its culture is possibly true, but they began from an unacceptably low starting point.

Early in my career I was hired as a Quality Control manager for a telecom manufacturer.  The company had major problems with its flagship product and I was soon named to a task force to investigate them.  On my way to our initial meeting, I met up with a more senior employee and told him how I looked forward to our task force identifying and fixing the product’s problems.  He turned to me and said “The first three didn’t.”  Welcome to the world.


*  U.S. DOE Inspector General, “Audit Report - Corrective Action Program at the Waste Treatment and Immobilization Plant,” OAI-M-16-06 (Feb. 2016).

**  As we have discussed elsewhere, two other key artifacts are decision-making and compensation.  From the WTP history we have reviewed for Safetymatters, it appears Bechtel (and by extension, DOE) decision-making does not effectively address either the tough technical challenges or programmatic issues at the WTP.  The Bechtel contract now includes some modest incentive compensation for SC performance.  We discussed that program on Dec. 29, 2014.

Wednesday, February 10, 2016

NEA’s Safety Culture Guidance for Nuclear Regulators

A recent Nuclear Energy Agency (NEA) publication* describes desirable safety culture (SC) characteristics for a nuclear regulator.  Its purpose is to provide a benchmark for both established and nascent regulatory bodies.

The document’s goal is to describe a “healthy” SC.  It starts with the SC definition in INSAG-4** then posits five principles for an effective nuclear regulator: Safety leadership is demonstrated at all levels; regulatory staff set the standard for safety; and the regulatory body facilitates co-operation and open communication, implements a holistic approach to safety, and encourages continuous improvement, learning and self-assessment.

The principle that caught our attention is the holistic (or systemic) approach to safety.  This approach is discussed multiple times in the document.  In the Introduction, the authors say the regulator “should actively scrutinise how its own safety culture impacts the licensees’ safety culture.  It should also reflect on its role within the wider system and on how its own culture is the result of its interactions with the licensees and all other stakeholders.” (p. 12)

A subsequent chapter contains a more expansive discussion of each principle and identifies relevant attributes.  The following excerpts illustrate the value of a holistic approach.  “A healthy safety culture is dependent on the regulatory body using a robust, holistic, multi-disciplinary approach to safety.  Regulators oversee and regulate complex socio-technical systems that, together with the regulatory body itself, form part of a larger system made up of many stakeholders, with competing as well as common interests.  All the participants in this system influence and react to each other, and there is a need for awareness and understanding of this mutual influence.” (p. 19)

“[T]he larger socio-technical system [is] influenced by technical, human and organisational, environmental, economic, political and societal factors [including national culture].  Regulators should strive to do more than simply establish standards; they should consider the performance of the entire system that ensures safety.” (p. 20)

And “Safety issues are complex and involve a number of inter-related factors, activities and groups, whose importance and effect on each other and on safety might not be immediately recognisable.” (ibid.)

The Conclusions include the following: “Regulatory decisions need to consider the performance and response of the entire system delivering safety, how the different parts of the system are coupled and the direction the system is taking.” (p. 28)

Our Perspective

Much of the material in this publication will be familiar to Safetymatters readers*** but the discussion of a holistic approach to regulation is more extensive than we’ve seen elsewhere.  For that reason alone, we think this document is worth your quick review.  We have been promoting a systems view of the nuclear industry, from individual power plants to the overall socio-technical-legal-political construct, for years.

The committee that developed the guidance consisted of almost thirty members from over a dozen countries, the International Atomic Energy Agency and NEA itself.  It’s interesting that China was not represented on the committee although it has the world’s largest nuclear power plant construction program**** and, one would hope, a substantial interest in effective safety regulation and safety culture.  (Ooops!  China is not a member of the NEA.  Does that say something about China’s perception of the NEA’s value proposition?)


*  Nuclear Energy Agency, “The Safety Culture of an Effective Nuclear Regulatory Body” (2016).  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.  The NEA is an arm of the Organisation for Economic Co-operation and Development (OECD).

**  International Nuclear Safety Advisory Group, “Safety Culture,” Safety Series No. 75-INSAG-4, (Vienna: IAEA, 1991), p. 4.

***  For example, the list of challenges a regulator faces includes the usual suspects: maintain the focus on safety, avoid complacency, resist external pressures, avoid regulatory capture and maintain technical competence. (pp. 23-25)

****  “China has world's largest nuclear power capacity under construction,” China Daily (Dec. 30, 2015).

Tuesday, February 2, 2016

Ethics, Individual Misconduct and Organizational Culture

Ethics* are rules for conduct or behavior.  They are basically a social (as opposed to an individual or psychological) construct and part of group or organizational culture.  They can be very specific do’s and don’ts or more general guidelines for behavior.

The Ethics Resource Center (ERC) conducts and publishes regular employee surveys on the degree of non-compliance with workplace ethics in the United States.  The surveys focus on instances of misconduct observed by workers—what occurred, who did it, who reported it (if anyone) and what happened to the reporter. 

The survey uses a random sample of employees in the for-profit sector.  The 2013 survey** ended up with over 6,000 useable responses.  There is no indication how many respondents, if any, work in the commercial nuclear industry.

The overall findings are interesting.  On the positive side, “Observed misconduct*** is down for the third report in a row and is now at a historic low; the decline in misconduct is widespread; and the percentage of workers who said they felt pressure to compromise standards also fell substantially.” (p.12)

But problems persist.  “Workers reported that 60 percent of misconduct involved someone with managerial authority from the supervisory level up to top management.  Nearly a quarter (24 percent) of observed misdeeds involved senior managers.  Perhaps equally troubling, workers said that 26 percent of misconduct is ongoing within their organization.  About 12 percent of wrongdoing was reported to take place company-wide.” (ibid.)

The reporting of misconduct problems has both good and bad news.  Lots of workers (63%) who observed misconduct reported it but 21% of those who reported misconduct said they experienced retaliation**** in return. (p. 13)

The report goes on to examine the details behind the summary results and attempts to assign some possible causes to explain observed trends.  For example, the authors believe it’s probable that positive trends are related to companies’ ethics and compliance programs that create new norms for worker conduct, i.e., a stronger culture. (p. 16)  And a stronger culture is desirable.  Returning to the survey, “In 2013, one in five workers (20 percent) reported seeing misconduct in companies where cultures are ‘strong’ compared to 88 percent who witnessed wrongdoing in companies with the weakest cultures.” (p. 18)

The keys to building a stronger ethical culture are familiar to Safetymatters readers: top-level role models, and support by immediate supervisors and peers to do the right thing.  In terms of cultural artifacts, a stronger ethical culture is visible in an organization’s processes for training, personnel evaluation and application of employee discipline. 

The report goes on to analyze misconduct in depth—who is doing it, what are they doing and how long it has been going on.  The authors cover how and why employees report misconduct and suggest ways to increase the reporting rate.  They note that increased legal protection for whistleblowers has increased the likelihood that covered workers will report misconduct.

Our Perspective

This report is worth a read.  Quite frankly, more workers are willing to report misconduct than I would have predicted.  The percentage of reporters who perceive retaliation is disappointing but hardly surprising.

The survey results are more interesting than the explanatory analysis; a reader should keep in mind that this research was conducted by a group that has a vested self-interest in finding the "correct" answers. 

Because specific firms and industries are not identified, it’s easy to blow off the results with a flip “Didn’t happen here and can’t happen here because we have a robust SCWE and ECP.”  I suggest such parochial reviewers keep in mind that “Pride goes before destruction, and a haughty spirit before a fall.”*****


*  Ethics and morals are often used interchangeably but it’s helpful to consider morals as an individual construct, a person’s inner principles of right and wrong.  See diffen.com for a more detailed comparison.

**  Ethics Resource Center, “2013 National Business Ethics Survey of the U.S. Workforce” (Arlington, VA: 2014).  Corporate sponsors include firms familiar to nuclear industry participants, e.g., Bechtel and Edison International.

***  The survey identified 28 specific types of misconduct.  Some of interest to the nuclear industry, listed in the order of frequency of occurrence in the survey responses, include abusive behavior or behavior that creates a hostile work environment, lying to employees, discriminating against employees, violations of health or safety regulations, lying to the public, retaliation against someone who has reported misconduct, abusing substances at work, sexual harassment, violation of environmental regulations and falsifying books and/or records. (pp. 41-42)

****  The survey also identified 13 specific types of retaliation experienced by whistleblowers including being ignored or treated differently by supervisors or other employees, being excluded from decisions, verbal abuse, not receiving promotions or raises, reduced hours or pay, relocation or reassignment, harassment at home or online and physical harm to one’s person or property. (p. 45)

*****  Proverbs 16:18, Bible (English Standard Version).