
Monday, August 1, 2016

Nuclear Safety Culture Self-Assessment Guidance from IAEA

The International Atomic Energy Agency (IAEA) recently published guidance on performing safety culture (SC) self-assessments (SCSAs).  This post summarizes the report* and offers our perspective on its usefulness.

The Introduction presents some general background on SC and specific considerations to keep in mind when conducting an SCSA, including a “conscious effort to think in terms of the human system (the complex, dynamic interaction of individuals and teams within an organization) rather than the technological system.” (p. 2)  Importantly, an SCSA is not based on technical skills or nuclear technology, nor is it focused on immediate corrective actions for observed problems.

Section 2 provides additional information on SC, starting with the basics, e.g., culture is one way of explaining why things happen in organizations.  The familiar iceberg model is presented, with the observable artifacts above the surface and the national, ethnic and religious values that underlie culture way below the waterline.  Culture is robust (it cannot be changed rapidly) and complicated (subcultures exist).  So far, so good.

Then things start to go off the rails.  The report reminds us that the IAEA SC framework** has five SC characteristics but then the report introduces, with no transition, a four-element model for envisioning SC; naturally, the model elements are different from the five SC characteristics previously mentioned.  The report continues with a discussion of IAEA’s notion of “shared space,” the boundary area where working relationships develop between the individual and other organizational members.  We won’t mince words: the four-component model and “shared space” are a distraction and zero value-added.

Section 3 explores the characteristics of SCSAs.  Initially, an SCSA focuses on developing an accurate description of the current culture, the “what is.”  It then moves on to evaluating a SC’s strengths and weaknesses by comparing “what is” with “what should be.”  An SCSA is different from a typical audit in numerous ways, including the need for specialized training, a focus on organizational dynamics and an understanding of the complex interplay of multicultural dimensions of the organization.

SCSAs require recognition of the biases present when a culture examines itself.  Coupling this observation with an earlier statement that effective SCSAs require understanding of the relevant social sciences, the report recommends obtaining qualified external support personnel (at least for the initial efforts at conducting SCSAs).  In addition, there are many risks (the report comes up with 17) associated with performing an SCSA that have to be managed.  All of these aspects are important and need to be addressed.

Section 4 describes the steps in performing an SCSA.  The figure that purportedly shows all the steps is unapproachable and unintelligible.  However, the steps themselves—prepare the organization, the team and the SCSA plan; conduct the pre-launch and the SCSA; analyze the results; summarize and communicate the findings; develop actions; capture lessons learned; and conduct a follow-up—are reasonable.

The description of SCSA team composition, competences and responsibilities is also reasonable.  Having a team member with a behavioral science background is highly desirable, but such expertise is probably not available internally except in the largest organizations.

Section 5 covers SCSA methods: document review, questionnaires, observations, focus groups and interviews.  For each method, the intent, limitations and risks, and intended uses are discussed.  Each method requires specific skills.  The purpose is to develop an overall view of the culture.  Because of the limitations of individual methods, multiple (and preferably all) methods should be used.  Overall, this section is a pretty good high-level description of the different investigative methods.

Section 6 describes how to perform an integrated analysis of the information gathered.  This involves working iteratively with parallel information sets.  There is a lengthy discussion of how to develop cultural themes from the different data sources.  Themes are combined into an overall descriptive view of the culture which can then be compared to the IAEA SC framework (a normative view) to identify relative strengths and weaknesses, and improvement opportunities.
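To make the triangulation idea concrete, here is a minimal sketch of how themes surfaced by the different SCSA methods might be lined up against a normative framework.  This is our own illustration, not the IAEA methodology; the theme names, attribute mapping and data are all hypothetical.

```python
# Hypothetical sketch of SCSA theme triangulation (not from the IAEA report).
# Each method contributes the cultural themes it surfaced; a theme corroborated
# by several independent methods is stronger evidence than a single-source one.
from collections import defaultdict

observations_by_method = {
    "document review": {"backlog tolerated", "procedures substitute for dialogue"},
    "questionnaire":   {"fear of reporting", "backlog tolerated"},
    "focus groups":    {"fear of reporting", "production pressure"},
    "interviews":      {"fear of reporting", "production pressure"},
    "observations":    {"production pressure"},
}

# Normative view: the framework attribute each theme speaks to (hypothetical mapping).
theme_to_attribute = {
    "fear of reporting":                  "safety is a clearly recognized value",
    "production pressure":                "leadership for safety is clear",
    "backlog tolerated":                  "safety is integrated into all activities",
    "procedures substitute for dialogue": "safety is learning driven",
}

# Count how many independent methods corroborate each theme.
corroboration = defaultdict(set)
for method, themes in observations_by_method.items():
    for theme in themes:
        corroboration[theme].add(method)

# Descriptive view ("what is") compared with the normative view ("what should be").
for theme, methods in sorted(corroboration.items(), key=lambda kv: -len(kv[1])):
    attribute = theme_to_attribute.get(theme, "unmapped")
    print(f"{theme}: corroborated by {len(methods)} method(s) -> bears on '{attribute}'")
```

In a real SCSA the combining of themes is, of course, a qualitative and iterative exercise rather than a tally, but the cross-method corroboration logic is the same.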

Section 7 describes approaches to communicating the findings and transitioning into action.  It covers preparing the SCSA report, communicating the results to management and the larger organization, possible barriers to implementing improvement initiatives and maintaining continuous improvement in an organization’s SC.

The report has an extensive set of appendices that illustrate how an SCSA can be conducted.  Appendix I is a laundry list of potential areas for inquiry.  Appendices II-VIII present a case study using all the SCSA methods in Section 5, followed by some example overall conclusions.  Appendix IX is an outline of an SCSA final report.  The guidance on using the SCSA methods is acceptably complete and clear.

A 28-page Annex (including 8 pages of references) describes the social science underlying the recommended methodology for performing SCSAs.  It covers too much ground to be summarized here.  The writing is uneven, with some topics presented in a fluid style (probably a single voice) while others, especially those referring to many different sources, are more ragged.  Because of the extensive use of in-line references, the reader can easily identify source materials.   

Our Perspective

There’s good news and bad news in this Safety Report.  The good news is that when IAEA collates and organizes the work of others, e.g., academics, SC practitioners or industry best practices, IAEA can create a readable, reasonably complete reference on a subject, in this case, SCSA.

The bad news is that when IAEA tries to add new content with their own concepts, constructs, figures and such, they fail to add any value.  In fact, they detract from the total package.  It seems to never have occurred to the IAEA apparatchiks to circulate their ideas for new content for substantive review and comment.


*  International Atomic Energy Agency, “Performing Safety Culture Self-assessments,” Safety Reports Series no. 83 (Vienna: IAEA, 2016).  Thanks to Madalina Tronea for publicizing this report.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Interestingly, the IAEA SC framework (SC definition, key characteristics and attributes) is mentioned without much discussion; the reader is referred to other IAEA documents for more details.  That’s OK.  For purposes of SCSA, it’s only important that the organization, including the SCSA team, agree on an SC definition and its associated characteristics and attributes.  This will give everyone involved a shared normative view for linking the SCSA findings to a picture of what the SC should look like.

Thursday, March 10, 2016

Leadership and Safety Culture

It’s an election year in America and voters are assessing candidates who all claim they can provide the leadership the country needs.  A recent article* in The New Yorker offers a primer on the nature of leadership.  The article is engaging because we talk a lot about leadership in the nuclear industry in areas ranging from general management to molding or influencing culture.**  Following are some highlights from the article.

For starters, leadership can mean different things to different people.  The article cites a professor who found more than 200 definitions in the modern leadership literature.  Of necessity, the author focused on a small subset of the literature, starting with sociologist Max Weber who distinguished between “charismatic” and “bureaucratic” leadership.

The charismatic model is alive and well; it’s reflected in the search for CEOs with certain traits, e.g., courage, decisiveness, intelligence or attractiveness, especially during periods of perceived crisis.  Unfortunately, the track record of such people is mixed; according to one researcher, “The most powerful factor determining a company’s performance is the condition of the market in which it operates.” (p. 67)

The bureaucratic model focuses on process, i.e., what a leader actually does.  Behaviors might include gathering information on technology and competitors, setting goals, assembling teams and tracking progress, in other words, the classic plan, organize, staff, direct and control paradigm.  But a CEO candidate’s actual process might not be visible or not what he says it is.  And, in our experience, if the CEO cannot bring strategic insight or a robust vision to the table, the “process” is a puerile exercise.

So how does one identify the right guy or gal?  Filtering is one method to reduce risk in the leader selection process.  Consider the nuclear industry’s long infatuation with admirals.  Why?  One reason is they’ve all jumped through the same hoops and tend to be more or less equally competent—a safe choice but one that might not yield out-of-the-ballpark results.  A genuine organizational crisis might call for an unfiltered leader, an outsider with a different world view and experience, who might deliver a resounding success (e.g., Abraham Lincoln).  Of course, the downside risk is the unfiltered leader may fail miserably.

If you believe leadership is learnable, you’re in luck; there is a large industry devoted to teaching would-be leaders how to empower and inspire their colleagues and subordinates, all the while evidencing a set of pious virtues.  However, one professor thinks this is a crock and what the leadership industry actually does is “obscure the degree to which companies are poorly and selfishly run for the benefit of the powerful people in charge.” (p. 68)

The author sees hope in approaches that seek to impart more philosophy or virtue to leaders.  He reviews at length the work of Elizabeth Samet, an English professor at the U.S. Military Academy (West Point).  She presents leadership through a wide-angle lens, from General Grant’s frank memoirs to a Virginia Woolf essay.  To gain insight into ambition, her students read “Macbeth.”  (Ooops!  I almost typed “MacTrump.”)    

Our Perspective

The New Yorker article is far from a complete discussion of leadership but it does spur one to think about the topic.  It’s worth a quick read and some of the author’s references are worth additional research.  If you want to skip all that, what you should know is “. . . leaders in formal organizations have the power and responsibility to set strategy and direction, align people and resources, motivate and inspire people, and ensure that problems are identified and solved in a timely manner.”***

At Safetymatters, we believe effective leadership is necessary, but not sufficient, to create a strong safety culture (SC).  Not all aspects of leadership are important in the quest for a strong SC.  Leaders need some skills, e.g., the ability to communicate their visions, influence others and create shared understanding.  But the critical aspects are decision-making and role modeling.

Every decision the leader makes must show respect for the importance of safety.  The people will be quick to spot any gap between words and decisions.  Everyone knows that production, schedule and budget are important—failure to perform eventually means jobs and careers go away—but safety must always be a conscious and visible consideration.

Being a role model is also important.  Again, the people will spot any disregard for, or indifference to, safety considerations, rules or practices.

There is no guarantee that even the most gifted leader can deliver a stronger SC.  Although the leader may create a vision for strong SC and attempt to direct behavior toward that vision, the dynamics of SC are complex and subject to multiple factors ranging from employees’ most basic values to major issues that compete for the organization’s attention and resources. 

To close on a more upbeat note, effective leadership is open to varying definitions and specifications but, to borrow former Supreme Court Justice Potter Stewart’s famous phrase, we know it when we see it.****


*  J. Rothman, “Shut Up and Sit Down,” The New Yorker (Feb. 29, 2016), pp. 64-69.

**  For INPO, leadership is the sine qua non for an effective nuclear organization.

***  This quote is not from The New Yorker article.  It is from a review of SC-related social science literature that we posted about on Feb. 10, 2013.

****  Justice Stewart was talking about pornography but the same sort of Kantian knowing can be applied to many topics not amenable to perfect definition.

Tuesday, February 23, 2016

The Dark Side of Culture Management: Functional Stupidity

Culture is the collection of values and assumptions that underlie organizational decisions and other actions.  We have long encouraged organizations to develop strong safety cultures (SC).  The methods available to do this are widely-known, including visible leadership and role models; safety-related policies, practices and procedures; supportive structures like an Employee Concerns Program; the reward and recognition system; training and oversight; and regulatory carrots and sticks.

Because safety performance alone does not pay the bills, organizations also need to achieve their intended economic goals (i.e., be effective) and operate efficiently.  Most of the methods that can be used to promote SC can also be used to promote the overall performance culture.

What happens when the organization goes too far in shaping its culture to optimize performance?  One possibility, according to a 2012 Journal of Management Studies article*, is a culture of Functional Stupidity.  The Functional part means the organization meets its goals and operates efficiently and Stupidity “is an organizationally supported inability or unwillingness to mobilize one’s cognitive capacities.” (p. 1199)**

More specifically, to the extent management, through its power and/or leadership, willfully shapes an organization’s value structure to achieve greater functionality (conformity, focus, efficiency, etc.), it may be, consciously or unconsciously, creating an environment where employees ask fewer questions (and no hard ones), seek fewer justifications for the organization’s decisions or actions, focus their intelligence in the organization’s defined areas, do not reflect on their roles in the organization’s undertakings, and essentially go along with the program.  Strong leaders set the agenda and the true followers, well, they follow.

In the name of increased functionality, such actions can create a Weltanschauung that is narrowly focused and self-justifying.  It may result in soft biases, e.g., production over safety, or ignoring problematic aspects of a situation, e.g., Davis-Besse test and inspection reports.

Fortunately, as the authors explain, a self-correcting dynamic may occur.  Initially, improved functionality contributes to a sense of certainty about the organization’s and individuals’ places in the world, thus creating positive feedback.  But eventually the organization’s view of the world may increasingly clash with reality, creating dissonance (a loss of certainty) for the organization and the individuals who inhabit it.  As the gap between perception and reality grows, the overall system becomes less stable.  When people realize that description and reality are far apart, the organization’s, i.e., management’s, legitimacy collapses.
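A toy calculation can illustrate the dynamic the authors describe.  The model and numbers below are entirely our own and purely illustrative; the paper offers no quantitative model.

```python
# Toy illustration (ours, not the authors') of the functional stupidity dynamic:
# certainty grows with functionality, the organization updates its worldview less,
# the perception-reality gap widens, and legitimacy eventually collapses.
perception = reality = 1.0
certainty = 0.5
drift = 0.03          # how fast reality changes each period (hypothetical)
suppression = 0.02    # how much questioning is discouraged per period (hypothetical)

for period in range(1, 31):
    reality += drift                                 # the world keeps changing
    perception += drift * (1.0 - certainty)          # a "certain" organization updates less
    certainty = min(0.99, certainty + suppression)   # functionality breeds certainty
    gap = reality - perception
    if gap > 0.4:                                    # dissonance threshold (arbitrary)
        print(f"Period {period}: perception-reality gap {gap:.2f} -> legitimacy collapses")
        break
else:
    print("No collapse within the simulated horizon")
```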

However, in the worst case “increasingly yawning gaps between shared assumptions and reality may eventually produce accidents or disasters.” (p. 1213)  Fukushima anyone?

Our Perspective

Management is always under the gun to “do better” when things are going well or “do something” when problems occur.  In the latter case, one popular initiative is to “improve” the culture, especially if a regulator is involved.  Although management’s intentions may be beneficent, there is an opportunity for invidious elements to be introduced and/or unintended consequences to occur.

Environmental factors can encourage stupidity.  For example, quarterly financial reporting, an ever-shortening media cycle and the global reach of the Internet (especially its most intellectually challenged component, the Twitterverse) pressure executives to project command of their circumstances and certainty about their comprehension, even if they lack adequate (or any) relevant data.

The nuclear industry is not immune to functional stupidity.  Not to put too fine a point on it, but the industry's penchant for secrecy creates an ideal Petri dish for the cultivation of stupidity management.

The authors close by saying “we hope to prompt wider debate about why it is that smart organizations can be so stupid at times.” (p. 1216)  For a long time we have wondered about that ourselves.


*  M. Alvesson and A. Spicer, “A Stupidity-Based Theory of Organizations,” Journal of Management Studies 49:7 (Nov. 2012), pp. 1194-1220.  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Following are additional definitions, with italics added, “functional stupidity is inability and/or unwillingness to use cognitive and reflective capacities in anything other than narrow and circumspect ways.” (p. 1201)  “stupidity management . . . involves the management of consciousness, clues about how to understand and relate to the world, . . .” (p. 1204)  “stupidity self-management  [consists of] the individual putting aside doubts, critique, and other reflexive concerns and focusing on the more positive aspects of organizational life which are more clearly aligned with understandings and interpretations that are officially sanctioned and actively promoted.” (p. 1207)

Wednesday, February 10, 2016

NEA’s Safety Culture Guidance for Nuclear Regulators

A recent Nuclear Energy Agency (NEA) publication* describes desirable safety culture (SC) characteristics for a nuclear regulator.  Its purpose is to provide a benchmark for both established and nascent regulatory bodies.

The document’s goal is to describe a “healthy” SC.  It starts with the SC definition in INSAG-4** then posits five principles for an effective nuclear regulator: Safety leadership is demonstrated at all levels; regulatory staff set the standard for safety; and the regulatory body facilitates co-operation and open communication, implements a holistic approach to safety, and encourages continuous improvement, learning and self-assessment.

The principle that caught our attention is the holistic (or systemic) approach to safety.  This approach is discussed multiple times in the document.  In the Introduction, the authors say the regulator “should actively scrutinise how its own safety culture impacts the licensees’ safety culture.  It should also reflect on its role within the wider system and on how its own culture is the result of its interactions with the licensees and all other stakeholders.” (p. 12)

A subsequent chapter contains a more expansive discussion of each principle and identifies relevant attributes.  The following excerpts illustrate the value of a holistic approach.  “A healthy safety culture is dependent on the regulatory body using a robust, holistic, multi-disciplinary approach to safety.  Regulators oversee and regulate complex socio-technical systems that, together with the regulatory body itself, form part of a larger system made up of many stakeholders, with competing as well as common interests.  All the participants in this system influence and react to each other, and there is a need for awareness and understanding of this mutual influence.” (p. 19)

“[T]he larger socio-technical system [is] influenced by technical, human and organisational, environmental, economic, political and societal factors [including national culture].  Regulators should strive to do more than simply establish standards; they should consider the performance of the entire system that ensures safety.” (p. 20)

And “Safety issues are complex and involve a number of inter-related factors, activities and groups, whose importance and effect on each other and on safety might not be immediately recognisable.” (ibid.)

The Conclusions include the following: “Regulatory decisions need to consider the performance and response of the entire system delivering safety, how the different parts of the system are coupled and the direction the system is taking.” (p. 28)

Our Perspective

Much of the material in this publication will be familiar to Safetymatters readers*** but the discussion of a holistic approach to regulation is more extensive than we’ve seen elsewhere.  For that reason alone, we think this document is worth your quick review.  We have been promoting a systems view of the nuclear industry, from individual power plants to the overall socio-technical-legal-political construct, for years.

The committee that developed the guidance consisted of almost thirty members from over a dozen countries, the International Atomic Energy Agency and NEA itself.  It’s interesting that China was not represented on the committee although it has the world's largest nuclear power plant construction program**** and, one would hope, substantial interest in effective safety regulation and safety culture.  (Ooops!  China is not a member of the NEA.  Does that say something about China's perception of the NEA's value proposition?)


*  Nuclear Energy Agency, “The Safety Culture of an Effective Nuclear Regulatory Body” (2016).  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.  The NEA is an arm of the Organisation for Economic Co-operation and Development (OECD).

**  International Nuclear Safety Advisory Group, “Safety Culture,” Safety Series No. 75-INSAG-4, (Vienna: IAEA, 1991), p. 4.

***  For example, the list of challenges a regulator faces includes the usual suspects: maintain the focus on safety, avoid complacency, resist external pressures, avoid regulatory capture and maintain technical competence. (pp. 23-25)

****  “China has world's largest nuclear power capacity under construction,” China Daily (Dec. 30, 2015).

Tuesday, February 2, 2016

Ethics, Individual Misconduct and Organizational Culture

Ethics* are rules for conduct or behavior.  They are basically a social (as opposed to an individual or psychological) construct and part of group or organizational culture.  They can be very specific do’s and don’ts or more general guidelines for behavior.

The Ethics Resource Center (ERC) conducts and publishes regular employee surveys on the degree of non-compliance with workplace ethics in the United States.  The surveys focus on instances of misconduct observed by workers—what occurred, who did it, who reported it (if anyone) and what happened to the reporter. 

The survey uses a random sample of employees in the for-profit sector.  The 2013 survey** ended up with over 6,000 useable responses.  There is no indication how many respondents, if any, work in the commercial nuclear industry.

The overall findings are interesting.  On the positive side, “Observed misconduct*** is down for the third report in a row and is now at a historic low; the decline in misconduct is widespread; and the percentage of workers who said they felt pressure to compromise standards also fell substantially.” (p.12)

But problems persist.  “Workers reported that 60 percent of misconduct involved someone with managerial authority from the supervisory level up to top management.  Nearly a quarter (24 percent) of observed misdeeds involved senior managers.  Perhaps equally troubling, workers said that 26 percent of misconduct is ongoing within their organization.  About 12 percent of wrongdoing was reported to take place company-wide.” (ibid.)

The reporting of misconduct problems has both good and bad news.  Lots of workers (63%) who observed misconduct reported it but 21% of those who reported misconduct said they experienced retaliation**** in return. (p. 13)

The report goes on to examine the details behind the summary results and attempts to assign some possible causes to explain observed trends.  For example, the authors believe it’s probable that positive trends are related to companies’ ethics and compliance programs that create new norms for worker conduct, i.e., a stronger culture. (p. 16)  And a stronger culture is desirable.  Returning to the survey, “In 2013, one in five workers (20 percent) reported seeing misconduct in companies where cultures are “strong” compared to 88 percent who witnessed wrongdoing in companies with the weakest cultures.” (p. 18)

The keys to building a stronger ethical culture are familiar to Safetymatters readers: top-level role models, and support by immediate supervisors and peers to do the right thing.  In terms of cultural artifacts, a stronger ethical culture is visible in an organization’s processes for training, personnel evaluation and application of employee discipline. 

The report goes on to analyze misconduct in depth—who is doing it, what are they doing and how long it has been going on.  The authors cover how and why employees report misconduct and suggest ways to increase the reporting rate.  They note that increased legal protection for whistleblowers has increased the likelihood that covered workers will report misconduct.

Our Perspective

This report is worth a read.  Quite frankly, more workers are willing to report misconduct than I would have predicted.  The percentage of reporters who perceive retaliation is disappointing but hardly surprising.
 

The survey results are more interesting than the explanatory analysis; a reader should keep in mind that this research was conducted by a group that has a vested self-interest in finding the "correct" answers. 

Because specific firms and industries are not identified, it’s easy to blow off the results with a flip “Didn’t happen here and can’t happen here because we have a robust SCWE and ECP.”  I suggest such parochial reviewers keep in mind that “Pride goes before destruction, and a haughty spirit before a fall.”*****


*  Ethics and morals are often used interchangeably but it’s helpful to consider morals as an individual construct, a person’s inner principles of right and wrong.  See diffen.com for a more detailed comparison.

**  Ethics Resource Center, “2013 National Business Ethics Survey of the U.S. Workforce” (Arlington, VA: 2014).  Corporate sponsors include firms familiar to nuclear industry participants, e.g., Bechtel and Edison International.

***  The survey identified 28 specific types of misconduct.  Some of interest to the nuclear industry, listed in the order of frequency of occurrence in the survey responses, include abusive behavior or behavior that creates a hostile work environment, lying to employees, discriminating against employees, violations of health or safety regulations, lying to the public, retaliation against someone who has reported misconduct, abusing substances at work, sexual harassment, violation of environmental regulations and falsifying books and/or records. (pp. 41-42)

****  The survey also identified 13 specific types of retaliation experienced by whistleblowers including being ignored or treated differently by supervisors or other employees, being excluded from decisions, verbal abuse, not receiving promotions or raises, reduced hours or pay, relocation or reassignment, harassment at home or online and physical harm to one’s person or property. (p. 45)

*****  Proverbs 16:18, Bible (English Standard Version).

Sunday, January 17, 2016

A Nuclear Safety Culture for Itinerant Workers

IAEA has published “Radiation Protection of Itinerant Workers,”* a report that describes management responsibilities and practices to protect and monitor itinerant workers who are exposed to ionizing radiation.  “Itinerant workers” are people who work at different locations “and are not employees of the management of the facility where they are working. The itinerant workers may be self-employed or employed by a contractor . . .” (p. 4)  In the real world, such employees have many different names including nuclear nomads, glow boys and jumpers.

The responsibility for itinerant workers’ safety and protection is shared among various organizations and the individual.  “The primary responsibility for the protection of workers lies with the management of the operating organization responsible for the facilities . . . however, the employer of the worker (as well as the worker) also bear certain responsibilities.” (p. 2)

Safety culture (SC) is specifically mentioned in the IAEA report.  One basic management responsibility is to promote and maintain a robust SC at all organizational levels. (p. 11)  Specific responsibilities include providing general training in SC behavior and expectations (p. 131) and, where observation or problems reveal specific needs, targeted individual (or small group) SC training. (p. 93)

Our Perspective

This publication is hardly a great victory for SC; the report provides only the most basic description of the SC imperative.  Its major contribution is that it recognizes that itinerant nuclear workers deserve the same safety and protection considerations as other workers at a nuclear facility. 

Back in the bad old days, I was around nuclear organizations where their own employees represented the highest social class, contractors were regarded as replaceable parts, and nomadic workers were not exactly expendable but were considered more responsible for managing their own safety and exposure than permanent personnel.

One can make some judgment about a society’s worth by observing how it treats its lowest status members—the poor, the homeless, the refugee, the migrant worker.  Nuclear itinerant workers deserve to be respected and treated like the other members of a facility’s team.


*  International Atomic Energy Agency, “Radiation Protection of Itinerant Workers,” Safety Reports Series no. 84 (Vienna: IAEA, 2015).

Tuesday, November 17, 2015

Foolproof by Greg Ip: Insights for the Nuclear Industry

This book* is primarily about systemic lessons learned from the 2008 U.S. financial crisis and, to a lesser extent, various European euro crises. Some of the author’s observations also apply to the nuclear industry.

Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability.  Stability breeds complacency.**  As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events.  But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.

He sees examples that evidence his thesis in other fields.  For automobiles, the implementation of anti-lock braking systems leads some operators to drive more recklessly.  In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries.  For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas.  For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas.  As a result, both fires and floods can have huge financial losses when they eventually occur.  In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit.  In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)

Ip uses the nuclear industry to illustrate how society can create larger issues elsewhere in a system when it effects local responses to a perceived problem.  Closing down nuclear plants after an accident (e.g., Fukushima) or because of green politics does not remove the demand for electric energy.  To the extent the demand shortfall is made up with hydrocarbons, additional people will suffer from doing the mining, drilling, processing, etc. and the climate will be made worse.

He cites the aviation industry as an example of a system where near-misses are documented and widely shared in an effort to improve overall system safety.  He notes that the few fatal accidents that occur in commercial aviation serve both as lessons learned and keep those responsible for operating the system (pilots and controllers) on their toes.

He also makes an observation about aviation that could be applied to the nuclear industry: “It is almost impossible to improve a system that never has an accident. . . . regulators are unlikely to know whether anything they propose now will have provable benefits; it also means that accidents will increasingly be of the truly mysterious, unimaginable variety . . .” (p. 252)

Speaking of finance, Ip says “A huge part of what the financial system does is try to create the fact—and at times the illusion—of safety.  Usually, it succeeds; . . . On those rare occasions when it fails, the result is panic.” (p. 86)  Could this description also apply to the nuclear industry? 

Our Perspective

Ip’s search for systemic, dynamic factors to explain the financial crisis echoes the type of analysis we’ve been promoting for years.  Like us, he recognizes that people hold different world views of the same system.  Ip contrasts the engineers and the ecologists:  “Engineers satisfy our desire for control, . . . civilization’s needs to act, to do something, . . .” (p. 278)  Ecologists believe “it’s the nature of risk to find the vulnerabilities we missed, to hit when least expected, to exploit the very trust in safety we so assiduously cultivate with all our protection . . .” (p. 279)

Ip’s treatment of the nuclear industry, while positive, is incomplete and somewhat simplistic.  It’s really just an example, not an industry analysis.  His argument that shutting down nuclear plants exacerbates climate harm could have come from the NEI playbook.  He ignores the impact of renewables, efficiency and conservation.

He doesn’t discuss the nuclear industry’s penchant for secrecy, but we have and believe it feeds the public’s uncertainty about the industry's safety.  As Ip notes, “People who crave certainty cannot tolerate even a slight increase in uncertainty, and so they flee not just the bad banks, the bad paper, and the bad country, but everything that resembles them, . . .” (p. 261)  If a system that is assumed [or promoted] to be safe has a crisis, even a local one, the result is often panic. (p. 62)

He mentions high reliability organizations (HROs) and their focus on avoiding catastrophe by "being a little bit scared all of the time." (p. 242)  He does not mention that some of the same systemic factors of the financial system are at work in the world of HROs, including exposure to the corrosive effects of complacency and system drift. (p. 242)

Bottom line: Read Foolproof if you have an interest in an intelligible assessment of the financial crisis.  And remember: “Fear serves a purpose: it keeps us out of trouble.” (p. 19)  “. . . but it can keep us from taking risks that could make us better off.” (p. 159)


*  G. Ip, Foolproof (New York: Little, Brown, 2015).  Ip is a finance and economics journalist, currently with the Wall Street Journal and previously with The Economist.

**  He quotes a great quip from Larry Summers: “Complacency is a self-denying prophecy.”  Ip adds, “If everyone worried about complacency, no one would succumb to it.” (p.263)

Tuesday, May 26, 2015

Safety Culture “State of the Art” in 2002 per NUREG-1756

Here’s a trip down memory lane.  Back in 2002 a report* on the “state of the art” in safety culture (SC) thinking, research and regulation was prepared for the NRC Advisory Committee on Reactor Safeguards.  This post looks at some of the major observations of the 2002 report and compares them with what we believe is important today.

The report’s Abstract provides a clear summary of the report’s perspective:  “There is a widespread belief that safety culture is an important contributor to the safety of operations. . . . The commonly accepted attributes of safety culture include good organizational communication, good organizational learning, and senior management commitment to safety. . . . The role of regulatory bodies in fostering strong safety cultures remains unclear, and additional work is required to define the essential attributes of safety culture and to identify reliable performance indicators.” (p. iii) 

General Observations on Safety Performance 


A couple of quotes included in the report reflect views on how safety performance is managed or influenced.

 “"The traditional approach to safety . . . has been retrospective, built on precedents. Because it is necessary, it is easy to think it is sufficient.  It involves, first, a search for the primary (or "root") cause of a specific accident, a decision on whether the cause was an unsafe act or an unsafe condition, and finally the supposed prevention of a recurrence by devising a regulation if an unsafe act,** or a technical solution if an unsafe condition." . . . [This approach] has serious shortcomings.  Specifically, ". . . resources are diverted to prevent the accident that has happened rather than the one most likely to happen."” (p. 24)

“"There has been little direct research on the organizational factors that make for a good safety culture. However, there is an extensive literature if we make the indirect assumption that a relatively low accident plant must have a relatively good safety culture." The proponents of safety culture as a determinant of operational safety in the nuclear power industry rely, at least to some degree, on that indirect assumption.” (p. 37) 

Plenty of people today behave in accordance with the first observation and believe (or act as if they believe) the second one.  Both contribute to the nuclear industry’s unwillingness to consider new ways of thinking about how safe performance actually occurs.

Decision Making, Goal Conflict and the Reward System

Decision making processes, recognition of goal conflicts and an organization’s reward system are important aspects of SC and the report addressed them to varying degrees.

One author referenced had a contemporary view of decision making, noting that “in complex and ill-structured risk situations, decisionmakers are faced not only with the matter of risk, but also with fundamental uncertainty characterized by incompleteness of knowledge.” (p. 43)  That’s true in great tragedies like Fukushima and lesser unfortunate outcomes like the San Onofre steam generators.

Goal conflict was mentioned: “Managers should take opportunities to show that they will put safety concerns ahead of power production if circumstances warrant.” (p.7)

Rewards should promote good safety practices (p. 6) and be provided for identifying safety issues. (p. 37)  However, there is no mention of the executive compensation system.  As we have argued ad nauseam, these systems often pay more for production than for safety.

The Role of the Regulator


“The regulatory dilemma is that the elements that are important to safety culture are difficult, if not impossible, to separate from the management of the organization.  [However,] historically, the NRC has been reluctant to regulate management functions in any direct way.” (pp. 37-38)  “Rather, the NRC " . . . infers licensee organization management performance based on a comprehensive review of inspection findings, licensee amendments, event reports, enforcement history, and performance indicators."” (p. 41)  From this starting point, we now have the current situation where the NRC has promulgated its SC Policy Statement and practices de facto SC regulation using the highly reliable “bring me another rock” method.

The Importance of Context when Errors Occur 


There are hints of modern thinking in the report.  It contains an extended summary of Reason’s work in Human Error.  The role of latent conditions, human error as consequence instead of cause, the obvious interaction between producers and production, and the “non-event” of safe operations are all mentioned. (p. 15)  However, a “just culture” or other more nuanced views of the context in which safety performance occurs had yet to be developed.

One author cited described “the paradox that culture can act simultaneously as a precondition for safe operations and an incubator for hazards.” (p. 43)  We see that in Reason and also in Hollnagel and Dekker: people going about business as usual with usually successful results but, on some occasions, with unfortunate outcomes.

Our Perspective

The report’s author provided a good logic model for getting from SC attributes to identifying useful risk metrics, i.e., from SC to one or more probabilistic risk assessment (PRA) parameters.  (pp. 18-20)  But none of the research reviewed completed all the steps in the model. (p. 36)  He concludes “What is not clear is the mechanism by which attitudes, or safety culture, affect the safety of operations.” (p. 43)  We are still talking about that mechanism today.   
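The kind of bridge the logic model contemplates can be illustrated with a toy calculation.  Everything below is our own hypothetical sketch; the scaling rule and numbers are not taken from NUREG-1756 or from any accepted PRA methodology.

```python
# Hypothetical sketch of linking a safety culture attribute score to a PRA parameter,
# here a human error probability (HEP).  The scaling rule and numbers are invented
# for illustration only.

def adjusted_hep(nominal_hep: float, sc_score: float) -> float:
    """Scale a nominal HEP by a safety culture score in [0, 1].

    A score of 0.5 leaves the HEP unchanged; weaker culture inflates it and
    stronger culture reduces it, bounded to a factor of 3 either way.
    """
    factor = 3.0 ** (1.0 - 2.0 * sc_score)   # 3x at score 0, 1x at 0.5, 1/3x at 1
    return min(1.0, nominal_hep * factor)

nominal_hep = 1.0e-3   # nominal probability of failing to diagnose an event
for score in (0.2, 0.5, 0.9):
    print(f"SC score {score:.1f} -> adjusted HEP {adjusted_hep(nominal_hep, score):.2e}")
```

Completing the report’s logic model would require empirically validating some such mechanism, which, as the author notes, none of the research reviewed had done.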

But some things have changed.  For example, probabilistic thinking has achieved greater penetration and is no longer the sole province of the PRA types.  It’s accepted that Black Swans can occur (but not at our plant).

Bottom line: Every student of SC should take a look at this.  It includes a good survey of 20th century SC-related research in the nuclear industry and it’s part of our basic history.

“Those who cannot remember the past are condemned to repeat it.” — George Santayana (1863-1952)


*  J.N. Sorensen, “Safety Culture: A Survey of the State-of-the-Art,” NUREG-1756 (Jan. 2002).  ADAMS ML020520006.  (Disclosure: I worked alongside the author on a major nuclear power plant litigation project in the 1980s.  He was thoughtful and thorough, qualities that are apparent in this report.)

**  We would add “or reinforcing an existing regulation through stronger procedures, training or oversight.”

Monday, April 13, 2015

Safety-I and Safety-II: The Past and Future of Safety Management by Erik Hollnagel

This book* discusses two different ways of conceptualizing safety performance problems (e.g., near-misses, incidents and accidents) and safety management in socio-technical systems.  This post describes each approach and provides our perspective on Hollnagel’s efforts.  As usual, our interest lies in the potential value new ways of thinking can offer to the nuclear industry.

Safety-I

This is the common way of looking at safety performance problems.  It is reactive, i.e., it waits for problems to arise** and analytic, e.g., it uses specific methods to work back from the problem to its root causes.  The key assumption is that something in the system has failed or malfunctioned and the purpose of an investigation is to identify the causes and correct them so the problem will not recur.  A second assumption is that chains of causes and effects are linear, i.e., it is actually feasible to start with a problem and work back to its causes.  A third assumption is that a single solution (the “first story”) can be found. (pp. 86, 175-76)***  Underlying biases include the hindsight bias (p. 176) and the belief that the human is usually the weak link. (pp. 78-79)  The focus of safety management is minimizing the number of things that go wrong.

Our treatment of Safety-I is brief because we have reported on criticism of linear thinking/models elsewhere, primarily in the work of Dekker, Woods et al., and Leveson.  See our posts of Dec. 5, 2012; July 6, 2013; and Nov. 11, 2013 for details.

Safety-II

Safety-II is proposed as a different way to look at safety performance.  It is proactive, i.e., it looks at the ways work is actually performed on a day-to-day basis and tries to identify causes of performance variability and then manage them.  A key cause of variability is the regular adjustments people make in performing their jobs in order to keep the system running.  In Hollnagel’s view, “Finding out what these [performance] adjustments are and trying to learn from them can be more important than finding the causes of infrequent adverse outcomes!” (p. 149)  The focus of safety management is on increasing the likelihood that things will go right and developing “the ability to succeed under varying conditions, . . .” (p. 137).

Performance is variable because, among other reasons, people are always making trade-offs between thoroughness and efficiency.  They may use heuristics or have to compensate for something that is missing or take some steps today to avoid future problems.  The underlying assumption of Safety-II is that the same behaviors that almost always lead to successful outcomes can occasionally lead to problems because of performance variability that goes beyond the boundary of the control space.  A second assumption is that chains of causes and effects may be non-linear, i.e., a small variance may lead to a large problem, and may have an emergent aspect where a specific performance variability may occur then disappear or the Swiss cheese holes may momentarily line up exposing the system to latent hazards. (pp. 66, 131-32)  There may be multiple explanations (“second stories”) for why a particular problem occurred.  Finally, Safety-II accepts that there are often differences between Work-as-Imagined (esp. as imagined by folks at the blunt end) and Work-as-Done (by people at the sharp end). (pp. 40-41)***

The Two Approaches

Safety-I and Safety-II are not in some winner-take-all competitive struggle.  Hollnagel notes there are plenty of problems for which a Safety-I investigation is appropriate and adequate. (pp. 141, 146)

Safety-I expenditures are viewed as a cost (to reduce errors). (p. 57)  In contrast, Safety-II expenditures are viewed as bona fide investments to create more correct outcomes. (p. 166)

In all cases, organizational factors, such as safety culture, can impact safety performance and organizational learning. (p. 31)

Our Perspective

The more complex a socio-technical entity is, the more it exhibits emergent properties and the more appropriate Safety-II thinking is.  And nuclear has some elements of complexity.****  In addition, Hollnagel notes that a common explanation for failures that occur in a Safety-I world is “it was never imagined something like that could happen.” (p. 172)  To avoid being the one in front of the cameras saying that, it might be helpful for you to spend a little time reflecting on how Safety-II thinking might apply in your world.

Why do most things go right?  Is it due to strict compliance with procedures?  Does personal creativity or insight contribute to successful plant performance?  Do you talk with your colleagues about possible efficiency-thoroughness trade-offs (short cuts) that you or others make?  Can thinking about why things go right make one more alert to situations where things are heading south?  Does more automation (intended to reduce reliance on fallible humans) actually move performance closer to the control boundary because it removes the human’s ability to make useful adjustments?  Have any of your root cause evaluations appeared to miss other plausible explanations for why a problem occurred?

Some of the Safety-II material is not new.  Performance variability in Safety-II builds on Hollnagel’s earlier work on the efficiency-thoroughness trade-off (ETTO) principle.  (See our Jan. 3, 2013 post.)   His call for mindfulness and constant alertness to problems is straight out of the High Reliability Organization playbook. (pp. 36, 163-64)  (See our May 3, 2013 post.)

A definite shortcoming is the lack of concrete examples in the Safety-II discussion.  If someone has tried to apply Safety-II in practice, it would be nice to hear about it.

Bottom line: Hollnagel has some interesting observations although his Safety-II model is probably not the Next Big Thing for nuclear safety management.

 

*  E. Hollnagel, Safety-I and Safety-II: The Past and Future of Safety Management (Burlington, VT: Ashgate, 2014).

**  In the author’s view, forward-looking risk analysis is not proactive because it is infrequently performed. (p. 57) 

***  There are other assumptions in the Safety-I approach (see pp. 97-104) but for the sake of efficiency, they are omitted from this post.

****  Nuclear power plants have some aspects of a complex socio-technical system but other aspects are merely complicated.   On the operations side, activities are tightly coupled (one attribute of complexity) but most of the internal organizational workings are complicated.  The lack of sudden environmental disrupters (excepting natural disasters) means they have time to adapt to changes in their financial or regulatory environment, reducing complexity.

Sunday, March 29, 2015

Nuclear Safety Assessment Principles in the United Kingdom

A reader sent us a copy of “Safety Assessment Principles for Nuclear Facilities” (SAPs) published by the United Kingdom’s Office for Nuclear Regulation (ONR).*  For documents like this, we usually jump right to the treatment of safety culture (SC).  However, in this case we were impressed with the document’s accessibility, organization and integrated (or holistic) approach so we want to provide a more general review.

ONR uses the SAPs during technical assessments of nuclear licensees’ safety submissions.  The total documentation package developed by a licensee to demonstrate high standards of nuclear safety is called the “safety case.”

Accessibility

The language is clear and intended for newbies as well as those already inside the nuclear tent.  For example, “The SAPs contain principles and guidance.  The principles form the underlying basis for regulatory judgements made by inspectors, and the guidance associated with the principles provides either further explanation of a principle, or their interpretation in actual applications and the measures against which judgements can be made.” (p. 11) 

Also furthering ease of use, the document is not strewn with acronyms.  As a consequence, one doesn’t have to sit with glossary in hand just to read the text.

Organization

ONR presents eight fundamental principles including responsibility for safety, limitation of risks to individuals and emergency planning.  We’ll focus on another fundamental principle, Leadership and Management (L&M) because (a) L&M activities create the context and momentum for a positive SC and (b) it illustrates holistic thinking.

L&M is comprised of four subordinate (but still high-level) inter-related principles: leadership, capable organization, decision making and learning.  “Because of their inter-connected nature there is some overlap between the principles. They should therefore be considered as a whole and an integrated approach will be necessary for their delivery.” (p. 18)

Drilling down further, the guidance for leadership includes many familiar attributes.  We want to acknowledge attributes that we have been emphasizing on Safetymatters or that reflect new thoughts.  Specifically, leaders must recognize and resolve conflict between safety and other goals; ensure that the reward systems promote the identification and management of risk, encourage safe behavior and discourage unsafe behavior or complacency; and establish a common purpose and collective social responsibility for safety. (p. 19)

Decision making (another Safetymatters hot button issue) receives a good treatment.  Topics covered include explicit recognition of goal conflict; appreciating the potential for error, uncertainty and the unexpected; and the essential functions of active challenges and a questioning attitude.

We do have one bone to pick under L&M: we would like to see words to the effect that safety performance and SC should be significant components of the senior management reward system.

Useful Points

Helpful nuggets pop up throughout the text.  A few examples follow.

“The process of analysing safety requires creativity, where people can envisage the variety of routes by which radiological risks can arise from the technology. . . . Safety is achieved when the people and physical systems together reliably control the radiological hazards inherent in the technology. Therefore the organizational systems (ie interactions between people) are just as important as the physical systems, . . . “ (pp. 25-26)

“[D]esigners and/or dutyholders may wish to put forward safety cases that differ from [SAP] expectations.   As in the past, ONR inspectors should consider such submissions on their individual merits. . . . ONR will need to be assured that such cases demonstrate equivalence to the outcomes associated with the use of the principles here,. . .” (p. 14)  The unstated principle here is equifinality; in more colorful words, there is more than one way to skin a cat.

There are echoes of other lessons we’ve been preaching on Safetymatters.  For example “The principle of continuous improvement is central to achieving sustained high standards of nuclear safety. . . . Seeking and applying lessons learned from events, new knowledge and experience, both nationally and internationally, must be a fundamental feature of the safety culture of the nuclear industry.” (p. 13)

And, in a nod to Nassim Nicholas Taleb, if a “hazard is particularly high, or knowledge of the risk is very uncertain, ONR may choose to concentrate primarily on the hazard.” (p. 8)

Our Perspective

Most of the content of the SAPs will be familiar to Safetymatters readers.  We suggest you skim the first 23 pages of the document covering introductory material and Leadership & Management.  SAPs is an excellent example of a regulator actually trying to provide useful information and guidance to current and would-be licensees and is far better than the simple-minded laundry lists promulgated by IAEA.


*  Office for Nuclear Regulation, “Safety Assessment Principles for Nuclear Facilities” Rev. 0 (2014).  We are grateful to Bill Mullins for forwarding this document to us.

Friday, December 12, 2014

IAEA Training Workshop on Leadership and Safety Culture for Senior Managers, Nov. 18-21, 2014



The International Atomic Energy Agency (IAEA) recently conducted a four-day workshop* on leadership and safety culture (SC).  “The primary objective of the workshop [was] to provide an international forum for senior managers to share their experience and learn more about how safety culture and leadership can be continuously improved.” (Opening, Haage)  We don’t have all the information that was shared at the workshop but we can review the workshop facilitators’ presentations.  The facilitators were John Carroll, an MIT professor who is well-known in the nuclear SC field; Liv Cardell, Swedish management consultant; Stanley Deetz, professor at the University of Colorado; Michael Meier, Regulatory Affairs VP at Southern Nuclear OpCo; and Monica Haage, IAEA SC specialist and the workshop leader.  Their presentations follow in the approximate order they were made at the workshop, based on the published agenda.

Shared Space, Haage

The major point is how individual performance is shaped by experience in the social work space shared with others, e.g., conversations, meetings, teams, etc.  Haage described the desirable characteristics of such “shared space” including trust, decrease of power dynamics, respect, openness, freedom to express oneself without fear of recrimination, and dialogue instead of argumentation. 

The goal is to tap into the knowledge, experience and insight in the organization, and to build shared understandings that support safe behaviors and good performance.  In an iceberg visual, shared understanding sits at the bottom, topped by values and then attitudes, with visible behavior above the waterline.

Leadership for Safety, Carroll and Haage

Haage covered the basics from various IAEA documents: “management” is a function and “leadership” is a relation to influence others and create shared understanding.  Safety leadership has to be demonstrated by managers at all levels.  There is a lengthy list of issues, challenges and apparent paradoxes that face nuclear managers.

Carroll covered the need for leaders who have a correct view of safety (in contrast to, e.g., BP’s focus on personal safety rather than systemic issues) and can develop committed employees who go beyond mere compliance with requirements.  He provided an interesting observation that culture is only one perspective (mental model) of an organization; alternative perspectives include strategic design (which views the organization as a machine) and political (which focuses on contests to set priorities and obtain resources).  He mentioned the Sloan management model (sensemaking, visioning, relating and implementing).  Carroll reviewed the Millstone imbroglio of the 1990s including his involvement, situational factors and the ultimate resolution then used this as a workshop exercise to identify root causes and develop actionable fixes.  He showed how to perform a stakeholder assessment to identify who is likely to lead, follow, oppose or simply bystand when an organization faces a significant challenge.

Management for Safety, Haage

This presentation had an intro similar to Leadership followed by a few slides on management.  Basically, the management system is the administrative structure and associated functions (plan, organize, direct, control) that measures and ensures progress toward established safety goals within rules and available resources and does not allow safety to be trumped by other requirements or demands.

Concept of Culture, Deetz

Culture is of interest to managers because it supports the hope for invisible control with less resistance and greater commitment.  Culture is a perspective, a systemic way to look at values, practices, etc. and a tacit part of all choices.  Culture is seen as something to be influenced rather than controlled.  Cultural change can be attempted but the results do not always work out as planned.  The iceberg metaphor highlights the importance of interpretation when it comes to culture, since what we can observe is only a small part and we must infer the rest.

Culture for Safety, Meier

This is a primer on SC definition, major attributes and organizational tactics for establishing, maintaining and improving SC.  One key attribute is that safety is integrated into rewards and recognitions.  Meier observed that centralization ensures compliance while decentralization [may] help to mitigate accident conditions.

Systemic Approach to Safety, Haage

A systemic approach describes the interaction between human, technical and organizational (HTO) factors.  Haage noted that the usual approach to safety analysis is to decompose the system; this tends to overemphasize technical factors.  A systemic approach focuses on the dynamics of the HTO interactions to help evaluate their ability to produce safety outcomes.  She listed findings and recommendations from SC researchers, including HRO characteristics, and the hindsight bias vs. the indeterminacy of looking ahead (from Hollnagel).

Being Systemic, Deetz

This short presentation lists the SC Challenges faced by workshop participants as presented by groups in the workshop.  The 16-item list would look familiar to any American nuclear manager; most of you would probably say it’s incomplete.

Cultural Work in Practice, Cardell

Cardell’s approach to improving performance starts by separating the hard structural attributes from the softer cultural ones.  An organization tries to improve structure and culture to yield organizational learning.  Exaggerating the differences between structure and culture raises consciousness and achieves balance between the two aspects.

Culture comes from processes between people; meetings are the cradle of culture (this suggests the shared space concept).  Tools to develop culture include dialogue, questioning, storytelling, involving, co-creating, pictures, coaching and systemic mapping.  Cardell suggested large group dialogs with members from all organizational elements.  This is followed by a cookbook of suggestions (tools) for improving cultural processes and attributes. 

Our Perspective

It’s hard to avoid being snarky when dealing with IAEA.  They aim their products at the lowest common denominator of experience and they don’t want to offend anyone.  As a result, there is seldom anything novel or even interesting in their materials.  This workshop is no exception.

The presentations ranged from the simplistic to the impossibly complicated.  There was scant reference to applicable lessons from other industries (which subtly reinforces the whole “we’re unique” and “it can’t happen here” mindset) or contemporary ideas about how socio-technical systems operate.  The strategic issue nuclear organizations face is goal conflict: safety vs production vs cost.  This is mentioned in the laundry lists of issues but did not get the emphasis it deserves.  Similar for decision making and resource allocation.  The primary mechanism by which a strong SC identifies and permanently fixes its problems (the CAP) was not mentioned at all.  And for all the talk about a systemic approach, there was no mention of actual system dynamics (feedback loops, time delays, multi-directional flows) and how the multiple interactions between structure and culture might actually work.

Bottom line: There was some “there” there but nothing new.  I suggest you flip through the Carroll and Cardell presentations for any tidbits you can use to spice up or flesh out your own work.
  
A Compendium was sent to the attendees before the workshop.  It contained facilitator biographies and some background information on SC. It included a paper by Prof. Deetz on SC change as a rearticulation of relationships among concepts.  It is an attempt to get at a deeper understanding of how culture fits and interacts with individuals’ sense of identity and meaning.  You may not agree with his thesis but the paper is much more sophisticated than the materials shared during the workshop.


*  IAEA Training Workshop on Leadership and Safety Culture for Senior Managers, Nov. 18-21, 2014, Vienna.  The presentations are available here.  We are grateful to Madalina Tronea for publicizing this material.  Dr. Tronea is the founder and moderator of the LinkedIn Nuclear Safety Culture forum.