
Tuesday, May 28, 2019

The Study of Organizational Culture: History, Assessment Methods, and Insights

We came across an academic journal article* that purports to describe the current state of research into organizational culture (OC).  It’s interesting because it includes a history of OC research and practice, and a critique of several methods used to assess it.  Following is a summary of the article and our perspective on it, focusing on any applicability to nuclear safety culture (NSC).

History

In the late 1970s scholars studying large organizations began to consider culture as one component of organizational identity.  In the same time frame, practicing managers also began to show an interest in culture.  A key driver of their interest was Japan’s economic ascendance and descriptions of Japanese management practices that depended heavily on cultural factors.  The notion of a linkage between culture and organizational performance inspired non-Japanese managers to seek out assistance in developing culture as a competitive advantage for their own companies.  Because of the sense of urgency, practical applications (usually developed and delivered by consultants) were more important than developing a consistent, unified theory of OC.  Practitioners got ahead of researchers and the academic world has yet to fully catch up.

Consultant models only needed a plausible, saleable relationship between culture and organizational performance.  In academic terms, this meant that a consultant’s model relating culture to performance only needed some degree of predictive validity.  Such models did not have to exhibit construct validity, i.e., some proof that they described, measured, or assessed a client organization’s actual underlying culture.  A second important selling point was the consultants’ emphasis on the singular role of the senior leaders (i.e., the paying clients) in molding a new high-performance culture.

Over time, the emphasis on practice over theory and the fragmented efforts of OC researchers led to some distracting issues, including the definition of OC itself, the culture vs. climate debate, and qualitative vs. quantitative models of OC. 

Culture assessment methods 


The authors provide a detailed comparison of four quantitative approaches for assessing OC: the Denison Organizational Culture Survey (used by more than 5,000 companies), the Competing Values Framework (used in more than 10,000 organizations), the Organizational Culture Inventory (more than 2,000,000 individual respondents), and the Organizational Culture Profile (OCP, developed by the authors and used in a “large number” of research studies).  We’ll spare you the gory details but unsurprisingly, the authors find shortcomings in all the approaches, even their own. 

Some of this criticism is sour grapes over the more popular methods.  However, the authors mix their criticism with acknowledgement of functional usefulness in their overall conclusion about the methods: because they lack a “clear definition of the underlying construct, it is difficult to know what is being measured even though the measure itself has been shown to be reliable and to be correlated with organizational outcomes.” (p. 15)

Building on their OCP, the authors argue that OC researchers should start with the Schein three-level model (basic assumptions and beliefs, norms and values, and cultural artifacts) and “focus on the norms that can act as a social control system in organizations.” (p. 16)  As controllers, norms can be descriptive (“people look to others for information about how to act and feel in a given situation”) or injunctive (how the group reacts when someone violates a descriptive norm).  Attributes of norms include content, consensus (how widely they are held), and intensity (how deeply they are held).
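To make these attributes concrete, here is a minimal sketch (our own construction, not the authors’; the Norm record and the 0.5 thresholds are illustrative assumptions) of how consensus and intensity might combine to indicate how strongly a norm acts as a social control:

```python
from dataclasses import dataclass

@dataclass
class Norm:
    content: str      # what the norm concerns, e.g., "safety"
    consensus: float  # 0-1: how widely the norm is held
    intensity: float  # 0-1: how deeply the norm is held

def characterize(norm: Norm) -> str:
    """Crude 2x2 reading of a norm's strength as a social control."""
    widely = norm.consensus >= 0.5   # illustrative threshold
    deeply = norm.intensity >= 0.5
    if widely and deeply:
        return "strong norm: deviance likely to be sanctioned"
    if widely:
        return "widely shared but weakly held: weak control"
    if deeply:
        return "intensely held by a subgroup: contested control"
    return "marginal norm"

# A widely agreed upon but not intensely held safety norm,
# i.e., the weak safety culture case discussed below.
print(characterize(Norm("safety", consensus=0.9, intensity=0.3)))
```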

Our Perspective

So what are we to make of all this?  For starters, it’s important to recognize that some of the topics the academics are still quibbling over have already been settled in the NSC space.  The Schein model of culture is accepted world-wide.  Most folks now recognize that a safety survey, by itself, only reflects respondents’ perceptions at a specific point in time, i.e., it is a snapshot of safety climate.  And a competent safety culture assessment includes both qualitative and quantitative data: surveys, focus groups, interviews, observations, and review of artifacts such as documents.

However, we may still make mistakes.  Our mental models of safety culture may be incomplete or misassembled, e.g., we may see a direct connection between culture and some specific behavior when, in reality, there are intervening variables.  We must acknowledge that OC can be a multidimensional sub-system with complex internal relationships interacting with a complicated socio-technical system surrounded by a larger legal-political environment.  At the end of the day, we will probably still have some unknown unknowns.

Even if we follow the authors’ advice and focus on norms, it remains complicated.  For example, it’s fairly easy to envision that safety could be a widely agreed upon, but not intensely held, norm; that would define a weak safety culture.  But what about safety, production, and cost norms in a context that also includes an intensely held norm about maintaining good relations among long-serving coworkers?  That could make it more difficult to predict specific behaviors.  However, people might be more likely to align their behavior around the safety norm if there was general consensus across the other norms.  Even if safety is the first among equals, consensus on other norms is key to a stronger overall safety culture that is more likely to sanction deviant behavior.
 
The authors claim culture, as defined by Schein, is not well-investigated.  Most work has focused on correlating perceptions about norms, systems, policies, procedures, practices and behavior (one’s own and others’) to organizational effectiveness with a purpose of identifying areas for improvement initiatives that will lead to increased effectiveness.  The manager in the field may not care if diagnostic instruments measure actual culture, or even what culture he has or needs; he just wants to get the mission accomplished while avoiding the opprobrium of regulators, owners, bosses, lawmakers, activists and tweeters. If your primary focus is on increasing performance, then maybe you don’t need to know what’s under the hood. 

Bottom line: This is an academic paper with over 200 citations, but it is quite readable, although it contains some pedantic terms you probably don’t hear every day, e.g., the ipsative approach to ranking culture attributes (ordinary people call this “forced choice”) and Q factor analysis.**  Some of the one-sentence descriptions of other OC research contain useful food for thought and informed our commentary in this write-up.  There is a decent dose of academic sniping in the deconstruction of commercially popular “culture” assessment methods.  However, if you or your organization are considering using one of those methods, you should be aware of what it does, and doesn’t, incorporate.


*  J.A. Chatman and C.A. O’Reilly, “Paradigm lost: Reinvigorating the study of organizational culture,” Research in Organizational Behavior (2016).  Retrieved May 28, 2019.

**  “Normal factor analysis, called "R method," involves finding correlations between variables (say, height and age) across a sample of subjects. Q, on the other hand, looks for correlations between subjects across a sample of variables. Q factor analysis reduces the many individual viewpoints of the subjects down to a few "factors," which are claimed to represent shared ways of thinking.”  Wikipedia, “Q methodology.”   Retrieved May 28, 2019.
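To see the mechanical difference, here is a minimal sketch (ours, not from the article or Wikipedia; the random data matrix is just a placeholder): the Q method applies the same correlation computation to the transposed data matrix, so subjects rather than variables are correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = 20 subjects, columns = 5 variables (e.g., survey items).
data = rng.normal(size=(20, 5))

# R method: correlations between variables, computed across subjects (5 x 5).
r_corr = np.corrcoef(data, rowvar=False)

# Q method: correlations between subjects, computed across variables (20 x 20).
# Same computation, transposed matrix.
q_corr = np.corrcoef(data.T, rowvar=False)

print(r_corr.shape, q_corr.shape)  # (5, 5) (20, 20)
# Factor extraction then proceeds on either correlation matrix;
# Q analysis reduces subjects, rather than variables, to a few factors.
```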

Wednesday, May 10, 2017

A Nordic Compendium on Nuclear Safety Culture

A new research paper* covers the challenges of establishing and improving nuclear safety culture (NSC) in a dynamic, i.e., project, environment.  The authors are Finnish and Swedish and it appears the problems of the Olkiluoto 3 plant inform their research interests.  Their summary and review of current NSC literature is of interest to us. 

They begin with an overall description of how organizational (and cultural) changes can occur in terms of direction, rate and scale.

Direction

Top-down (or planned) change relies on the familiar unfreeze-change-refreeze models of Kurt Lewin and Ed Schein.  Bottom-up (or emergent) change emphasizes self-organization and organizational learning.  Truly free form, unguided change leads to NSC being an emergent property of the organization.  As we know, the top-down approach is seldom, if ever, 100% effective because of frictional losses, unintended consequences or the impact of competing, emergent cultural currents.  In a nod to a systems perspective, the authors note organizational structures and behavior influence (and are influenced by) culture.

Rate

“Organizational change can also be distinguished by the rate of its occurrence, i.e, whether the change occurs abruptly or smoothly [italics added].” (p. 8)  We observe that most nuclear plants try to build on past success, hence they promote “continuous improvement” programs that don’t rattle the organization.  In contrast, a plant with major NSC problems sometimes receives shock treatment, often in the form of a new senior manager who is expected to clean things up.  New management systems and organizational structures can also cause abrupt change.

Scale

The authors identify four levels of change.  Most operating plants exhibit the least disruptive changes, called fine tuning and incremental adjustment.  Modular transformation attempts to change culture at the department level; corporate transformation is self-explanatory.

The authors sound a cautionary note: “the more radical types of changes might not be easily initiated – or might not even be feasible, considering that safety culture is by nature a slowly and progressively changing phenomenon. The obvious condition where a safety-critical organization requires radical changes to its safety culture is when it is unacceptably unhealthy.” (p. 9)

Culture Change Strategies

The authors list seven specific strategies for improving NSC:

  • Change organizational structures,
  • Modify the behavior of a target group through, e.g., incentives and positive reinforcement,
  • Improve interaction and communication to build a shared culture,
  • Ensure all organizational members are committed to safety and jointly participate in its improvement,
  • Provide training,
  • Promote the concept and importance of NSC,
  • Recruit and select employees who will support a strong NSC.
This section includes a literature review for examples of the specific strategies.

Project Organizations

The nature of project organizations is discussed in detail including their time pressures, wide use of teams, complex tasks and a context of a temporary organization in a relatively permanent environment.  The authors observe that “in temporary organisations, the threat of prioritizing “production” over safety may occur more naturally than in permanent organizations.” (pp. 16-17)  Projects are not limited to building new plants; as we have seen, large projects (Crystal River containment penetration, SONGS steam generator replacement) can kill operating plants.

The balance of the paper covers the authors’ empirical work.

Our Perspective 


This is a useful paper because it provides a good summary of the host of approaches and methods that have been (and are being) applied in the NSC space.  That said, the authors offer no new insights into NSC practice.

Although the paper’s focus is on projects, basically new plant construction, people responsible for fixing NSC at problem plants, e.g., Watts Bar, should peruse this report for lessons that might help achieve the step-function NSC improvements such plants need.


*  K. Viitanen, N. Gotcheva and C. Rollenhagen, “Safety Culture Assurance and Improvement Methods in Complex Projects – Intermediate Report from the NKS-R SC AIM” (Feb. 2017).  Thanks to Aili Hunt of the LinkedIn Nuclear Safety Culture group for publicizing this paper.

Thursday, March 17, 2016

IAEA Nuclear Safety Culture Conference

The International Atomic Energy Agency (IAEA) recently sponsored a week-long conference* to celebrate 30 years of interest and work in safety culture (SC).  By our reckoning, there were about 75 individual presentations in plenary sessions and smaller groups; dialog sessions with presenters and subject matter experts; speeches and panels; and over 30 posters.  It must have been quite a circus.

We cannot justly summarize the entire conference in this space but we can highlight material related to SC factors we’ve emphasized or people we’ve discussed on Safetymatters, or interesting items that merit your consideration.

Topics We Care About

A Systems Viewpoint

Given that the IAEA has promoted a systemic approach to safety, and that it was a major conference topic, it’s no surprise that many participants addressed it.  But we were still pleased to see over 30 presentations, posters and dialogues that included mention of systems, system dynamics, and systemic and/or holistic viewpoints or analyses.  Specific topics covered a broad range including complexity, coupling, Fukushima, the Interaction between Human, Technical and Organizational Factors (HTOF), error/incident analysis, regulator-licensee relationships, SC assessment, situational adaptability and system dynamics.

Role of Leadership

Leadership and Management for Safety was another major conference topic.  Leadership in a substantive context was mentioned in about 20 presentations and posters, usually as one of multiple success factors in creating and maintaining a strong SC.  Topics included leader/leadership commitment, skills, specific competences, attributes, obligations and responsibilities; leadership’s general importance, relationship to performance and role in accidents; and the importance of leadership in nuclear regulatory agencies. 

Decision Making

This was mentioned about 10 times, with multiple discussions of decisions made during the early stages of the Fukushima disaster.  Other presenters described how specific techniques, such as Probabilistic Risk Assessment and Human Reliability Analysis, or general approaches, such as risk control and risk-informed thinking, can contribute to decision making, which was seen as an important component of SC.

Compensation and Rewards

We’ve always been clear: If SC and safety performance are important then people from top executives to individual workers should be rewarded (by which we mean paid money) for doing it well.  But, as usual, there was zero mention of compensation in the conference materials.  Rewards were mentioned a few times, mostly by regulators, but with no hint they were referring to monetary rewards.  Overall, a continuing disappointment.   

Participants Who Have Been Featured in Safetymatters

Over the years we have presented the work of many conference participants to Safetymatters readers.  Following are some familiar names that caught our eye.  Page numbers refer to the conference “Programme and Abstracts” document.
 
We have to begin with Edgar Schein, the architect of the cultural construct used by almost everyone in the SC space.  His discussion paper (p. 47) argued that the SC components in a nuclear plant depend on whether the executives actually create the climate of trust and openness that the other attributes hinge on.  We’ve referred to Schein so often he has his own label on Safetymatters.

Mats Alvesson’s presentation (p. 46) discussed “hyper culture,” the vague and idealistic terms executives often promote that look good in policy documents but seldom work well in practice.  This presentation is consistent with his article on Functional Stupidity which we reviewed on Feb. 23, 2016.

Sonja Haber’s paper (p. 55) outlined a road map for the nuclear community to move forward in the way it thinks about SC.  Dr. Haber has conducted many SC assessments for the Department of Energy that we have reviewed on Safetymatters. 

Ken Koves of INPO led or participated in three dialogue sessions.  He was a principal researcher in a project that correlated SC survey data with safety performance measures which we reviewed on Oct. 22, 2010 and Oct. 5, 2014.

Najmedin Meshkati discussed (p. 60) how organizations react when their control systems start to run behind environmental demands using Fukushima as an illustrative case.  His presentation draws on an article he coauthored comparing the cultures at TEPCO’s Fukushima Daiichi plant and Tohoku Electric’s Onagawa plant which we reviewed on Mar. 19, 2014.

Jean-Marie Rousseau co-authored a paper (p. 139) on the transfer of lessons learned from accidents in one industry to another industry.  We reviewed his paper on the effects of competitive pressures on nuclear safety management issues on May 8, 2013.

Carlo Rusconi discussed (p. 167) how the over-specialization of knowledge required by decision makers can result in pools of knowledge rather than a stream accessible to all members of an organization.  A systemic approach to training can address this issue.  We reviewed Rusconi’s earlier papers on training on June 26, 2013 and Jan. 9, 2014.

Richard Taylor’s presentation (p. 68) covered major event precursors and organizations’ failure to learn from previous events.  We reviewed his keynote address at a previous IAEA conference where he discussed using system dynamics to model organizational archetypes on July 31, 2012.

Madalina Tronea talked about (p. 114) the active oversight of nuclear plant SC by the National Commission for Nuclear Activities Control (CNCAN), the Romanian regulatory authority.  CNCAN has developed its own model of organizational culture and uses multiple methods to collect information for SC assessment.  We reviewed her initial evaluation guidelines on Mar. 23, 2012.

Our Perspective

Many of the presentations were program descriptions or status reports related to the presenter’s employer, usually a utility or regulatory agency.  Fukushima was analyzed or mentioned in 40 different papers or posters.  Overall, there were relatively few efforts to promote new ideas, insights or information.  Having said that, following are some materials you should consider reviewing.

From the conference participants mentioned above, Haber’s abstract (p. 55) and Rusconi’s abstract (p. 167) are worth reading.  Taylor’s abstract (p. 68) and slides are also worth reviewing.  He advocates using system dynamics to analyze complicated issues like the effectiveness of organizational learning and how events can percolate through a supply chain.

Benoît Bernard described the Belgian regulator’s five years of experience assessing nuclear plant SC.  Note that lessons learned are described in his abstract (p. 113) but are somewhat buried in his presentation slides.

If you’re interested in a systems view of SC, check out Francisco de Lemos’ presentation (p. 63), which gives a concise depiction of a complex system plus a Systems-Theoretic Accident Model and Processes (STAMP) analysis.  His paper is based on Nancy Leveson’s work which we reviewed on Nov. 11, 2013.

Diana Engström argued that nuclear personnel can put more faith in reported numbers than justified by the underlying information, e.g., CAP trending data, and thus actually add risk to the overall system.  We’d call this practice an example of functional stupidity although she doesn’t use that term in her provocative paper.  Both her abstract (p. 126) and slides are worth reviewing.

Jean Paries gave a talk on the need for resilience in the management of nuclear operations.  The abstract (p. 228) is clear and concise; there is additional information in his slides but they are a bit messy.

And that’s it for this installment.  Be safe.  Please don’t drink and text.



*  International Atomic Energy Agency, International Conference on Human and Organizational Aspects of Assuring Nuclear Safety: Exploring 30 years of Safety Culture (Feb. 22–26, 2016).  This page shows the published conference materials.  Thanks to Madalina Tronea for publicizing them.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group. 

Wednesday, September 10, 2014

A Safety Culture Guide for Regulators

This paper* was referenced in a safety culture (SC) presentation we recently reviewed.  It was prepared for Canadian offshore oil industry regulators.  Although not nuclear oriented, it’s a good introduction to SC basics, the different methods for evaluating SC and possible approaches to regulating SC.  We’ll summarize the paper then provide our perspective on it.  The authors probably did not invent anything other than the analysis discussed below but they used a decent set of references and picked appropriate points to highlight.

Introduction to SC and its Importance

 
The paper provides some background on SC, its origins and definition, then covers the Schein three-tier model of culture and the difference between SC and safety climate.  The last topic is covered concisely and clearly: “. . . safety climate is an outward manifestation of culture. Therefore, safety culture includes safety climate, but safety culture uniquely includes shared values about risk and safety.” (p. 11)  SC attributes (from the Canadian Nuclear Safety Commission) are described.  Under attributes, the authors stress one of our basic beliefs, viz., “The importance of safety is made clear by the decisions managers make and how they allocate resources.” (p. 12)  The authors also summarize the characteristics of High Reliability Organizations, Low Accident Organizations, and James Reason’s model of SC and symptoms of poor SC.

The chapter on SC as a causal factor in accidents contains an interesting original analysis.  The authors reviewed reports on 17 offshore or petroleum related accidents (ranging from helicopter crashes to oil rig explosions) and determined for each accident which of four negative SC factors (Normalization of deviance, Tolerance of inadequate systems and resources, Complacency, Work pressure) were present.  The number of negative SC factors per accident ranged from 0 (three instances) to 4 (also three instances, including two familiar to Safetymatters readers: BP Texas City and Deepwater Horizon).  The negative factor that appeared in the most accidents was Tolerance of inadequate systems and resources (10) and the least was Work pressure (4).
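The authors' coding exercise reduces to a simple tally; the sketch below illustrates the bookkeeping (the accident names and factor assignments are invented placeholders, not the paper’s data):

```python
from collections import Counter

FACTORS = ["Normalization of deviance",
           "Tolerance of inadequate systems and resources",
           "Complacency",
           "Work pressure"]

# Hypothetical coding of accident reports to the negative SC factors
# judged present in each (placeholder entries only).
coded_reports = {
    "Accident A": ["Tolerance of inadequate systems and resources"],
    "Accident B": list(FACTORS),   # all four factors present
    "Accident C": [],              # no negative SC factors identified
}

factors_per_accident = {name: len(f) for name, f in coded_reports.items()}
accidents_per_factor = Counter(f for fs in coded_reports.values() for f in fs)

print(factors_per_accident)
print(accidents_per_factor.most_common())
```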

Assessing SC

 
The authors describe different SC assessment methods (questionnaires, interviews, focus groups, observations and document analysis) and cover the strengths and weaknesses of each method.  The authors note that no single method provides a comprehensive SC assessment and they recommend a multi-method approach.  This is familiar ground for Safetymatters readers; for other related posts, click on the “Assessment” label in the right hand column.

A couple of highlights stand out.  Under observations the authors urge caution: “The fact that people are being observed is likely to influence their behaviour [the well-known Hawthorne Effect] so the results need to be treated with caution. The concrete nature of observations can result in too much weight being placed on the results of the observation versus other methods.” (p. 37)  A strength of document analysis is that it can show how (and how well) the organization identifies and corrects its problems, another key artifact in our view.

Influencing SC

This chapter covers leadership and the regulator’s role.  The section on leadership is well-trod ground so we won’t dwell on it.  It is a major (but in our opinion not the only) internal factor that can influence the evolution of SC.  The statement that “Leaders also shape the safety culture through the allocation of resources” (p. 42) is worth repeating.

The section on regulatory influence is more informative and describes three methods: the regulator’s practices, promotion of SC, and enforcement of SC regulations.  Practices refer to the ways the regulator goes about its inspection and enforcement activities with licensees.  For example, the regulator can promote organizational learning by requiring licensees to have effective incident investigation systems and monitoring how effectively such systems are used in practice. (p. 44)  In the U.S. the NRC constantly reinforces SC’s importance and, through its SC Policy Statement, the expectation that licensees will strive for a strong SC.

Promoting SC can occur through research, education and direct provision of SC-related services.  Regulators in other countries conduct their own surveys of industry personnel to appraise safety climate or they assess an organization’s SC and report their findings to the regulated entity.**  (pp. 45-46)  The NRC both supports and cooperates with industry groups on SC research and sponsors the Regulatory Information Conference (which has a SC module).

Regulation of SC means just what it says.  The authors point out that direct regulation in the offshore industry is controversial. (p. 47)  Such controversy notwithstanding, Norway has developed  regulations requiring offshore companies to promote a positive SC.  Norway’s experience has shown that SC regulations may be misinterpreted or result in unintended consequences. (pp. 48-50)  In the nuclear space, regulation of SC is a popular topic outside the U.S.; the IAEA even has a document describing how to go about it, which we reviewed on May 15, 2013.  More formal regulatory oversight of SC is being developed in Romania and Belgium.  We reported on the former on April 21, 2014 and the latter on June 23, 2014.

Our Perspective

 
This paper is written by academics but intended for a more general audience; it is easy reading.  The authors score points with us when they say: “Importantly, safety culture moves the focus beyond what happened to offer a potential explanation of why it happened.” (p. 7)  Important factors such as management decision making and work backlogs are mentioned.  The importance of an effective CAP is hinted at.

The paper does have some holes.  Most importantly, it limits the discussion on influencing SC to leadership and regulatory behavior.  There are many other factors that can affect an organization’s SC including existing management systems; the corporate owner’s culture, goals, priorities and policies; market factors or economic regulators; and political pressure.  The organization’s reward system is referred to multiple times but the focus appears to be on lower-level personnel; the management compensation scheme is not mentioned.

Bottom line: This paper is a good introduction to SC attributes, assessments and regulation.


*  M. Fleming and N. Scott, “A Regulator’s Guide to Safety Culture and Leadership” (no date).

**  No regulations exist in these cases; the regulator assesses SC and then uses its influence and persuasion to affect regulated entity behavior.

Monday, June 9, 2014

DNFSB Observations on Safety Culture

The Defense Nuclear Facilities Safety Board (DNFSB) has been busy in the safety culture (SC) space.  First, their Chairman’s May 7, 2014 presentation on preventing major accidents provides a window into how the DNFSB views safety management and SC in the DOE complex.  Second, the DNFSB’s meeting on May 28, 2014 heard presentations on SC concepts from industry and government experts.  This post reviews and provides our perspective on both events. 

Chairman’s Presentation

This presentation was made at a DOE workshop.*  Chairman Winokur opened with some examples of production losses that followed incidents at DOE facilities and concluded the cost of safety is small compared to the cost of an accident.  He went on to discuss organizational factors that can set the stage for accidents or promote improved safety performance.  Some of these factors are tied to SC and will be familiar to Safetymatters readers.  They include the following:

Leadership

The presentation quotes Schein: “The only thing of real importance that leaders do is to create and manage culture.” (p. 13)  This quote is used by many in the nuclear industry to support a direct and complete connection between leadership and an organization’s culture.   While effective leadership is certainly necessary, we have long argued for a more nuanced view, viz., that leaders influence but do not unilaterally define culture.  In fact, on the same page in Organizational Culture, Schein says “Culture is the result of a complex group learning process that is only partially influenced by leader behavior.” **

Budget and production pressures and rewards that favor mission over safety


As Winokur pointed out, it is unfortunately true that poor safety performance (accidents and incidents) can attract resources while good safety performance can lead to resources being redirected.  Good safety performance becomes taken for granted and is largely invisible.  “Always focus on balancing mission and safety.  There will always be trade-offs, but safety should not get penalized for success.” (p. 19) 

On our part, we feel like we’ve been talking about goal conflicts forever.  The first step in addressing goal conflicts is to admit they exist, always have and probably always will.  The key to resolving them is not by issuing a safety policy, it is to assure that an entity’s decision making process and its reward and compensation system treat safety with the priority it warrants. 

Decision making

Winokur says “Understand the nature of low-probability, high-consequence accidents driven by inadequate control of uncertainty, not cause-effect relationships . . .” (p. 14) and “Risk-informed decision making can be deceptive; focus on consequences, as well as probabilities.” (p. 16)  These observations are directly compatible with Nicholas Taleb: “This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty.”***  See our June 18, 2013 post for a discussion of decisions that led to high-consequence (i.e., really bad) outcomes at Crystal River, Kewaunee and San Onofre.

There is no additional material in the presentation for a few important factors, so we will repeat earlier Safetymatters commentary on these topics.    

Complacency and accumulated residual risks that erode the safety margin


We have pointed out how organizations, especially high reliability organizations, strive to maintain mindfulness and combat complacency.  Complacency leads to hubris (“It can’t happen here”) and opens the door for the drift toward failure that occurs with normalization of deviance, constant environmental adaptations, “normal” system performance excursions, group think and an irreducible tendency for SC to decay over time.

Lack of oversight

This refers to everyone who has the responsibility to provide competent, timely, incisive assessment of an entity’s activities but fails to do so.  Their inaction or incompetence neither reinforces a strong SC nor prods a weak SC to improve. 

DNFSB Hearing with SC Expert Presentations

This was "the first of two hearings the Board will convene to address safety culture at Department of Energy defense nuclear facilities and the Board’s Recommendation 2011–1, Safety Culture at the Waste Treatment and Immobilization Plant."****  This hearing focused on presentations by SC experts: Sonya Haber (an SC consultant to DOE), NRC and NASA.  The experts’ slide presentations and a video of the hearing are available here.

Haber hit the right buttons in her presentation but neither she nor anyone else mentioned her DOE client's failure to date to integrate the SC assessments and self-assessments DOE initiated at various facilities in response to Recommendation 2011-1.  We still don’t know whether WTP SC problems exist elsewhere in the DOE complex.  We commented on the DOE’s response to 2011-1 on January 25, 2013 and March 31, 2014.

Winokur asked Haber about the NRC's "safety first" view vs. the DOE's "mission/safety balance."  The question suggests he may be thinking the "balance" perspective gives the DOE entities too much wiggle room to short change safety in the name of mission.

The NRC presenter was Stephanie Morrow.  Her slides recited the familiar story of the evolution of the SC Policy Statement and its integration into the Reactor Oversight Process.  She showed a new figure that summarized NRC’s SC interests in different columns of the ROP action matrix.  Chairman Winokur asked multiple questions about how much direction the NRC gives the licensees in how to perform SC assessments.  The answer was clear: In the NRC’s world, SC is the licensee's responsibility; the NRC looks for adequacy in the consideration of SC factors in problem resolution and SC assessments.  Morrow basically said if DNFSB is too prescriptive, it risks ending up "owning" the facility SC instead of the DOE and facility contractor.

Our Perspective

The Chairman’s presentation addressed SC in a general sense.  However, the reality of the DOE complex is a formidable array of entities that vary widely in scope, scale and missions.  A strong SC is important across the complex but one-size-fits-all approaches probably won’t work.  On the other hand, the custom fit approach, where each entity has flexibility to build its SC on a common DOE policy foundation doesn’t appear to lead to uniformly good results either.  The formal hearing to receive presentations from SC industry experts evidences that the DNFSB is gathering information on what works in other fields.  

Bottom line: The DNFSB is still trying to figure out the correct balance between prescription and flexibility in its effort to bring DOE to heel on the SC issue.  SC is a vital part of the puzzle of how to increase DOE line management effectiveness in ensuring adequate safety performance at DOE facilities.


*  P.S. Winokur, “A User’s Guide to Preventing Major Accidents,” presentation at the 2014 Nuclear Facility Safety Programs Annual Workshop (May 7, 2014).  The workshop was sponsored by the DOE Office of Environment, Health, Safety, and Security.  Thanks to Bill Mullins for bringing this presentation to our attention.

**  E. Schein, Organizational Culture and Leadership (San Francisco, CA: Jossey-Bass, 2004), p. 11.

***  N. Taleb, The Black Swan (New York: Random House, 2007), p. 211.

****  DNFSB May 28, 2014 Public Hearing on Safety Culture and Board Recommendation 2011-1.

Monday, November 11, 2013

Engineering a Safer World: Systems Thinking Applied to Safety by Nancy Leveson

In this book* Leveson, an MIT professor, describes a comprehensive approach for designing and operating “safe” organizations based on systems theory.  The book presents the criticisms of traditional incident analysis methods, the principles of system dynamics, and essential safety-related organizational characteristics, including the role of culture, in one place; this review emphasizes those topics.  It should be noted the bulk of the book describes her accident causality model and how to apply it, including extensive case studies; this review does not fully address that material.

Part I
     
Part I sets the stage for a new safety paradigm.  Many contemporary socio-technical systems exhibit, among other characteristics, rapidly changing technology, increasing complexity and coupling, and pressures that put production ahead of safety. (pp. 3-6)   Traditional accident analysis techniques are no longer sufficient.  They too often focus on eliminating failures, esp. component failures or “human error,” instead of concentrating on eliminating hazards. (p. 10)  Some of Leveson's critique of traditional accident analysis echoes Dekker (esp. the shortcomings of Newtonian-Cartesian analysis, reviewed here).**   We devote space to Leveson's criticisms because she provides a legitimate perspective on techniques that comprise some of the nuclear industry's sacred cows.

Event-based models are simply inadequate.  There is subjectivity in selecting both the initiating event (the failure) and the causal chains backwards from it.  The root cause analysis often stops at the first root cause that is familiar, amenable to corrective action, difficult to get beyond (usually the human operator or other human role) or politically acceptable. (pp. 20-24)  Reason's Swiss cheese model is insufficient because of its assumption of direct, linear relationships between components. (pp. 17-19)  In addition, “event-based models are poor at representing systemic accident factors such as structural deficiencies in the organization, management decision making, and flaws in the safety culture of the company or industry.” (p. 28)

Probabilistic Risk Assessment (PRA) studies specified failure modes in ever greater detail but ignores systemic factors.  “Most accidents in well-designed systems involve two or more low-probability events occurring in the worst possible combination.  When people attempt to predict system risk, they explicitly or implicitly multiply events with low probability—assuming independence—and come out with impossibly small numbers, when, in fact, the events are dependent.  This dependence may be related to common systemic factors that do not appear in an event chain.  Machol calls this phenomenon the Titanic coincidence . . . The most dangerous result of using PRA arises from considering only immediate physical failures.” (pp. 34-35)  “. . . current [PRA] methods . . . are not appropriate for systems controlled by software and by humans making cognitively complex decisions, and there is no effective way to incorporate management or organizational factors, such as flaws in the safety culture, . . .” (p. 36) 
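A bit of arithmetic makes the independence trap vivid (our numbers, purely illustrative):

```python
p_a = 1e-3  # probability of failure A
p_b = 1e-3  # probability of failure B

# Independence assumption: multiply the marginals.
p_joint_independent = p_a * p_b        # 1e-06, an "impossibly small" number

# Dependence through a common systemic factor: once A has occurred,
# B becomes far more likely (illustrative conditional probability).
p_b_given_a = 0.5
p_joint_dependent = p_a * p_b_given_a  # 5e-04, 500 times larger

print(p_joint_independent, p_joint_dependent)
```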

The search for operator error (a fall guy who takes the heat off of system designers and managers) and hindsight bias also contribute to the inadequacy of current accident analysis approaches. (p. 38)  In contrast to looking for an individual's “bad” decision, Leveson says “the study of decision making cannot be separated from a simultaneous study of the social context, the value system in which it takes place, and the dynamic work process it is intended to control.” (p. 46) 

Leveson says “Systems are not static. . . . they tend to involve a migration to a state of increasing risk over time.” (p. 51)  Causes include adaptation in response to pressures and the effects of multiple independent decisions. (p. 52)  This is reminiscent of  Hollnagel's warning that cost pressure will eventually push production to the edge of the safety boundary.

When accidents or incidents occur, Leveson proposes that analysis should search for reasons (the Whys) rather than blame (usually defined as Who) and be based on systems theory. (pp. 55-56)  In a systems view, safety is an emergent property, i.e., system safety performance cannot be predicted by analyzing system components. (p. 64)  Some of the goals for a better model include analysis that goes beyond component failures and human errors, is more scientific and less subjective, includes the possibility of system design errors and dysfunctional system interactions, addresses software, focuses on mechanisms and factors that shape human behavior, examines processes and allows for multiple viewpoints in the incident analysis. (pp. 58-60) 

Part II

Part II describes Leveson's proposed accident causality model based on systems theory: STAMP (Systems-Theoretic Accident Model and Processes).  For our purposes we don't need to spend much space on this material.  “The model includes software, organizations, management, human decision-making, and migration of systems over time to states of heightened risk.”***   It attempts to achieve the goals listed at the end of Part I.

STAMP treats safety in a system as a control problem, not a reliability one.  Specifically, the overarching goal “is to control the behavior of the system by enforcing the safety constraints in its design and operation.” (p. 76)  Controls may be physical or social, including cultural.  There is a good discussion of the hierarchy of control in a complex system and the impact of possible system dynamics, e.g., time lags, feedback loops and changes in control structures. (pp. 80-87)  “The process leading up to an accident is described in STAMP in terms of an adaptive feedback function that fails to maintain safety as system performance changes over time to meet a complex set of goals and values.” (p. 90)
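Leveson's control framing invites a toy simulation.  Below is a minimal sketch (our construction, not from the book; all parameter values are invented) of a system whose risk migrates upward under production pressure while a controller enforces a safety constraint through feedback that arrives after a reporting lag.  With a long lag, the system drifts well past the constraint before corrective action takes hold:

```python
def simulate(lag_steps: int, horizon: int = 60) -> float:
    """Return the maximum risk reached when feedback is delayed by lag_steps."""
    drift = 0.02       # upward risk migration per step (production pressure)
    constraint = 0.5   # safety constraint the controller tries to enforce
    gain = 0.3         # strength of corrective action
    risk, history = 0.0, []
    for _ in range(horizon):
        history.append(risk)
        # The controller sees the risk level from lag_steps ago.
        observed = history[-1 - lag_steps] if len(history) > lag_steps else 0.0
        correction = gain * max(0.0, observed - constraint)
        risk = max(0.0, risk + drift - correction)
    return max(history)

print(simulate(lag_steps=0))   # settles just above the constraint (~0.57)
print(simulate(lag_steps=15))  # delayed feedback allows overshoot (~0.84)
```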

Leveson describes problems that can arise from an inaccurate mental model of a system or an inaccurate model displayed by a system.  There is a lengthy, detailed case study that uses STAMP to analyze a tragic incident, in this case a friendly fire accident where a U.S. Army helicopter was shot down by an Air Force plane over Iraq in 1994.

Part III

Part III describes in detail how STAMP can be applied.  There are many useful observations (e.g., problems with mode confusion on pp. 289-94) and detailed examples throughout this section.  Chapter 11 on using a STAMP-based accident analysis illustrates the claimed advantages of  STAMP over traditional accident analysis techniques. 

We will focus on chapter 13, “Managing Safety and the Safety Culture,” which covers the multiple dimensions of safety management, including safety culture.

Leveson's list of the components of effective safety management is mostly familiar: management commitment and leadership, safety policy, communication, strong safety culture, safety information system, continual learning, education and training. (p. 421)  Two new components need a bit of explanation, a safety control structure and controls on system migration toward higher risk.  The safety control structure assigns specific safety-related responsibilities to management, system designers and operators. (pp. 436-40)  One of the control structure's responsibilities: “the potential reasons for and types of migration toward higher risk need to be identified and controls instituted to prevent it.” (pp. 425-26)  Such an approach should be based on the organization's comprehensive hazards analysis.****

The safety culture discussion is also familiar. (pp. 426-33)  Leveson refers to the Schein model, discusses management's responsibility for establishing the values to be used in decision making, the need for open, non-judgmental communications, the freedom to raise safety questions without fear of reprisal and widespread trust.  In such a culture, Leveson says an early warning system for migration toward states of high risk can be established.  A section on Just Culture is taken directly from Dekker's work.  The risk of complacency, caused by inaccurate risk perception after a long history of success, is highlighted.

Although these management and safety culture contents are generally familiar, what's new is relating them to systems concepts such as control loops and feedback and taking a systems view of the safety control system.

Our Perspective
 

Overall, we like this book.  It is Leveson's magnum opus, 500+ pages of theory, rationale, explanation, examples and infomercial.  The emphasis on the need for a systems perspective and a search for Why accidents/incidents occur (as opposed to What happened or Who is at fault) is consistent with what we've been saying on this blog.  The book explains and supports many of the beliefs we have been promoting on Safetymatters: the shortcomings of traditional (but commonly used) methods of incident investigation; the central role of decision making; and how management commitment, financial and non-financial rewards, and a strong safety culture contribute to system safety performance.
 

However, there are only a few direct references to nuclear.  The examples in the book are mostly from aerospace, aviation, maritime activities and the military.  Establishing a safety control structure is probably easier to accomplish in a new aerospace project than in an existing nuclear organization with a long history (aka memory),  shifting external pressures, and deliberate incremental changes to hardware, software, policies, procedures and programs.  Leveson does mention John Carroll's (her MIT colleague) work at Millstone. (p. 428)  She praises nuclear LER reporting as a mechanism for sharing and learning across the industry. (pp. 406-7)  In our view, LERs should be helpful but they are short on looking at why incidents occur, i.e., most LER analysis does not look at incidents from a systems perspective.  TMI is used to illustrate specific system design/operation problems.
 

We don't agree with the pot shots Leveson takes at High Reliability Organization (HRO) theorists.  First, she accuses HRO of confusing reliability with safety, in other words, an unsafe system can function very reliably. (pp. 7, 12)  But I'm not aware of any HRO work that has been done in an organization that is patently unsafe.  HRO asserts that reliability follows from practices that recognize and contain emerging problems.  She takes another swipe at HRO when she says HRO suggests that, during crises, decision making migrates to frontline workers.  Leveson's problem with that is “the assumption that frontline workers will have the necessary knowledge and judgment to make decisions is not necessarily true.” (p. 44)  Her position may be correct in some cases but as we saw in our review of CAISO, when the system was veering off into new territory, no one had the necessary knowledge and it was up to the operators to cope as best they could.  Finally, she criticizes HRO advice for operators to be on the lookout for “weak signals.”  In her view, “Telling managers and operators to be “mindful of weak signals” simply creates a pretext for blame after a loss event occurs.” (p. 410)  I don't think it's pretext but it is challenging to maintain mindfulness and sense faint signals.  Overall, this appears to be academic posturing and feather fluffing.
 

We offer no opinion on the efficacy of using Leveson's STAMP approach.  She is quick to point out a very real problem in getting organizations to use STAMP: its lack of focus on finding someone/something to blame means it does not help identify subjects for discipline, lawsuits or criminal charges. (p. 86)
 

In Leveson's words, “The book is written for the sophisticated practitioner . . .” (p. xviii)  You don't need to run out and buy this book unless you have a deep interest in accident/incident analysis and/or are willing to invest the time required to determine exactly how STAMP might be applied in your organization.


*  N.G. Leveson, Engineering a Safer World: Systems Thinking Applied to Safety (The MIT Press, Cambridge, MA: 2011)  The link goes to a page where a free pdf version of the book can be downloaded; the pdf cannot be copied or printed.  All quotes in this post were retyped from the original text.


**  We're not saying Dekker or Hollnagel developed their analytic viewpoints ahead of Leveson; we simply reviewed their work earlier.  These authors are all aware of others' publications and contributions.  Leveson includes Dekker in her Acknowledgments and draws from Just Culture: Balancing Safety and Accountability in her text. 

***  Nancy Leveson informal bio page.


****  “A hazard is a system state or set of conditions that, together with a particular set of worst-case environmental conditions, will lead to an accident.” (p. 157)  The hazards analysis identifies all major hazards the system may confront.  Baseline safety requirements follow from the hazards analysis.  Responsibilities are assigned to the safety control structure for ensuring baseline requirements are not violated while allowing changes that do not raise risk.  The identification of system safety constraints allows the possibility of identifying leading indicators for a specific system. (pp. 337-38)

Wednesday, July 24, 2013

Leadership, Culture and Organizational Performance

As discussed in our July 18, 2013 post, INPO's position is that creating and maintaining a healthy safety culture (SC) is a primary leadership responsibility.*  That seems like a common sense belief but is it based on any social science?  What is the connection between leader behavior and culture?  And what is the connection between culture and organizational performance? 

To help us address these questions, we turn to a paper** by some Stanford and UC Berkeley academics.  They review the relevant literature and present their own research and findings.  This paper is not a great fit with nuclear power operations but some of the authors' observations and findings are useful.  One might think there would be ample materials on this important topic but “only a very few studies have actually explored the interrelationships among leadership, culture and performance.” (p. 33)

Leaders and Culture


Leaders can be described by different personality types.  Note this does not focus on specific behavior, e.g., how they make decisions, but the attributes of each personality type certainly imply the kinds of behavior that can reasonably be expected.  The authors contend “. . . the myriad of potential personality and value constructs can be reliably captured by five essential personality constructs, the so-called Big Five or the Five Factor Model . . .” (p. 6)  You have all been exposed to the Big 5, or a similar, taxonomy.  An individual may exhibit attributes from more than one type but can ultimately be classified as primarily representative of one specific type.  The five types are listed below, with a few selected attributes for each.
  • Agreeableness (Cooperative, Compromising, Compassionate, Trusting)
  • Conscientiousness (Orderly, Reliable, Achievement oriented, Self-disciplined, Deliberate, Cautious)
  • Extraversion (Gregarious, Assertive, Energy, Optimistic)
  • Neuroticism (Negative affect, Anxious, Impulsive, Hostile, Insecure)
  • Openness to Experience (Insightful, Challenge convention, Autonomous, Resourceful)

Leaders can affect culture and later we'll see that some personality types are associated with specific types of organizational culture.  “While not definitive, the evidence suggests that personality as manifested in values and behavior is associated with leadership at the CEO level and that these leader attributes may affect the culture of the organization, although the specific form of these relationships is not clear.” (p. 10)  “. . . senior leaders, because of their salience, responsibility, authority and presumed status, have a disproportionate impact on culture, . . .” (p. 11)

Culture and Organizational Performance

Let's begin with a conclusion: “One of the most important yet least understood questions is how organizational culture relates to organizational performance.” (p. 11)

To support their research model, the authors describe a framework, similar to the Big 5 for personality, for summarizing organizational cultures.  The Organizational Culture Profile (OCP) features seven types of culture, listed below with a few selected attributes for each. 

  • Adaptability (Willing to experiment, Taking initiative, Risk taking, Innovative)
  • Collaborative (Team-oriented, Cooperative, Supportive, Low levels of conflict)
  • Customer-oriented (Listening to customers, Being market driven)
  • Detail-oriented (Being precise, Emphasizing quality, Being analytical)
  • Integrity (High ethical standards, Being honest)
  • Results-Oriented (High expectations for performance, Achievement oriented, Not easy going)
  • Transparency (Putting the organization’s goals before the unit, Sharing information freely)
The linkage between culture and performance is fuzzy.  “While the strong intuition was that organizational culture should be directly linked to firm effectiveness, the empirical results are equivocal.” (p. 14)  “[T]he association of culture and performance is not straightforward and likely to be contingent on the firm’s strategy, the degree to which the culture promotes adaptability, and how widely shared and strongly felt the culture is.” (p. 17)  “Further compounding the issue is that the relationship between culture and firm performance has been shown to vary across industries.” (p. 11)  Finally, “although the [OCP] has the advantage of identifying a comprehensive set of cultural dimensions, there is no guarantee that any particular dimension will be relevant for a particular firm.” (p. 18)  I think it's fair to summarize the culture-performance literature by saying “It all depends.” 

Research Results

The authors gathered and analyzed data on a group of high-technology firms: CEO personalities based on the Big 5 types, cultural descriptions using the OCP, and performance data.  Firm performance was based on financial metrics, firm reputation (an intangible asset) and employee attitudes.*** (pp. 23-24)

“[T]he results reveal a number of significant relationships between CEO personality and firm culture, . . . CEOs who were more extraverted (gregarious, assertive, active) had cultures that were more results-oriented. . . . CEOs who were more conscientious (orderly, disciplined, achievement-oriented) had cultures that were more detail-oriented . . . CEOs who were higher on openness to experience (ready to challenge convention, imaginative, willing to try new activities) [were] more likely to have cultures that emphasized adaptability. (p. 26)

“Cultures that were rated as more adaptable, results-oriented and detail-oriented were seen more positively by their employees. Firms that emphasized adaptability and were more detail-oriented were also more admired by industry observers.” (p. 28)

In sum, the linkage between leadership and performance is far from clear.  But “consistent patterns of [CEO] behavior shape interpretations of what’s important [values] and how to behave. . . . Other research has shown that a CEO’s personality may affect choices of strategy and structure.” (p. 31)

Relevance to Nuclear Operations


As mentioned in the introduction, this paper is not a great fit with the nuclear industry.  The authors' research focuses on high-technology companies, there is nothing SC-specific and their financial performance metrics (more important to firms in highly competitive industries) are more robust than their non-financial measures.  Safety performance is not mentioned.

But their framework stimulates us to ask important questions.  For example, based on the research results, what type of CNO would you select for a plant with safety performance problems?  How about one facing significant economic challenges?  Or one where things are running smoothly?  Based on the OCP, what types of culture would be most supportive of a strong SC?  Would any types be inconsistent with a strong SC?  How would you categorize your organization's culture?  

The authors suggest that “Senior leaders may want to consider developing the behaviors that cultivate the most useful culture for their firm, even if these behaviors do not come naturally to them.” (p. 35)  Is that desirable or practical for your CNO?

The biggest challenge to obtaining generalizable results, which the authors recognize, is that so many driving factors are situation-specific, i.e., dependent on a firm's industry, competitive position and relative performance.  They also recognize a possible weakness in linear causality, i.e., the leadership → culture → performance logic may not be one-way.  In our systems view, we'd say there are likely feedback loops, two-way influence flows and additional relevant variables in the overall model of the organization.

The linear (Newtonian) viewpoint promoted by INPO suggests that culture is mostly (solely?) created by senior executives.  If only it were that easy.  Such a view “runs counter to the idea that culture is a social construct created by many individuals and their behavioral patterns.” (p. 10)  We believe culture, including SC, is an emergent organizational property created by the integration of top-down activities with organizational history, long-serving employees, and strongly held beliefs and values, including the organization's “real” priorities.  In other words, SC is a result of the functioning over time of the socio-technical system.  In our view, a CNO can heavily influence, but not unilaterally define, organizational culture including SC.



*  As another example of INPO's position, a recent presentation by an INPO staffer ends with an Ed Schein quote: “...the only thing of real importance that leaders do is to create and manage culture...”  The quote is from Schein's Organizational Culture and Leadership (San Francisco, CA: Jossey-Bass, 1985), p. 2.  The presentation was A. Daniels, “How to Continuously Improve Cultural Traits for the Management of Safety,” IAEA International Experts’ Meeting on Human and Organizational Factors in Nuclear Safety in the Light of the Accident at the Fukushima Daiichi Nuclear Power Plant, Vienna May 21-24, 2013.
 

**  C. O’Reilly, D. Caldwell, J. Chatman and B. Doerr, “The Promise and Problems of Organizational Culture: CEO Personality, Culture, and Firm Performance,” working paper (2012).  Retrieved July 22, 2013.  To enhance readability, in-line citations have been removed from quotes.

***  The authors report “Several studies show that culture is associated with employee attitudes . . . ” (p. 14)

Wednesday, May 15, 2013

IAEA on Instituting Regulation of Licensee Safety Culture

The International Atomic Energy Agency (IAEA) has published a how-to report* for regulators who want to regulate their licensees' safety culture (SC).  This publication follows a series of meetings and workshops, some of which we have discussed (here and here).  The report is related to IAEA projects conducted “under the scope of the Regional Excellence Programme on Safe Nuclear Energy–Norwegian Cooperation Programme with Bulgaria and Romania. These projects have been implemented at the Bulgarian and Romanian regulatory bodies” (p. 1).

The report covers SC fundamentals, regulatory oversight features, SC assessment approaches, data collection and analysis.  We'll review the contents, highlighting IAEA's important points, then provide our perspective.

SC fundamentals

The report begins with the fundamentals of SC, starting with Schein's definition of SC and his three-level model of artifacts, espoused values and basic assumptions.  Detail is added with a SC framework based on IAEA's five SC characteristics:

  • Safety is a clearly recognized value
  • Leadership for safety is clear
  • Accountability for safety is clear
  • Safety is integrated into all activities
  • Safety is learning driven.
The SC characteristics can be described using specific attributes.

Features of regulatory oversight of SC 


This covers what the regulator should be trying to achieve.  It's the most important part of the report, so we excerpt the IAEA's words.

“The objective of the regulatory oversight of safety culture, focused on a dynamic process, is to consider and address latent conditions that could lead to potential safety performance degradation at the licensees’ nuclear installations. . . . Regulatory oversight of safety culture complements compliance-based control [which is limited to looking at artifacts] with proactive control activities. . . . ” (p. 6, emphasis added)

“[R]egulatory oversight of safety culture is based on three pillars:

Common understanding of safety culture. The nature of safety culture is distinct from, and needs to be dealt with in a different manner than a compliance-based control. . . .

Dialogue. . . . dialogue is necessary to share information, ideas and knowledge that is often qualitative. . . .

Continuousness. Safety culture improvement needs continuous engagement of the licensee. Regulatory oversight of safety culture therefore ideally relies on a process during which the regulator continuously influences the engagement of the licensee.” (p. 7)

“With regards to safety culture, the regulatory body should develop general requirements and enforce them in order to ensure the authorized parties have properly considered these requirements. On the other hand, the regulatory body should avoid prescribing detailed level requirements.” (p. 8)  The licensee always has the primary responsibility for safety.

Approaches for assessing SC

Various assessment approaches are currently being used or reviewed by regulatory bodies around the world. These approaches include: self-assessments, independent assessments, interaction with the licensee at a senior level, focused safety culture on-site reviews, oversight of management systems and integration into regulatory activities.  Most of these activities are familiar to our readers but a couple merit further definition.  The “management system” is the practices, procedures and people.**  “Integration into regulatory activities” means SC-related information is also collected during other regulatory actions, e.g., routine or special inspections.

The report includes a table (recreated below) summarizing, for each assessment approach, the accuracy of the resulting SC picture, the effort required, the degree of licensee management involvement, and the human and organizational factors (HOF) and SC skills needed.  Accuracy is judged as realistic, medium or limited; the other criteria are rated high, medium or low.  The table thus shows the relative strengths and weaknesses of each approach.

  • Self-assessment review.  Accuracy: Medium.  Effort: Low (depending on who initiates the self-assessment, regulator or licensee).  Management involvement: Low (to understand deliverables).  HOF & SC skills: Medium (high experience and skills of the reviewers are assumed).
  • Independent assessment review.  Accuracy: Medium.  Effort: Low.  Management involvement: Low (to understand deliverables).  HOF & SC skills: Medium (high experience and skills of the reviewers are assumed).
  • Interaction with the licensee at senior level.  Accuracy: Limited (however, can support a shared understanding).  Effort: Medium.  Management involvement: High.  HOF & SC skills: Medium.
  • Focused safety culture on-site review.  Accuracy: Realistic (gives depth in a moment of time).  Effort: High.  Management involvement: Medium.  HOF & SC skills: High.
  • Oversight of management system implementation.  Accuracy: Medium (reduced if only formal aspects are considered).  Effort: Low.  Management involvement: Low.  HOF & SC skills: Medium.
  • Integration into regulatory activities.  Accuracy: Medium (when properly trended and analyzed).  Effort: Medium (after an intensive initial introduction).  Management involvement: Medium (with intensive initial support).  HOF & SC skills: Medium (specific training requirement and experience sharing).

Data collection, analysis and presentation of findings to the licensee

The report encourages regulators to use multiple assessment approaches and multiple data collection methods and data sources.  Data collection methods include observations; interviews; reviews of events, licensee documents and regulator documents; discussions with management; and other sources such as questionnaires, surveys, third-party documents and focus groups.  The goal is to approach the target from multiple angles.  “The aim of data analysis is to build a safety culture picture based on the inputs collected. . . . It is a set of interpreted data regarding the organizational practices and the priority of safety within these practices.” (p. 17)

Robust data analysis “requires iterations [and] multi-disciplinary teams. A variety of expertise (technical, human and organizational factors, regulations) are necessary to build a reliable safety culture picture. . . . [and] protect against bias inherent to the multiple sources of data.” (p. 17)

The regulator's picture of SC is discussed with the licensee during periodic or ad hoc meetings.  The objective is to reach agreement on next steps, including implementation of any actions and licensee commitments arising from the meetings.

Our perspective

The SC content is pretty basic stuff, with zero new insight.  From our viewpoint, the far more interesting issue is the extension of regulatory authority into an admittedly soft, qualitative area.  This issue highlights the fact that the scope of regulatory authority is established by decisions that have socio-political, as well as technical, components.  SC is important, and certainly regulatable.  If a country wants to regulate nuclear SC, then have at it, but there is no hard science that says it is a necessary or even desirable thing to do.

Our big gripe is with the hypocrisy displayed by the NRC, which has a SC policy, not a regulation, but in some cases implements all the steps associated with regulatory oversight discussed in this IAEA report (except evaluation of management personnel).  For evidence, look at how they have been putting Fort Calhoun and Palisades through the wringer.


*  G. Rolina (IAEA), “Regulatory oversight of safety culture in nuclear installations,” IAEA TECDOC-1707 (Vienna: International Atomic Energy Agency, 2013).

**  A management system is a “set of interrelated or interacting elements (system) for establishing policies and objectives and enabling the objectives to be achieved in an efficient and effective way. . . . These elements include the structure, resources and processes. Personnel, equipment and organizational culture as well as the documented policies and processes are parts of the management system.” (p. 30)