Here’s a trip down memory lane. Back in 2002 a report* on the “state of the art” in safety culture (SC) thinking, research and regulation was prepared for the NRC Advisory Committee on Reactor Safeguards. This post looks at some of the major observations of the 2002 report and compares them with what we believe is important today.
The report’s Abstract provides a clear summary of the report’s perspective: “There is a widespread belief that safety culture is an important contributor to the safety of operations. . . . The commonly accepted attributes of safety culture include good organizational communication, good organizational learning, and senior management commitment to safety. . . . The role of regulatory bodies in fostering strong safety cultures remains unclear, and additional work is required to define the essential attributes of safety culture and to identify reliable performance indicators.” (p. iii)
General Observations on Safety Performance
A couple of quotes included in the report reflect views on how safety performance is managed or influenced.
“"The traditional approach to safety . . . has been retrospective, built on precedents. Because it is necessary, it is easy to think it is sufficient. It involves, first, a search for the primary (or "root") cause of a specific accident, a decision on whether the cause was an unsafe act or an unsafe condition, and finally the supposed prevention of a recurrence by devising a regulation if an unsafe act,** or a technical solution if an unsafe condition." . . . [This approach] has serious shortcomings. Specifically, ". . . resources are diverted to prevent the accident that has happened rather than the one most likely to happen."” (p. 24)
“"There has been little direct research on the organizational factors that make for a good safety culture. However, there is an extensive literature if we make the indirect assumption that a relatively low accident plant must have a relatively good safety culture." The proponents of safety culture as a determinant of operational safety in the nuclear power industry rely, at least to some degree, on that indirect assumption.” (p. 37)
Plenty of people today behave in accordance with the first observation and believe (or act as if they believe) the second one. Both contribute to the nuclear industry’s unwillingness to consider new ways of thinking about how safe performance actually occurs.
Decision Making, Goal Conflict and the Reward System
Decision-making processes, the recognition of goal conflicts, and an organization’s reward system are important aspects of SC, and the report addressed them to varying degrees.
One author referenced had a contemporary view of decision making, noting that “in complex and ill-structured risk situations, decisionmakers are faced not only with the matter of risk, but also with fundamental uncertainty characterized by incompleteness of knowledge.” (p. 43) That’s true in great tragedies like Fukushima and in lesser unfortunate outcomes like the San Onofre steam generators.
Goal conflict was mentioned: “Managers should take opportunities to show that they will put safety concerns ahead of power production if circumstances warrant.” (p. 7)
Rewards should promote good safety practices (p. 6) and be provided for identifying safety issues. (p. 37) However, there is no mention of the executive compensation system. As we have argued ad nauseam, these systems often pay more for production than for safety.
The Role of the Regulator
“The regulatory dilemma is that the elements that are important to safety culture are difficult, if not impossible, to separate from the management of the organization. [However,] historically, the NRC has been reluctant to regulate management functions in any direct way.” (pp. 37-38) “Rather, the NRC " . . . infers licensee organization management performance based on a comprehensive review of inspection findings, licensee amendments, event reports, enforcement history, and performance indicators."” (p. 41) From this starting point, we now have the current situation where the NRC has promulgated its SC Policy Statement and practices de facto SC regulation using the highly reliable “bring me another rock” method.
The Importance of Context when Errors Occur
There are hints of modern thinking in the report. It contains an extended summary of Reason’s work in Human Error. The role of latent conditions, human error as consequence instead of cause, the obvious interaction between producers and production, and the “non-event” of safe operations are all mentioned. (p. 15) However, a “just culture” or other more nuanced views of the context in which safety performance occurs had yet to be developed.
One author cited described “the paradox that culture can act simultaneously as a precondition for safe operations and an incubator for hazards.” (p. 43) We see that in Reason and also in Hollnagel and Dekker: people going about business as usual with usually successful results but, on some occasions, with unfortunate outcomes.
The report’s author provided a good logic model for getting from SC attributes to identifying useful risk metrics, i.e., from SC to one or more probabilistic risk assessment (PRA) parameters. (pp. 18-20) But none of the research reviewed completed all the steps in the model. (p. 36) He concludes “What is not clear is the mechanism by which attitudes, or safety culture, affect the safety of operations.” (p. 43) We are still talking about that mechanism today.
But some things have changed. For example, probabilistic thinking has achieved greater penetration and is no longer the sole province of the PRA types. It’s accepted that Black Swans can occur (but not at our plant).
Bottom line: Every student of SC should take a look at this report. It includes a good survey of 20th century SC-related research in the nuclear industry and it’s part of our basic history.
“Those who cannot remember the past are condemned to repeat it.” — George Santayana (1863-1952)
* J.N. Sorensen, “Safety Culture: A Survey of the State-of-the-Art,” NUREG-1756 (Jan. 2002). ADAMS ML020520006. (Disclosure: I worked alongside the author on a major nuclear power plant litigation project in the 1980s. He was thoughtful and thorough, qualities that are apparent in this report.)
** We would add “or reinforcing an existing regulation through stronger procedures, training or oversight.”