Friday, March 30, 2012

The Safety Culture Common Language Path Forward—and the Broken Window at the Nuclear Power Plant

The NRC has an initiative, the Safety Culture Common Language Path Forward, to describe safety culture (SC) elements at a more detailed level than the NRC’s SC policy statement.  There was a workshop for agency and industry representatives, and a conference call was scheduled for today to discuss next steps.  The draft “elements” from the workshop are a mélange of polished bureaucratese, company policy proclamations and management homilies.*

I was curious to see how they treated the areas I have been harping on as critical for effective SC: decision making, corrective action, management incentives and work backlogs.  Following is my very subjective rating of how well the draft elements cover the key subject areas.

Decision making – Mostly Satisfactory.  “The licensee makes safety-significant or risk-significant decisions using a systematic process.” (Shoop, p. 18)  We agree; in fact, we think ALL significant decisions should be made using a systematic process.  Why “systematic”?  To evidence transparency and robustness, i.e., to maximize the odds that a different decision-maker, if faced with a similar situation, will reach the same or a similar answer.  However, one important type of decision, the resolution of goal conflicts, needs improvement.  The draft treats goal conflict mainly as a matter of personal or professional disagreement; the big-picture potential conflict of safety vs. production, cost or schedule gets only slight mention.

Corrective action – Satisfactory.  There are a lot of words about corrective action and the CAP, and they cover the important points.  A minor gripe is that the term “safety” may be overused when referring to identifying, evaluating or correcting problems.  Such overuse risks creating the impression that (1) only safety-related problems need thorough treatment or (2) anything someone wants done must have some relation, no matter how tenuous, to safety.

Management incentives – Minimally Acceptable.  “Senior management incentive program [sic] reflect a bias toward long-term plant performance and safety.” (Shoop, p. 12)  One could say more about this topic (and we have, including here and here), but the statement gets over the bar.

Backlogs – Unsatisfactory.  The single mention of backlogs is “Maintaining long term plant safety by . . . ensuring maintenance and engineering backlogs which are low enough [to] support safety” (Shoop, p. 32), and even that comes at the tail end of a list of factors contributing to plant safety.  Backlogs are much more important than that.  Excessive backlogs are demoralizing; they tell the workforce that the work of keeping the plant, its procedures and its support processes in good repair and up to date is not important.  Every “problem plant” we worked on in the late 1990s had backlog issues.  This is where the title reference to the broken window comes in.

“. . . if a window in a building is broken and is left unrepaired, all the rest of the windows will soon be broken. . . . one unrepaired broken window is a signal that no one cares, and so breaking more windows costs nothing.”**

Excessive backlogs are a broken window. 


*  NRC memo from U.S. Shoop to J. Giitter, “Safety Culture Common Language Path Forward” (Mar. 19, 2012), ADAMS ML12072A415.

**  J.Q. Wilson and G.L. Kelling, “Broken Windows: The police and neighborhood safety,” The Atlantic Monthly (Mar. 1982).

1 comment:

  1. Bob,

    I think the search along your four vectors highlights a crucial point. In any producing enterprise, none of those four vectors is severable into a "safety" and "non-safety" component.

    The point you make about fuzziness regarding resolution of goal conflicts has as much to do with chronic input fuzziness as it has to do with decision maker competence or "single-point" accountability.

    And the single-minded focus of attitudes on the protection of design intent ignores the reality that budget decisions, tradeoffs regarding modifications, staffing development and such are mostly about inferring from limited data what conditions may prevail years into the future.

    You noted: "A minor gripe is the term “safety” may be overused when referring to identifying, evaluating or correcting problems."

    I would suggest that you stopped short in the scope of your gripe. There is an overwhelming sense of the "Do Safety, Work when Safe Enough (Maybe)" ethos that I've come to term the Duty of Prevention.

    There are the makings of imprudent whole-risk reckoning when you couple that oversight preoccupation with this: "Nuclear is Recognized as Special and Unique – The special characteristics of nuclear technology are taken into account in all decisions and actions."

    Any in-depth look at the Deepwater Horizon saga will suggest that the challenges of staying on top of Complex, High-Consequence Circumstances are by no means unique to nuclear power generation.

    Having looked at this latest effort to "crowd source" Nuclear Safety Culture into existence without a shred of basis in culture research, I continue to conclude this is a foolish exercise with substantial downside potential and little upside.

    It all makes about as much sense as telling a seaman, "When in doubt, sail to port."

