Thursday, February 28, 2013

The Safety Culture Common Language Path Forward (Update)

The intent of the NRC's Safety Culture Common Language Path Forward initiative is to describe safety culture (SC) attributes at a more detailed level than the NRC’s SC policy statement.  On January 29-30, 2013 the NRC held a public workshop to finalize the draft SC common language.*  The document they issued after the workshop** contains attribute definitions and examples of behavior and artifacts that support or embody each attribute.  This document will be used by the NRC to develop a NUREG containing the final common language.

Last March we posted on a draft produced by previous workshops, focusing on areas we consider critical for a strong SC: decision making, corrective action, management incentives and work backlogs.  In that post, our opinion was that decision making and corrective action were addressed in a satisfactory manner, the treatment of incentives was minimally acceptable and backlogs were all but ignored.

So, how does the “final” language treat the same subject areas?  Is it better than the draft we reviewed last March?  The arrows indicate whether the final version is better ↑, the same → or worse ↓.

Decision making – Good ↑.  The decision-making attribute incorporates “. . . a consistent, systematic approach to make decisions” (p. 51) and a conservative bias, i.e., “. . . decision-making practices that emphasize prudent choices over those that are simply allowable. A proposed action is determined to be safe in order to proceed, rather than unsafe in order to stop.” (p. 52)  In addition, communicating, explaining and justifying individual decisions is mentioned throughout the document.

Goal conflict is addressed under leader behavior “. . . when resolving apparent conflicts between nuclear safety and production” (p. 12) and leaders “avoid unintended or conflicting messages that may be conveyed by operational decisions” (p. 37); work process “activities are coordinated to address conflicting or changing priorities.” (p. 23) 

Corrective action – Satisfactory →.  The section on problem identification and resolution (pp. 13-17) is suffused with desirable characteristics of corrective actions and the CAP.  A good CAP has a low threshold for identifying issues, and problems are thoroughly evaluated.  Corrective actions are timely, effective and prevent recurrence of problems.  Periodic analysis of CAP and other data is used to identify any programmatic or common cause issues.

Management incentives – Unsatisfactory ↓.  The section on incentives appears to focus on workers, not managers: “Leaders ensure incentives, sanctions, and rewards are aligned with nuclear safety policies and reinforce behaviors and outcomes which reflect safety as the overriding priority.” (p. 7)  This is even less complete than the single sentence that appeared in last year's draft: “Senior management incentive program [sic] reflect a bias toward long-term plant performance and safety.”*** The failure to mention the senior management incentive program is a serious shortcoming.

Backlogs – Minimally Acceptable ↑.  Backlogs are specifically mentioned in maintenance and engineering (p. 24) and document changes (p. 25).  In addition, problem evaluation, corrective actions, CAP trending analyses, operating experience lessons and many administrative activities are supposed to be addressed in a “timely” manner.  I hope that implies that backlogs in these areas should not be too large.

But attention to backlogs is still important.  Repeating what we said last year, “Excessive backlogs are demoralizing; they tell the workforce that accomplishing work to keep the plant, its procedures and its support processes in good repair or up-to-date is not important.  Every ‘problem plant’ we worked on in the late 1990s had backlog issues.”

Conclusion

Overall, this latest document is an improvement over the March 2012 version but still short of what we'd like to see.


*  M.J. Keefe (NRC) to U.S. Shoop (NRC), “Summary of the January 29-30, Workshop to Develop Common Language for Safety Culture” (Feb. 7, 2013)  ADAMS ML13038A059.

**  Nuclear Safety Culture Common Language 4th Public Workshop January 29-31, 2013  ADAMS ML13031A343.

***  U.S. Shoop (NRC) to J. Giitter (NRC), “Safety Culture Common Language Path Forward” (Mar. 19, 2012) p. 12.  ADAMS ML12072A415.

Friday, February 22, 2013

Personal and Organizational Habits: A Threat to Safety Culture?

A book I received as a gift got me thinking about habits: The Power of Habit: Why We Do What We Do in Life and Business by Charles Duhigg.*  Following is a summary of selected points that we can relate to safety culture (SC) and our assessment of the book's usefulness for SC aficionados.

Habits are automatic activity sequences people exhibit when they perceive specific triggering cues in the environment.  Habit behavior is learned, and directed toward achieving some reward, which may be physical or psychological.  The brain creates habits to conserve energy and operate more efficiently; without habits people would be overwhelmed by the countless decisions they would have to make to complete the most mundane tasks, e.g., driving to work.

People use habits at work to increase their productivity and get things done.  Unfortunately, habits can allow potential safety threats to slip through the cracks.  How?  Because while Rational Man considers all available alternatives before making a decision, and Satisficing Man consciously picks the first alternative that looks good enough, Habit Man is carrying out his behavior more or less unconsciously.  If the work environment contains weak signals of nascent problems or external environmental threats, then people following their work habits are not likely to pick up such signals.  Bad work habits may be the handmaiden of complacency.
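
To make that contrast concrete, here is a minimal sketch (our own illustration, not something from Duhigg's book) of the three decision styles treated as selection strategies; the function and variable names are ours.

```python
# Our own illustration (not from the book): three decision styles applied to
# the same set of alternatives, each scored 0-1 for how well it fits the task.

from typing import Callable

Alternative = str
Score = Callable[[Alternative], float]

def rational_choice(options: list[Alternative], score: Score) -> Alternative:
    """Rational Man: evaluate every alternative, pick the best."""
    return max(options, key=score)

def satisficing_choice(options: list[Alternative], score: Score,
                       good_enough: float = 0.7) -> Alternative:
    """Satisficing Man: take the first alternative that clears the bar."""
    for opt in options:
        if score(opt) >= good_enough:
            return opt
    return options[-1]  # nothing cleared the bar; settle for the last one

def habitual_choice(cue: str, habits: dict[str, Alternative],
                    options: list[Alternative], score: Score) -> Alternative:
    """Habit Man: if the cue matches a stored routine, run it without looking
    at the alternatives at all -- which is how weak signals get missed."""
    if cue in habits:
        return habits[cue]  # no evaluation, no chance to notice that conditions changed
    choice = satisficing_choice(options, score)
    habits[cue] = choice    # today's conscious choice becomes tomorrow's habit
    return choice
```

The point is in the last function: once a cue maps to a stored routine, the alternatives are never re-examined, so a change in the environment, including a weak signal of trouble, goes unnoticed.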

Organizations also have habits (sometimes called routines).  Routines are important because, without them, it would be much more difficult to get work accomplished.  Routines reduce uncertainty throughout the organization and create truces between competing groups and individuals.  Some routines are the result of decisions made long ago, others evolve organically.  They are so embedded in the organization that no one questions them.**

Duhigg includes many case studies involving individuals and organizations.  One organizational case study is worth repeating because it focuses on changing safety habits.

When Paul O'Neill*** became Alcoa CEO in 1987 he made improving worker safety his first initiative.  He believed the habits that led to safety were keystone habits and that, if they could be changed (improved), other business routines would follow.  In this case, he was correct.  Proper work routines are also the safest ones; over time quality and productivity improved and the stock price rose.  The new routines resulted in new values (e.g., intolerance for unsafe practices) becoming ingrained in the culture.

The bottom line

I'd put this book in the self-help category—the strongest sections focus on individuals, how they can be crippled by bad habits, and how they can change those habits.  With the exception of the Alcoa case, this book is not really about SC, so I'm not recommending it for our readers, but it does stimulate thought about the role of unconscious habits and routines in reinforcing a strong SC, or facilitating its decay.  If work habits or routines become frozen and cannot (or will not) adjust to changes in the external or task environment, then performance problems will almost surely arise.


*  C. Duhigg,  The Power of Habit: Why We Do What We Do in Life and Business (New York: Random House 2012).  To simplify this post and focus on a linkage to SC, many of the book's concepts are not mentioned in the main text above.  For example, when the brain links the reward back to the cue, it creates a neurological craving; the stronger the craving, the more likely the cue will trigger the activities that lead to the reward.  Bad habits can be changed by inserting a new activity routine between the cue and the reward.  A belief that change is possible is needed before people will attempt to change their habits; willpower and self-discipline are necessary for changes to stick.  A real (or manufactured) crisis can make organizational routines amenable to change.

**  The result can be the worst kind of machine bureaucracy: rigid hierarchies, organizational silos, narrow employee responsibilities, and no information shared or questions asked.

***  O'Neill later served as U.S. Treasury Secretary during 2001-2002.

Sunday, February 10, 2013

Safety Culture - Lessons from the Social Science Literature

In 2011 the NRC contracted with the Pacific Northwest National Laboratory to conduct a review of social science literature related to safety culture (SC) and methods for evaluating interventions proposed to address issues identified during SC assessments.  The resultant report* describes how traits such as leadership, trust, respect, accountability, and continuous learning are discussed in the literature. 

The report is heavily academic but not impenetrable and a good reference work on organizational culture theory and research.  I stumbled on this report in ADAMS and don't know why it hasn't had wider distribution.  Perhaps it's seen as too complicated or, more importantly, doesn't exactly square with the NRC/NEI/industry Weltanschauung when the authors say things like:  

“There is no simple recipe for developing safety culture interventions or for assessing the likelihood that these interventions will have the desired effects.” (p. 2)

“The literature consistently emphasizes that effecting directed behavioral, cognitive, or cultural change in adults and within established organizations is challenging and difficult, requires persistence and energy, and is frequently unsuccessful.” (p. 7)

This report contains an extensive review of the literature and is impossible to summarize in a blog post.  We'll provide an overview of the content, focusing on interesting quotes and highlights, then revisit Schein's model and close with our two cents' worth.

Concept of safety culture

This section begins with the definition of SC and the nine associated traits in the NRC SC policy statement, and compares them with other organizations' (IAEA, NEI, DOE et al) efforts. 

The Schein model is proposed as a way to understand “why things are as they are” as a starting point upon which to build change strategies aimed at improving organizational performance.  An alternative approach is to define the characteristics of an ideal SC, then evaluate how much the target organization differs from the ideal, and use closing the gap as the objective for corrective strategies.  The NEI approach to SC assessment reflects the second conceptual model.  A third approach, said to bridge the difference between the first two, is proposed by holistic thinkers such as Reason who focus on overall organizational culture. 

This is not the usual “distinction without a difference” argument that academics often wage.  Schein's objective is to improve organizational performance; the idealists' objective is to make an organization correspond to the ideal model with an assumption that desired performance will follow. 
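
To see how different the second (idealist) approach is in practice, here is a minimal sketch of the gap-analysis logic it implies.  This is our own hypothetical illustration (the rating scale and function names are invented), not a method prescribed in the PNNL report.

```python
# Hypothetical sketch of the "ideal safety culture" assessment approach:
# rate each trait, compare the ratings to an ideal profile, and treat the
# gaps as the objectives for corrective strategies.  Illustration only.

IDEAL_SCORE = 5.0  # top of an assumed 1-5 rating scale

def trait_gaps(assessed: dict[str, float]) -> dict[str, float]:
    """Gap between the ideal and the assessed score for each SC trait."""
    return {trait: IDEAL_SCORE - score for trait, score in assessed.items()}

def intervention_priorities(assessed: dict[str, float]) -> list[str]:
    """Rank traits by gap size, largest first; 'closing the gap' becomes
    the objective, with an assumption that desired performance will follow."""
    gaps = trait_gaps(assessed)
    return sorted(gaps, key=gaps.get, reverse=True)
```

Nothing in the sketch asks why the culture is the way it is, which is exactly the difference Schein's approach is meant to address.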

The authors eventually settle on the high reliability organization (HRO) literature as providing the best basis for linking individual and organizational assumptions with traits and mechanisms for affecting safety performance.  Why?  The authors say the HRO approach identifies some of the specific mechanisms that link elements of a culture to safety outcomes and identifies important relationships among the cultural elements. (p. 15)  A contrary explanation is that the authors wanted to finesse their observation that Schein (beloved by NRC) and NEI have different views of the basis that should be used for designing SC improvement initiatives.

Building blocks of culture 


The authors review the “building blocks” of culture, highlighting areas that correspond to the NRC safety culture traits.  If an organization wants to change its culture, it needs to decide which building blocks to address and how to make and sustain changes.

Organizational characteristics that correspond to NRC SC traits include leadership, communication, work processes, and problem identification and resolution.  Leadership and communication are recognized as important in the literature and are discussed at length.  However, the literature review offered thin gruel in the areas of work processes, and problem identification and resolution; in other words, the connections between these traits and SC are not well-defined. (pp. 20-25)

There is an extensive discussion of other building blocks including perceptions, values, attitudes, norms**, beliefs, motivations, trust, accountability and respect.  Implications for SC assessment and interventions are described, where available.  Adaptive processes such as sense making and double-loop learning are also mentioned.

Change and change management

The authors review theories of individual and organizational change and change management.  They note that planned interventions need to consider other changes that may be occurring because of dynamic processes between the organization and its environment and within the organization itself.

Many different models for understanding and effecting organizational change are described.  As the authors summarize: “. . . change is variously seen as either pushed by problems or pulled by visions or goals; as purposive and volitional or inadvertent and emergent; as a one-time event or a continuous process. It is never seen as easy or simple.” (p. 43)

The authors favor Montaño and Kasprzyk’s Integrated Behavioral Model, shown in the figure below, as a template for designing and evaluating SC interventions.  It may be hard to read here, but suffice it to say a lot of factors go into an individual's decision to perform a new behavior and most or all of these factors should be considered by architects of SC interventions.  Leadership can provide input to many of these factors (through communication and modeling desired behavior, including decision making) and thus facilitate (or impede) desired behavioral changes.



[Figure: Integrated Behavioral Model, from Montaño and Kasprzyk]
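
Since the figure may not reproduce well, here is a rough, hypothetical rendering of the model's logic in code; the factor names, weights and thresholds are our paraphrase for illustration, not the authors' specification.

```python
# Rough, hypothetical rendering of the Integrated Behavioral Model's logic:
# intention to perform a new behavior is shaped by several factors, and
# intention alone is not enough -- knowledge, salience, habit and environmental
# constraints also gate the actual behavior.  Our paraphrase for illustration only.

from dataclasses import dataclass

@dataclass
class BehaviorFactors:
    attitude: float                 # 0-1: does the person feel favorably about the behavior?
    perceived_norm: float           # 0-1: do people who matter expect it?
    personal_agency: float          # 0-1: does the person feel able to do it?
    knowledge_and_skill: bool       # can they actually perform it?
    salience: float                 # 0-1: is the behavior on their mind when it matters?
    environmental_constraint: bool  # does the work environment block it?
    habit_strength: float           # 0-1: how strong is the competing old habit?

def likely_to_perform(f: BehaviorFactors) -> bool:
    """Crude gate: intention must be high AND the enabling factors must line up."""
    intention = (f.attitude + f.perceived_norm + f.personal_agency) / 3.0
    return (intention > 0.6
            and f.knowledge_and_skill
            and not f.environmental_constraint
            and f.salience > f.habit_strength)
```

The takeaway for intervention designers is the gating: even a willing individual may not perform the new behavior if a strong old habit, a missing skill or an environmental constraint gets in the way, and leaders influence several of these inputs.
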
Resistance to change can be widespread.  Effective leadership is critical to overcoming resistance and implementing successful cultural changes.  “. . . leaders in formal organizations have the power and responsibility to set strategy and direction, align people and resources, motivate and inspire people, and ensure that problems are identified and solved in a timely manner.” (p. 54)

Lessons from initiatives to create other specific organizational cultures

The authors review the literature on learning organizations, total quality management and quality organizations, and sustainable organizations for lessons applicable to SC initiatives.  They observe that this literature “is quite consistent in emphasizing the importance of recognizing that organizations are multi-level, dynamic systems whose elements are related in complex and multi-faceted ways, and that culture mirrors this dynamic complexity, despite its role in socializing individuals, maintaining stability, and resisting change.” (p. 61)

“The studies conducted on learning, quality, and sustainable organizations and their corresponding cultures contain some badly needed information about the relationship among various traits, organizational characteristics, and behaviors that could help inform the assessment of safety cultures and the design and evaluation of interventions.” (p. 65)  Topics mentioned include management leadership and commitment, trust, respect, shared vision and goals, and a supportive learning environment.

Designing and evaluating targeted interventions 


This section emphasizes the potential value of the evaluation science*** approach (used primarily in health care) for the nuclear industry.  The authors go through the specific steps for implementing the evaluation science model, drilling down in spots to describe additional tools, such as logic modeling (to organize and visualize issues, interventions and expected outcomes), that can be used.  There is a lot of detail here including suggestions for how the NRC might use backward mapping and a review of licensee logic models to evaluate SC assessment and intervention efforts.  Before anyone runs off to implement this approach, there is a major caveat:

“The literature on the design, implementation, and evaluation of interventions to address identified shortcomings in an organization’s safety culture is sparse; there is more focus on creating a safety culture than on intervening to correct identified problems.” (p. 67)
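
Setting the caveat aside, the logic-modeling idea mentioned above is easy to picture as a data structure.  The sketch below is our own hypothetical illustration of a logic model and of backward mapping from a desired outcome; it is not a tool taken from the report.

```python
# Hypothetical logic model entry: an intervention links resources and
# activities to outputs and to the outcomes it is expected to produce.
# "Backward mapping" starts from a desired outcome and works back to the
# interventions that claim to produce it.  Illustration only.

from dataclasses import dataclass, field

@dataclass
class LogicModelEntry:
    intervention: str
    inputs: list[str] = field(default_factory=list)      # resources committed
    activities: list[str] = field(default_factory=list)  # what will actually be done
    outputs: list[str] = field(default_factory=list)     # direct, countable products
    outcomes: list[str] = field(default_factory=list)    # cultural or performance changes sought

def backward_map(model: list[LogicModelEntry], desired_outcome: str) -> list[LogicModelEntry]:
    """Start from the outcome and work backward to the interventions that claim to produce it."""
    return [entry for entry in model if desired_outcome in entry.outcomes]
```

A regulator reviewing licensee logic models would essentially be auditing these claimed linkages and the evidence behind them.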

Relation to Schein

Schein's model of culture (shown on p. 8) and prescriptions for interventions are the construct most widely known to the nuclear industry and its SC practitioners.  His work is mentioned throughout the PNNL report.  Schein assumes that cultural change is a top-down effort (so leadership plays a key role) focused on individuals.  Change is implemented using an unfreeze—replace/move—refreeze strategy.  Schein's model is recommended in the program theory-driven evaluation science approach.  The authors believe Schein's “description of organizational culture and change does one of the best jobs of conveying the “cultural” dimensions in a way that conveys its embeddedness and complexity.” (p. 108)  The authors note that Schein's cultural levels interact in complex ways, requiring a systems approach that relates the levels to each other, SC to the larger organizational culture, and culture to overall organizational functioning.

So if you're acquainted with Schein you've got solid underpinnings for reading this report even if you've never heard of any of the over 300 principal authors (plus public agencies and private entities) mentioned therein.  If you want an introduction to Schein, we have posted on his work here and here.

Conclusion

This is a comprehensive and generally readable reference work.  SC practitioners should read the executive summary and skim the rest to get a feel for the incredible number of theorists, researchers and institutions who are interested in organizational culture in general and/or SC in particular.  The report will tell you what a culture consists of and how you might go about changing it.

We have a few quibbles.  For example, there are many references to systems but little reference to what we call systems thinking (an exception is Senge's mention of systems thinking on p. 58 and systems approach on p. 59).  There is no recognition of the importance of feedback loops.

The report refers multiple times to the dynamic interaction of the factors that comprise a SC but does not provide any model of those interactions.  There is limited connectivity between potentially successful interventions and desired changes in observable artifacts.  In other words, this literature review will not tell you how to improve your plant's decision making process or corrective action program, resolve goal conflicts or competing priorities, align management incentives with safety performance, or reduce your backlogs.


*  K.M. Branch and J.L. Olson, “Review of the Literature Pertinent to the Evaluation of Safety Culture Interventions” (Richland, WA: Pacific Northwest National Laboratory, Dec. 2011).  ADAMS ML13023A054

**  The authors note “The NRC safety culture traits could also be characterized as social norms.” (p. 28)

***  “. . . evaluation science focuses on helping stakeholders diagnose organization and social needs, design interventions, monitor intervention implementation, and design and implement an evaluation process to measure and assess the intended and unintended consequences that result as the intervention is implemented.” (p. 69)

Wednesday, January 30, 2013

Talking Sheep at Palisades

In Lewis Carroll’s Through the Looking Glass, Alice and the White Queen advance into the chessboard's fifth rank by crossing over a brook together, but at the very moment of the crossing, the Queen transforms into a talking sheep.  Alice soon finds herself struggling to handle the oars of a small rowboat, where the Sheep annoys her with nonsensical shouting.  Now consider the NRC’s Nov. 9, 2012 followup inspection report* at Palisades related to the DC panel event and the Service Water pump coupling failure.  It brings to mind a similar picture - in this case inspectors struggling to propel a small rowboat of substance on a river of nonsensical jargon and bureaucratese.

Reading this inspection report (IR) reveals endless repetition of process details and findings of other reports, and astonishingly little substance or basis for the inspectors' current findings and conclusions.  The IR “assesses” the findings of the Palisades root cause analysis and associated extent of condition and corrective actions.  The discussion is steeped in yellow findings, white findings, crosscutting this and cornerstone that, a liberal dose of safety culture traits and lots of significance determinations.  Frankly it’s hard to even remember what started the whole thing.  Perhaps of most interest, the IR notes that much of the Palisades management team was replaced in the period since these two events. (p. 23)  Why?  Were they deemed incompetent?  Unwilling to implement appropriate risk and safety priorities?  Or just sacrificial lambs (more sheep)?  It appears that these changes carried significant weight with the NRC inspectors, although it is not specifically stated.

Then there is this set of observations:

“During interviews the inspectors heard that there were concerns about staffing levels in multiple departments, but the site was aware and was actively working with Entergy corporate management to post and fill positions. . . Entergy Corporate was perceived by many on the site to be stifling progress in filling positions.  The many issues at Palisades and staffing problems have contributed to the organization becoming more reactive to addressing maintenance and equipment reliability issues versus being proactive in addressing possible problems.” (p. 23)

Which is it?  Was the site actively working with Entergy, or was Entergy stifling progress in filling positions?  Without further amplification or justification the IR delivers its conclusion: “The inspection team concluded the safety culture was adequate and improving.” (p. 24, emphasis added)  There is no discussion of how or on what basis the inspectors reached this conclusion.  In particular the finding of “improving” is hard to understand as it does not appear that this inspection team had previously assessed the safety culture at the site.

At one point the IR stumbles into a revealing and substantive issue that could provide significant insight into the problems at Palisades.  It describes another event at the plant with a lot of similarities to the DC panel. 

“The inspection team focused inspection efforts on ... an occurrence when, on May 14, 2012, workers erroneously placed a wire jumper between 115 Volt AC and 125 Volt DC circuits ...many of the actions and behaviors exhibited by the workers involved were similar in nature to the loss of DC bus event that occurred in September 2011...Those similar behaviors included the lack of a pre-job brief and discussion regarding the limitations of the work scope, workers taking action outside of the scope allowed by ‘toolpouch maintenance,’ supervisors failing to adequately challenge the workers, and workers proceeding in the face of uncertainty when unexpected conditions arose.” (p. 21)

So far so good.

“Many of the supervisors and managers the inspection team interviewed stated that the May 2012 near-miss was not a repeat event of the September 2011 event because the May 2012 near-miss involved only a handful of individuals, whereas the September 2011 occurrence involved multiple individuals across multiple organizations at Palisades. The inspectors agreed that the May 2012 near-miss involved fewer individuals, but there were individuals from several organizations involved in the near-miss. The inspectors concluded that the RCE assessment was narrow in that it stated only the field work team failed to internalize the cause and corrective actions from the September 2011 DC bus event. The inspectors concluded that other individuals, including the WCC SRO, CRS, and a non-licensed plant operator also exhibited behaviors similar to those of the September 2011 DC bus event.” (p. 21)

Still good but starting to wonder if the Palisades supervisors and managers really got the lessons learned from September 2011.

“The inspectors determined that, while the May 2012 near-miss shared some commonalities with the September 2011 event, the two conditions were not the result of the same basic causes. The inspectors reached this conclusion because the May 2012 near-miss did not result in a significant plant transient [emphasis added] and also did not exhibit the same site wide, organizational breakdowns in risk recognition and management that led to the September 2011 event.” (pp. 21-22)

Whoops.  First, what is the relevance of the outcome of the May 2012 event?  Why is it being alluded to as a cause?  Are the inspectors saying that if, in September 2011, the Palisades personnel had taken exactly the actions they took but had the good fortune not to let the breaker stab slip, it would not have been a significant safety event?

With regard to the extent of organizational breakdown, in the prior paragraph the inspectors had pushed back on this rationale - but now they conclude the May 2012 event is different because it was not “site-wide”.  It is not clear how to square these arguments, particularly if one goes back to the original root cause of the DC panel event:

“...senior leaders had not established a sufficiently sensitive culture of risk recognition and management, which resulted in the plant’s managers, supervisors, and workers not recognizing, accounting for, or preparing for the industrial safety risk and plant operational nuclear risk…” (p. 1) and, quoting from the licensee root cause analysis “site leadership at all levels was not sufficiently intrusive into work on panel ED-11-2.” (p. 13)

It is hard to see how the May 2012 event didn’t exhibit these same causes.  In addition, the “Why Staircase” in the Palisades root cause analysis (p. 21) does not identify or allude to the extent of involvement of multiple organizations - at all.  While we do not believe that such linear, “why” thinking is adequate for a complex system, it is the basis for what Palisades found and what the NRC inspectors accepted.

We’re not really sure what to make of this inspection effort.  On its face it doesn’t provide much of a basis for its conclusion that the safety culture is adequate and improving.  Perhaps the real basis is the new management team?  Or perhaps the NRC doesn’t really have many options in this situation.  If the current inspection found the weaknesses not to have been resolved, what could the NRC do?  Is there such a thing as an “inadequate” safety culture?  Or just safety cultures that need improvement?  It seems the NRC’s safety culture construct has created a Looking Glass-like inversion of reality - maybe a convenient trope within the agency but increasingly a baffling and unsatisfying distraction from achieving competent nuclear safety management.

Bottom line:  The NRC close out inspection is a baaaad report.


*  S. West (NRC) to A. Vitale (Entergy), “Palisades Nuclear Plant - NRC Supplemental Inspection Report 05000255/2012011; and Assessment Follow-up Letter” (Nov. 9, 2012) ADAMS ML12314A304.

Friday, January 25, 2013

Safety Culture Assessments: the Vit Plant vs. Other DOE Facilities

The Vit Plant

As you recall, the Defense Nuclear Facilities Safety Board (DNFSB) set off a little war with DOE when DNFSB published its blistering June 2011 critique* of the Hanford Waste Treatment Plant's (Vit Plant) safety culture (SC).  Memos were fired back and forth but eventually things settled down.  One of DOE's resultant commitments was to assess SC at other DOE facilities to see if SC concerns identified at the Vit Plant were also evident elsewhere.  Last month DOE transmitted the results of five assessments to DNFSB.**  The following facilities were evaluated:

• Los Alamos National Laboratory Chemistry and Metallurgy Research Replacement Project (Los Alamos)
• Y-12 National Security Complex Uranium Processing Facility Project (UPF)
• Idaho Cleanup Project Sodium Bearing Waste Treatment Project (Idaho)
• Office of Environmental Management Headquarters (EM)
• Pantex Plant
 


The same protocol was used for each of the assessments: DOE's Health, Safety and Security organization formed a team of its own assessors and two outside experts from the Human Performance Analysis Corporation (HPA).  Multiple data collection tools, including functional analysis, semi-structured focus group and individual interviews, observations and behaviorally anchored rating scales, were used to assess organizational behaviors.  The external experts also conducted a SC survey at each site.
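
For readers who have not encountered behaviorally anchored rating scales, the sketch below shows the basic idea: each point on a numeric scale is tied to a concrete, observable behavior.  It is a generic, hypothetical example, not the instrument HPA actually used.

```python
# Generic, hypothetical behaviorally anchored rating scale (BARS) for one
# trait.  Each scale point is anchored to an observable behavior rather than
# an abstract label.  Illustration only; not the HPA instrument.

RAISING_CONCERNS_BARS = {
    1: "Employees stay silent about safety concerns for fear of retaliation.",
    3: "Employees raise concerns through formal channels but rarely hear back.",
    5: "Employees raise concerns freely and see timely, visible resolution.",
}

def anchor_for(rating: int, scale: dict[int, str]) -> str:
    """Return the behavioral anchor at or just below the given rating."""
    eligible = [point for point in scale if point <= rating]
    return scale[max(eligible)] if eligible else scale[min(scale)]
```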

A stand-alone report was prepared for each facility, consisting of a summary and recommendation (ca. 5 pages) and the outside experts' report (ca. 25 pages).  The outside experts organized their observations and findings along the nine SC traits identified by the NRC, viz.,

• Leadership Safety Values and Actions
• Problem Identification and Resolution
• Personal Accountability
• Work Processes
• Continuous Learning
• Environment for Raising Concerns
• Effective Safety Communication
• Respectful Work Environment
• Questioning Attitude.

So, do Vit Plant SC concerns exist elsewhere?

That's up to the reader to determine.  The DOE submittal contained no meta-analysis of the five assessments, and no comparison to Vit Plant concerns.  As far as I can tell, the individual assessments made no attempt to focus on whether or not Vit Plant concerns existed at the reviewed facilities.

However, my back-of-the-envelope analysis (no statistics, lots of inference) of the reports suggests there are some Vit Plant issues that exist elsewhere but not to the degree that riled the DNFSB when it looked at the Vit Plant.  I made no effort to distinguish between issues mentioned by federal versus contractor employees, or by different contractors.  Following are the major Vit Plant concerns, distilled from the June 2011 DNFSB letter, and their significance at other facilities.

Schedule and/or budget pressure that can lead to suppressed issues or safety short-cuts
 

This is the most widespread and frequently mentioned concern.  It appears to be a significant issue at the UPF where the experts say “the project is being driven . . . by a production mentality.”  Excessive focus on financial incentives was also raised at UPF.  Some Los Alamos interviewees reported schedule pressure.  So did some folks at Idaho but others said safety was not compromised to make schedule; financial incentives were also mentioned there.  At EM, there were fewer comments on schedule pressure and at Pantex, interviewees opined that management shielded employees from pressure and tried to balance the message that both safety and production are important.

A chilled atmosphere adverse to safety exists

The atmosphere is cool at some other facilities, but it's hard to say the temperature is actually chilly.  There were some examples of perceived retaliation at Los Alamos and Pantex.  (Two Pantex employees reported retaliation for raising a safety concern; that's why Pantex, which was not on the original list of facilities for SC evaluation, was included.)  Fear of retaliation, but not actual examples, was reported at UPF and EM.  Fear of retaliation was also reported at Pantex. 

Technical dissent is suppressed

This is a minor issue.  There were some negative perceptions of the differing professional opinion (DPO) process at Los Alamos.  Some interviewees thought the DPO process at EM could be better utilized.  The experts said DPO needed to be better promoted at Pantex. 

Processes for raising and resolving SC-related questions exist but are neither trusted nor used

Another minor issue.  The experts said the procedures at Los Alamos should be reevaluated and enforced.

Conclusion

I did not read every word of this 155-page report, but it appears some facilities have issues akin to those identified at the Vit Plant, although their scope and/or intensity generally appear to be less.

The DOE submittal is technically responsive to the DNFSB commitment but is not useful without further analysis.  The submittal evidences more foot-dragging by DOE to cover up the likely fact that the Vit Plant's SC problems are more significant than other facilities' and to buy time to attempt to correct those problems.


* Defense Nuclear Facilities Safety Board, Recommendation 2011-1 to the Secretary of Energy, “Safety Culture at the Waste Treatment and Immobilization Plant” (June 9, 2011).  We have posted on the DOE-DNFSB imbroglio here, here and here.
   
**  G.S. Podonsky (DOE) to P.S. Winokur (DNFSB), letter transmitting five independent safety culture assessments (Dec. 12, 2012).

Monday, January 21, 2013

May Day

This is another in our series of posts following up on the Upper Big Branch coal mine disaster in April 2010. As reported in the Wall Street Journal,* a former superintendent at the Massey Energy mine, Gary May, was sentenced to 21 months in prison for his part in the accident. Specifically, May “warned miners that inspectors were coming and ordered subordinates to falsify a record book and disable a methane monitor so workers could keep mining coal.”

The U.S. attorney in charge of the case is basing criminal indictments on a conspiracy that he believes “certainly went beyond Upper Big Branch.” In other words, the government is working its way up the food chain at Massey, with lower-level managers such as May pleading guilty and cooperating with prosecutors. The developments here are worth keeping an eye on as it is relatively rare to see the string pulled so extensively in cases of safety failures at the operating level. The role and influence of senior executives will ultimately come under scrutiny and their culpability determined not by the slogans they promulgated but by their actions.


* K. Maher, “Former Mine Official Sentenced to 21 Months,” Wall Street Journal (Jan. 17, 2013).

Thursday, January 17, 2013

Adm. Hyman Rickover – Systems Thinker

The TMI-2 accident occurred in 1979. In 1983 the plant owner, General Public Utilities Corp. (GPU), received a report* from Adm. Hyman Rickover (the “Father of the Nuclear Navy”) recommending that GPU be permitted by the NRC to restart the undamaged TMI Unit 1 reactor. We are not concerned with the report's details or conclusions but one part caught our attention.

The report begins by describing Rickover's seven principles for successful nuclear operation. One of these principles is the “Concept of Total Responsibility” which he explains as follows: “Operating nuclear plants safely requires adherence to a total concept wherein all elements are recognized as important and each is constantly reinforced. Training, equipment maintenance, technical support, radiological control, and quality control are essential elements but safety is achieved through integrating them effectively in operating decisions.” (p. 9, emphasis added)

We think the foregoing sounds like version 1.0 of points we have been emphasizing in this blog, namely:
  • Performance over time is the result of relationships and interactions among organizational components, in other words, the system is what's important.
  • Decisions are where the rubber meets the road in terms of goals, priorities and resource allocation; the extant safety culture provides a context for decision-making.
  • Safety performance is an emergent organizational property, a result of system activities, and cannot be predicted by examining individual system components.
We salute Adm. Rickover for his prescient insights.


* Adm. H.G. Rickover, “An Assessment of the GPU Nuclear Corporation Organization and Senior Management and Its Competence to Operate TMI-1” (Nov. 19, 1983). Available from Dickinson College library here.