Showing posts with label Pithy.

Saturday, April 1, 2017

Totally Nude, Naked Nuclear Safety Culture

I admit it.  The title is a cheap April Fools’ trick to draw new, perhaps less conventional, visitors to Safetymatters.  The only thing you’ll see here is the naked truth about nuclear safety culture (NSC), which we have been preaching about for years.

We’ve repeatedly listed the ingredients for a strong NSC: decision-making that recognizes goal conflicts and establishes clear, consistent safety priorities; an effective corrective action program; a mental model of organizational functioning that considers interrelationships and feedback loops among key variables; a compensation plan that rewards safety performance; and leadership that walks the talk on NSC.

We’ve also said that, absent constant maintenance, NSC will invariably decay over time because of complacency and system dynamics.  Complacency leads to hubris (“It can’t happen here”) and opens the door for the drift toward failure that occurs with normalization of deviance and groupthink.  System dynamics include constant environmental adaptations, goal conflicts, shifting priorities, management incentives tilted toward production and cost achievements, and changing levels of intra-organizational trust.

In practice, NSC appears to approach the ideal asymptotically, never quite reaching it.  Problems still occur; currently Entergy, TVA and AREVA are in the hot seat.  We have to ask: Is the industry’s steady-state NSC a low-intensity game of Whac-a-Mole?  You be the judge.

Thursday, January 17, 2013

Adm. Hyman Rickover – Systems Thinker

The TMI-2 accident occurred in 1979. In 1983 the plant owner, General Public Utilities Corp. (GPU), received a report* from Adm. Hyman Rickover (the “Father of the Nuclear Navy”) recommending that GPU be permitted by the NRC to restart the undamaged TMI Unit 1 reactor. We are not concerned with the report's details or conclusions but one part caught our attention.

The report begins by describing Rickover's seven principles for successful nuclear operation. One of these principles is the “Concept of Total Responsibility” which he explains as follows: “Operating nuclear plants safely requires adherence to a total concept wherein all elements are recognized as important and each is constantly reinforced. Training, equipment maintenance, technical support, radiological control, and quality control are essential elements but safety is achieved through integrating them effectively in operating decisions.” (p. 9, emphasis added)

We think the foregoing sounds like version 1.0 of points we have been emphasizing in this blog, namely:
  • Performance over time is the result of relationships and interactions among organizational components, in other words, the system is what's important.
  • Decisions are where the rubber meets the road in terms of goals, priorities and resource allocation; the extant safety culture provides a context for decision-making.
  • Safety performance is an emergent organizational property, a result of system activities, and cannot be predicted by examining individual system components.
We salute Adm. Rickover for his prescient insights.


* Adm. H.G. Rickover, “An Assessment of the GPU Nuclear Corporation Organization and Senior Management and Its Competence to Operate TMI-1” (Nov. 19, 1983). Available from the Dickinson College library.

Thursday, November 1, 2012

Practice Makes Perfect

In this post we call attention to a recent article from The Wall Street Journal* that highlights an aspect of safety culture “learning” that may be under-appreciated in the approaches currently in vogue in the nuclear industry.  The gist of the article is that, just as practice is useful in mastering complex, physically challenging activities, it may also have value in honing the skills needed to handle complex socio-technical issues.

“Research has established that fast, simple feedback is almost always more effective at shaping behavior than is a more comprehensive response well after the fact. Better to whisper ‘Please use a more formal tone with clients, Steven’ right away than to lecture Steven at length on the wherefores and whys the next morning.”

Our sense is that current efforts to instill safety culture norms and values tend toward after-the-fact lectures and “death by PowerPoint” approaches.  As the article correctly points out, the goal should be “shaping behavior,” something that benefits from feedback: “An explicit request can normalize the idea of ‘using’ rather than passively ‘taking’ feedback.”

It’s not a long article, so we hope readers will just go ahead and click on the link below.

*  Lemov, D., “Practice Makes Perfect—And Not Just for Jocks and Musicians,” Wall Street Journal online (Oct. 26, 2012).

Friday, October 26, 2012

Communicating Change

One of our readers suggested we look at Communicating Change* by T.J. and Sandar Larkin.  The Larkins are consultants, so I was somewhat skeptical of finding any value for safety culture, but they have significant experience and cite enough third-party references (think: typical Wikipedia item) to give the book some credibility.

The book presents three principles for effectively communicating change, i.e., delivering a top-down message that ultimately results in better performance or acceptance of necessary innovations, workplace disruptions or future unknowns.

Transmit the message through the first-line supervisors.  They will be the ones who have to explain the message and implement the changes on a day-to-day basis after the executives board their plane and leave.  Senior management initiatives to communicate directly with workers undermine supervisors’ influence.

Communicate face-to-face.  Do not rely on newsletters, videos, mass e-mail and other one-way communication techniques; the message is too easily ignored or misunderstood.  Face-to-face, from the boss, may be even more important in the age of social media where people can be awash in a sea of (often conflicting) information.

Make changes relevant to each work area, i.e., give the supervisor the information, training and tools necessary to explain exactly what will change for the local work group, e.g., different performance standards, methods, equipment, etc.

That’s it, although the book goes on for almost 250 pages to justify the three key principles and explain how they might be implemented.  (The book is full of examples and how-to instructions.)

Initially I thought this approach was too simplistic, i.e., it wouldn’t help anyone facing the challenge of trying to change safety-related behavior.  But simple can cut through the clutter of well-meaning but complicated change programs, one-size-fits-all media and training, and repetitive assessments.

This book is not the complete answer but it does provide a change agent with some perspective on how one might go about getting the individual contributors (trade, technical or professional) at the end of the food chain to understand, respond to and eventually internalize required behavioral changes. 

Please contact us if you have a suggestion for a resource that you’d like us to review.


*  T. Larkin and S. Larkin, Communicating Change: Winning Employee Support for New Business Goals (New York: McGraw-Hill, 1994).

Saturday, May 26, 2012

Most of Us Cheat—a Little

A recent Wall Street Journal essay* presented the author’s research into patterns of cheating.  He found that a few people are honest, a few people are total liars and most folks cheat a little.  Why?  “. . . the behavior of almost everyone is driven by two opposing motivations. On the one hand, we want to benefit from cheating and get as much money and glory as possible; on the other hand, we want to view ourselves as honest, honorable people. Sadly, it is this kind of small-scale mass cheating, not the high-profile cases, that is most corrosive to society.”

This behavioral tendency can present a challenge to maintaining a strong safety culture.  Fortunately, the author found one type of intervention that decreased the incidence of lying: “. . . reminders of morality—right at the point where people are making a decision—appear to have an outsize effect on behavior.”  In other words, asking subjects to think about the Ten Commandments or the school honor code before starting the research task resulted in less cheating.  So did having people sign their insurance forms at the top, before reporting their annual mileage, rather than at the bottom, after the fudging had already been done.  Preaching and teaching about safety culture have a role, but the focus should be on the point where safety-related decisions are made and actions occur.

I don’t want to oversell these findings.  Most of the research involved individual college students, not professionals working in large organizations with defined processes and built-in checks and balances.  But the findings do suggest that zero tolerance for certain behaviors has its place.  As the author concludes: “. . . although it is obviously important to pay attention to flagrant misbehaviors, it is probably even more important to discourage the small and more ubiquitous forms of dishonesty . . . This is especially true given what we know about the contagious nature of cheating and the way that small transgressions can grease the psychological skids to larger ones.”


*  D. Ariely, “Why We Lie,” Wall Street Journal online (May 26, 2012). 

Monday, January 10, 2011

Pick Any Two

Last week the principal findings of the BP Oil Spill Presidential Commission were released.  Not surprisingly, the commission cited root causes that were “systemic,” decisions made without adequate consideration of risks, and failures of regulatory oversight.  It also cited the lack of a safety culture at the companies involved in the Deepwater Horizon disaster.  We came across an interesting entry among the comments on a New York Times article by John Broder, “Blunders Abounded Before Gulf Oil Spill, Panel Says” (Jan. 5, 2011), and thought it was worth passing on.

Comment No. 7 of 66, submitted by Jim S. of Cleveland, January 5, 2011, 7:23 pm:

“A fundamental law of engineering (or maybe of the world in general) is ‘Cheaper, Faster, Better: Pick Any Two.’

“Clearly those involved, whether deliberately or by culture, chose Cheaper and Faster.”

Thursday, December 23, 2010

Ian Richardson, Safety Culturist

Ian Richardson, the British actor, may not be the first person who leaps to mind as someone with insight into the art of assessing safety culture.  But in an episode in the second volume of the BBC series “House of Cards,” Richardson’s character, the UK Prime Minister, observes:

“You can get people to say anything you want them to say, but getting them to do what they say is another thing.”

And with that thought, we wish you Happy Holidays.

Thursday, September 17, 2009

Quote of the Day

I came across the following in a discussion forum related to Davis-Besse issues.

“So it appears that man is capable of controlling the climate, but not the atom.  God is laughing.”

While not exactly on point for Safetymatters, it was irresistible.

Wednesday, July 29, 2009

"Beaten to Death by Croutons"

The Bookshelf column in the July 27, 2009 Wall Street Journal reviews “Say Everything,” a book about blogging.  The reviewer comments that “reading blogs is like being beaten to death by croutons.”  We hope that readers of our blog do not experience such a fate.  The column goes on to note that the best blogs are those that are concise, current, and precisely targeted.  That is the goal for this blog, and we hope it is being met.