Thursday, November 29, 2012

The Mouse Runs Up the Clock (at Massey Energy)

We are all familiar with the old nursery rhyme: “Hickory, dickory, dock, the mouse ran up the clock.”  It may be an apt description of federal criminal prosecution climbing the corporate ladder in the Massey coal mine explosion investigation.  As reported in the Nov. 28, 2012 Wall Street Journal,* the former president of a Massey operating unit unrelated to the Upper Big Branch mine has agreed to plead guilty to felony conspiracy charges, including directing employees to violate safety laws.  The former president is cooperating with prosecutors (in other words, look out above) and, as noted in the Journal article, “The expanded probe ‘strongly suggests’ prosecutors are ‘looking at top management’…”  Earlier this year, a former superintendent at the Upper Big Branch mine pleaded guilty to conspiracy charges.

Federal prosecutors allege that safety rules were routinely violated to maximize profits.  As stated in the Criminal Information against the former president, “Mine safety and health laws were routinely violated at the White Buck Mines and at other coal mines owned by Massey, in part because of a belief that consistently following those laws would decrease coal production.” (Criminal Information, p. 4)**  The Information goes on to state:  “Furthermore, the issuance of citations and orders by MSHA [Mine Safety and Health Administration], particularly certain kinds of serious citations and orders, moved the affected mine closer to being classified as a mine with a pattern or potential pattern of violations.  That classification would have resulted in increased scrutiny of the affected mine by MSHA…” (Crim. Info. p.5)  Thus it is alleged that not only production priorities - the core objective of many businesses - but even the potential for increased scrutiny by a regulatory authority was sufficient to form the basis for a conspiracy. 

Every day, managers and executives in high-risk businesses make decisions to sustain or improve production and to minimize their operation’s exposure to higher levels of regulatory scrutiny.  The vast majority of those decisions are legitimate and neither compromise safety nor inhibit regulatory functions.  Extreme examples that do violate safety and legal requirements, such as the Massey case, are easy to spot.  But one might begin to wonder: where exactly is the boundary between the legitimate pursuit of these objectives and decisions or actions that might (later) be interpreted as intended to compromise safety or regulation?  How important is perception to drawing that boundary, where context can frame a decision or action in markedly different colors?  Suppose in the Massey situation the former president, instead of providing advance warnings and (apparently) explicitly tolerating safety violations, had merely limited the funding of safety activities, or just squeezed total budgets?  Same or different?

*  K. Maher, “Mine-Safety Probe Expands,” Wall Street Journal online (Nov. 28, 2012); may be available only to subscribers.

**  U.S. District Court Southern District of West Virginia, “Criminal Information for Conspiracy to Defraud the United States: United States of America v. David C. Hughart” (Nov. 28, 2012).

Tuesday, November 20, 2012

BP/Deepwater Horizon: Upping the Stakes

Anyone who thought safety culture and safety decision making were an institutional artifact, or mostly a matter of regulatory enforcement, might want to take a close look at what is happening on the BP/Deepwater Horizon front these days. Three BP employees have been criminally indicted, and two of those indictments bear directly on safety in operational decisions. The indictments of the well-site leaders, the most senior BP personnel on the platform, accuse them of causing the deaths of 11 crewmen aboard the Deepwater Horizon rig in April 2010 through gross negligence, primarily by misinterpreting a crucial pressure test that should have alerted them that the well was in trouble.*

The crux of the matter relates to the interpretation of a pressure test to determine whether the well had been properly sealed prior to being temporarily abandoned. Apparently BP’s own investigation found that the men had misinterpreted the test results.

The indictment states, “The Well Site Leaders were responsible for...ensuring that well drilling operations were performed safely in light of the intrinsic danger and complexity of deepwater drilling.” (Indictment p.3)

The following specific actions are cited as constituting gross negligence: “...failed to phone engineers onshore to advise them ...that the well was not secure; failed to adequately account for the abnormal readings during the testing; accepted a nonsensical explanation for the abnormal readings, again without calling engineers onshore to consult…” (Indictment p.7)

The willingness of federal prosecutors to advance these charges should (and perhaps is intended to) send a chill down the spine of every manager in high-risk industries. While gross negligence is a relatively high standard, and may or may not be provable in the BP case, the actions cited in the indictment may not sound all that extraordinary: failure to consult with onshore engineers, failure to account for “abnormal” readings, acceptance of a “nonsensical” explanation. Whether this amounts to “reckless” or willful disregard for a known risk is a matter for the legal system. As an article in the Wall Street Journal notes, “There were no federal rules about how to conduct such a test at the time. That has since changed; federal regulators finalized new drilling rules last week that spell out test procedures.”**

The indictment asserts that the men violated the “standard of care” applicable to the deepwater oil exploration industry. One might ponder what federal prosecutors think the “standard of care” is for the nuclear power generation industry.

Clearly the well-site leaders made a serious misjudgment, one that turned out to have catastrophic consequences. But then consider the statement by the Assistant Attorney General that the accident was caused by “BP’s culture of privileging profit over prudence.” (WSJ article)  Are there really a few simple, direct causes of this accident, or is this an example of a highly complex system failure? Where does culpability for culture lie?  Stay tuned.

* U.S. District Court Eastern District of Louisiana, “Superseding Indictment for Involuntary Manslaughter, Seaman's Manslaughter and Clean Water Act: United States of America v. Robert Kaluza and Donald Vidrine,” Criminal No. 12-265.

** T. Fowler and R. Gold, “Engineers Deny Charges in BP Spill,” Wall Street Journal online (Nov. 18, 2012).

Thursday, November 1, 2012

Practice Makes Perfect

In this post we call attention to a recent article from The Wall Street Journal* that highlights an aspect of safety culture “learning” that may not be appreciated with approaches currently in vogue in the nuclear industry.  The gist of the article is that, just as practice is useful in mastering complex, physically challenging activities, it may also have value in honing the skills inherent in complex socio-technical issues.

“Research has established that fast, simple feedback is almost always more effective at shaping behavior than is a more comprehensive response well after the fact. Better to whisper ‘Please use a more formal tone with clients, Steven’ right away than to lecture Steven at length on the wherefores and whys the next morning.”

Our sense is that current efforts to instill safety culture norms and values tend toward after-the-fact lectures and “death by PowerPoint” approaches.  As the article correctly points out, it is “shaping behavior” that should be the goal, something that benefits from feedback: “An explicit request can normalize the idea of ‘using’ rather than passively ‘taking’ feedback.”

It’s not a long article so we hope readers will just go ahead and click on the link below.

*  Lemov, D., “Practice Makes Perfect—And Not Just for Jocks and Musicians,” Wall Street Journal online (Oct. 26, 2012).