
Tuesday, May 31, 2016

The Criminalization of Safety (Part 2)

Risky Business 

As we illustrated in Part 1 of this post, a new aspect of safety management risk is possible criminal liability for actions, or inactions, associated with events that did, or could have had, safety consequences.  While there has always been the potential for criminal liability, it has generally been directed at the corporate level rather than at individual employees.  Heretofore, “few executives have been on the hook, partly because it is tough for prosecutors to prove an individual had criminal intent in a corporate setting where decision-making is spread among many.” 1,2

The Justice Department has been making a new push to target individuals more frequently to hold them accountable for corporate malfeasance.  Much of the criminal liability in recent years has been cropping up in industries other than nuclear, as illustrated in the summary table in Part 1.  The Deepwater Horizon drill rig explosion and the Massey Coal explosion at the Upper Big Branch mine have been leading examples.  More recently, the series of scandals involving automobile manufacturers is adding to the record.  And the Flint water contamination situation is also evolving rapidly.  We’ll discuss the significance of these cases and how they could impact the conduct of individuals responsible for safe nuclear operations and the role of regulation.  In particular, we’ll consider under what circumstances criminal liability may attach and whether the potential to be held criminally liable is an effective force in assuring compliant behaviors and, ultimately, safety.

Who’s a Criminal?

The various cases are a mix of corporate and individual liability.  All three corporations involved in Deepwater pleaded guilty to various charges and paid very large fines.  In BP’s case, it pleaded guilty to felony manslaughter.  Manslaughter charges against individuals employed by BP were dropped prior to trial.  Individual liability was limited to violations of the Clean Water Act and obstruction of justice (misdemeanors).3

David Uhlmann, a professor at the University of Michigan Law School and former environmental-crimes prosecutor stated, “The Justice Department always seeks to hold individuals accountable for corporate crime, but doing so in the Gulf oil spill meant charging individuals who had no control over the corporate culture that caused the spill.” 4

Other cases followed a similar pattern until Upper Big Branch.  Mostly lower level individuals were targeted; higher ups were insulated from knowledge of, or direct involvement in, the specific event.  With Massey, prosecutors worked their way up the management chain all the way to the CEO.5  However, even though there were significant indications of the CEO driving a “production first” culture, the felonies he faced were based on securities fraud and making false statements.  Ultimately he was convicted of violating safety standards and will serve jail time.

Fukushima will be another attempt to hold senior management accountable (for something termed “professional negligence”) but, as previously noted, the case is thought to be difficult.  The Attorney General in the Flint water cases promises more indictments and implies higher ups will be charged.  It remains to be seen whether this targeting of individuals will prove to be a more effective preventive measure than other remedies.

Proof of Criminal Behavior is Difficult

Ultimately the prospect of criminal prosecution is fraught with legal and practical obstacles.  Current law does not provide a realistic platform for prosecution or sentencing.  Statutory provisions are often limited to misdemeanors.  Making applicable statutes “tougher,” as already proposed by a presidential candidate, is also problematic as it risks over-criminalizing management actions which occur in a complex environment and involve many individuals.  Simple negligence is a problematic ground for criminal liability, which generally requires a showing of intent or recklessness.  As noted in regard to the VW scandal, “…investigations are ongoing. Whether criminal prosecutions result may be a matter of balancing suspicion of criminal wrongdoing against the standards of proof required - and the track record of recent prosecutions.” 9

All of the recent cases involving corporations ended in guilty pleas - they did not go to trial and so the standard of proof was not tested.  In the BP cases, the DOJ made quite a splash with its indictments of individuals but clearly overreached in charging, as the courts and juries quickly dismissed most cases and all felony charges.

Fukushima may be a bit of an oddity as the charges have been mandated by a citizen’s panel.  The charge is “professional negligence,” which probably does not have a direct analog in U.S. law.  It does suggest that there will be scrutiny of the actual decisions made by executives which resulted in safety consequences.  In the Flint cases, there will be another attempt to review an actual safety decision.  An engineer of the Michigan Department of Environmental Quality is charged with “misconduct” in authorizing use of the Flint water plant “knowing” it was deficient.  Bears watching.

Competing Priorities and Culture Are Being Cited More Frequently 

Personnel are already in a difficult position when it comes to assuring safety.  Corporations inherently, and often quite intentionally, place significant emphasis on achieving operational and business goals.  These goals at certain junctures may conflict with assuring safety.  The de facto reality is that it is up to the operating personnel to constantly rationalize those conflicts in a way that achieves acceptable safety.  Those decisions are rarely obvious, may imply significant benefits or costs, and are subject to ex post critical review with all the benefits of time, hindsight, and no direct decision making responsibility.  Thus the focus may shift from the decisions to the culture that may have produced or rationalized those decisions.

The Mine Safety and Health Administration report concluded that the [Upper Big Branch] disaster was "entirely preventable," and was caused in part by a pattern of major safety problems and Massey's efforts to conceal hazards from government inspectors, all of which "reflected a pervasive culture that valued production over safety.”  The Governor of West Virginia’s independent review also found that Massey had “made life difficult” for miners who tried to address safety and built “a culture in which wrongdoing became acceptable.”

As noted in the media, “the automotive industry is caught up in an emissions rigging scandal that exposes systematic cheating and an apparent culture of corrupt ethics.”  At VW, nine executives so far have been suspended, but blame has been focused on a small group of engineers for the misconduct, and VW contends that members of its management board did not know of the decade-long deception.  The idea that a few engineers are responsible “just doesn’t pass the laugh test,” said John German, a former official at the Environmental Protection Agency.  VW’s management culture - confident, cutthroat and insular - is coming under scrutiny as potentially enabling the lawbreaking behavior.10  Mitsubishi Motors is also implicated and investigations are being launched into their peers - including Daimler and Peugeot - to assess the extent of the problem around the world.

Ineffective Regulation is Becoming a Focus 

Last but perhaps the most intriguing evolution in these cases is a new emphasis on the responsibility of the regulator when safety is compromised.  There was an extensive and ongoing history of violations at the Upper Big Branch mine, many unresolved, which did not lead to more stringent enforcement measures by the Mine Safety and Health Administration (MSHA) - such as a shutdown of mine operations.  State of West Virginia investigators claimed that the U.S. Department of Labor and its MSHA were equally at fault for failing to act decisively after Massey was issued 515 citations for safety violations at the UBBM in 2009.  “…officials with the MSHA repeatedly defended their agency’s performance. They were quick to point to the fact that the Mine Safety Act places the duty for providing a safe workplace squarely on the shoulders of the employer, insisting that the operator is ultimately responsible for operating a safe mine.” 11

Similar concerns have arisen with regard to Fukushima where safety regulators have been perceived to lack independence from nuclear plant operators. And thinking back to Davis Besse, it seems that the NRC’s actions could have been more intrusive and proactive in determining the condition of the RPV head prior to allowing the inspections to be delayed.

With regard to Flint, we noted above that criminal (felony) charges have been brought against a state engineer for “misconduct in office” for authorizing use of the Flint plant.  In addition, he and a supervisor are also charged with misconduct in office for “willfully and knowingly misleading the federal Environmental Protection Agency…”  An expert in environmental crimes notes, “It’s extremely unusual and maybe unprecedented for state and local officials to be charged with criminal drinking water violations . . .” 12

Whether these pending actions lead to a robust effort to hold regulators and their staff accountable is hard to know.  It bears watching, particularly the contention by MSHA and other regulatory agencies, including the NRC, that operators are primarily and ultimately responsible.  In Part 3 we’ll share some thoughts on what other approaches might be effective.

1 P. Loftus, "Criminal Trials of Former Health-Care Executives Set to Begin," The Wall Street Journal (May 22, 2016).

2 The Davis Besse case is prototypical of the way cases were handled in the past.  The corporation pleaded guilty to making false statements and paid a big fine.  Lower level individuals were found guilty of similar charges.  In the Siemaszko trial the court was quite ready to attribute to the defendant knowledge of the content of NRC communications, whether directly prepared by him or not, or acquiescence in materials drafted by others that misrepresented conditions for the RPV.  They also dismissed his contention that he lacked proper expertise.  The court found that he knew and had a motive - keeping the plant running.  There was testimony that higher management was the source of the operational pressure but culpability did not extend beyond the individuals making the actual statements and submittals to the NRC.

3 Transocean Deepwater Inc. also admitted that members of its crew onboard the Deepwater Horizon, acting at the direction of BP’s Well Site Leaders were negligent in failing fully to investigate clear indications that the well was not secure and that oil and gas were flowing into the well.  Halliburton was the supplier of drilling cement to seal the outside of the drilling pipe.  Its guilty plea admitted destroying evidence of instructions to employees to “get rid of” simulation analyses of the event that failed to show that Halliburton’s recommendations to BP would have lowered the risk of a blowout.  [S. Mufson, "Halliburton to Plead Guilty to Destroying Evidence in BP Spill," The Washington Post (July 25, 2013).]  This was an attempt to show that a decision by BP to use fewer pipe centralizers was a serious error contributing to the accident.

4 A. Viswanatha, "U.S. Bid to Prosecute BP Staff in Gulf Oil Spill Falls Flat," The Wall Street Journal (Feb. 27, 2016).

5 Notably the lower level managers pleaded to charges and did not go to trial.  The acquittal of the CEO on felony level charges illustrates the challenges of proving these cases.

6 “Large punitive or compensating settlements, so the argument goes, act as an effective deterrent for mining companies, forcing them to improve their safety systems or face potentially debilitating fines. However, given the revelations about Massey and the several major US mining disasters that have taken place in the last ten years, it's impossible to argue that financial punishment has been a wholly effective scarecrow, especially when companies feel they can game the MSHA system.”  [C. Lo, "Upper Big Branch: the search for justice," (June 20, 2013).]

7 "To this point, research on corporate crime has been, for the most part, overlooked by mainstream criminology. In particular, corporate violations of safety regulations in the coal mining industry have yet to be studied within the field of criminology.”  [C. N. Stickeler,  "A Deadly Way of Doing Business: A Case Study of Corporate Crime in the Coal Mining Industry," University of South Florida (Jan. 2012).]

8 “carelessness which is in reckless disregard for the safety or lives of others, and is so great it appears to be a conscious violation of other people's rights to safety. It is more than simple inadvertence, but it is just shy of being intentionally evil.”

9 J. Ewing and G. Bowley, "The Engineering of Volkswagen’s Aggressive Ambition," The New York Times (Dec. 13, 2015).

10 Ibid.

11 The quote is from the case study and references the Governor’s investigation - McAteer, J. D., Beall, K., Beck, J. A., Jr., McGinley, P. C., Monforton, C., Roberts, D. C., Spence, B., & Weise, S. (2011). Upper Big Branch: The April 5, 2010, explosion: A Failure of Basic Coal Mine Safety Practices (Report to the Governor).

12 M. Davey and R. Perez-Pena "Flint Water Crisis Yields First Criminal Charges," New York Times (April 20, 2016). 

Wednesday, September 10, 2014

A Safety Culture Guide for Regulators

This paper* was referenced in a safety culture (SC) presentation we recently reviewed.  It was prepared for Canadian offshore oil industry regulators.  Although not nuclear oriented, it’s a good introduction to SC basics, the different methods for evaluating SC and possible approaches to regulating SC.  We’ll summarize the paper then provide our perspective on it.  The authors probably did not invent anything other than the analysis discussed below but they used a decent set of references and picked appropriate points to highlight.

Introduction to SC and its Importance

The paper provides some background on SC, its origins and definition, then covers the Schein three-tier model of culture and the difference between SC and safety climate.  The last topic is covered concisely and clearly: “. . . safety climate is an outward manifestation of culture. Therefore, safety culture includes safety climate, but safety culture uniquely includes shared values about risk and safety.” (p. 11)  SC attributes (from the Canadian Nuclear Safety Commission) are described.  Under attributes, the authors stress one of our basic beliefs, viz., “The importance of safety is made clear by the decisions managers make and how they allocate resources.” (p. 12)  The authors also summarize the characteristics of High Reliability Organizations, Low Accident Organizations, and James Reason’s model of SC and symptoms of poor SC.

The chapter on SC as a causal factor in accidents contains an interesting original analysis.  The authors reviewed reports on 17 offshore or petroleum related accidents (ranging from helicopter crashes to oil rig explosions) and determined for each accident which of four negative SC factors (Normalization of deviance, Tolerance of inadequate systems and resources, Complacency, Work pressure) were present.  The number of negative SC factors per accident ranged from 0 (three instances) to 4 (also three instances, including two familiar to Safetymatters readers: BP Texas City and Deepwater Horizon).  The negative factor that appeared in the most accidents was Tolerance of inadequate systems and resources (10) and the least was Work pressure (4).

Assessing SC

The authors describe different SC assessment methods (questionnaires, interviews, focus groups, observations and document analysis) and cover the strengths and weaknesses of each method.  The authors note that no single method provides a comprehensive SC assessment and they recommend a multi-method approach.  This is familiar ground for Safetymatters readers; for other related posts, click on the “Assessment” label in the right hand column.

A couple of highlights stand out.  Under observations the authors urge caution: “The fact that people are being observed is likely to influence their behaviour [the well-known Hawthorne Effect] so the results need to be treated with caution. The concrete nature of observations can result in too much weight being placed on the results of the observation versus other methods.” (p. 37)  A strength of document analysis is that it can show how (and how well) the organization identifies and corrects its problems, another key artifact in our view.

Influencing SC

This chapter covers leadership and the regulator’s role.  The section on leadership is well-trod ground so we won’t dwell on it.  It is a major (but in our opinion not the only) internal factor that can influence the evolution of SC.  The statement that “Leaders also shape the safety culture through the allocation of resources” (p. 42) is worth repeating.

The section on regulatory influence is more informative and describes three methods: the regulator’s practices, promotion of SC, and enforcement of SC regulations.  Practices refer to the ways the regulator goes about its inspection and enforcement activities with licensees.  For example, the regulator can promote organizational learning by requiring licensees to have effective incident investigation systems and monitoring how effectively such systems are used in practice. (p. 44)  In the U.S. the NRC constantly reinforces SC’s importance and, through its SC Policy Statement, the expectation that licensees will strive for a strong SC.

Promoting SC can occur through research, education and direct provision of SC-related services.  Regulators in other countries conduct their own surveys of industry personnel to appraise safety climate or they assess an organization’s SC and report their findings to the regulated entity.**  (pp. 45-46)  The NRC both supports and cooperates with industry groups on SC research and sponsors the Regulatory Information Conference (which has a SC module).

Regulation of SC means just what it says.  The authors point out that direct regulation in the offshore industry is controversial. (p. 47)  Such controversy notwithstanding, Norway has developed regulations requiring offshore companies to promote a positive SC.  Norway’s experience has shown that SC regulations may be misinterpreted or result in unintended consequences. (pp. 48-50)  In the nuclear space, regulation of SC is a popular topic outside the U.S.; the IAEA even has a document describing how to go about it, which we reviewed on May 15, 2013.  More formal regulatory oversight of SC is being developed in Romania and Belgium.  We reported on the former on April 21, 2014 and the latter on June 23, 2014.

Our Perspective

This paper is written by academics but intended for a more general audience; it is easy reading.  The authors score points with us when they say: “Importantly, safety culture moves the focus beyond what happened to offer a potential explanation of why it happened.” (p. 7)  Important factors such as management decision making and work backlogs are mentioned.  The importance of an effective CAP is hinted at.

The paper does have some holes.  Most importantly, it limits the discussion on influencing SC to leadership and regulatory behavior.  There are many other factors that can affect an organization’s SC including existing management systems; the corporate owner’s culture, goals, priorities and policies; market factors or economic regulators; and political pressure.  The organization’s reward system is referred to multiple times but the focus appears to be on lower-level personnel; the management compensation scheme is not mentioned.

Bottom line: This paper is a good introduction to SC attributes, assessments and regulation.

*  M. Fleming and N. Scott, “A Regulator’s Guide to Safety Culture and Leadership” (no date).

**  No regulations exist in these cases; the regulator assesses SC and then uses its influence and persuasion to affect regulated entity behavior.

Tuesday, November 20, 2012

BP/Deepwater Horizon: Upping the Stakes

Anyone who thought safety culture and safety decision making was an institutional artifact, or mostly a matter of regulatory enforcement, might want to take a close look at what is happening on the BP/Deepwater Horizon front these days.  Three BP employees have been criminally indicted - and two of those indictments bear directly on safety in operational decisions.  The indictments of the well-site leaders, the most senior BP personnel on the platform, accuse them of causing the deaths of 11 crewmen aboard the Deepwater Horizon rig in April 2010 through gross negligence, primarily by misinterpreting a crucial pressure test that should have alerted them that the well was in trouble.*

The crux of the matter relates to the interpretation of a pressure test to determine whether the well had been properly sealed prior to being temporarily abandoned. Apparently BP’s own investigation found that the men had misinterpreted the test results.

The indictment states, “The Well Site Leaders were responsible for...ensuring that well drilling operations were performed safely in light of the intrinsic danger and complexity of deepwater drilling.” (Indictment p.3)

The following specific actions are cited as constituting gross negligence: “...failed to phone engineers onshore to advise them ...that the well was not secure; failed to adequately account for the abnormal readings during the testing; accepted a nonsensical explanation for the abnormal readings, again without calling engineers onshore to consult…” (Indictment p.7)

The willingness of federal prosecutors to advance these charges should (and perhaps is intended to) send a chill down every manager’s spine in high risk industries.  While gross negligence is a relatively high standard, and may or may not be provable in the BP case, the actions cited in the indictment may not sound all that extraordinary - failure to consult with onshore engineers, failure to account for “abnormal” readings, acceptance of a “nonsensical” explanation.  Whether this amounts to “reckless” or willful disregard for a known risk is a matter for the legal system.  As an article in the Wall Street Journal notes, “There were no federal rules about how to conduct such a test at the time. That has since changed; federal regulators finalized new drilling rules last week that spell out test procedures.”**

The indictment asserts that the men violated the “standard of care” applicable to the deepwater oil exploration industry. One might ponder what federal prosecutors think the “standard of care” is for the nuclear power generation industry.

Clearly the well site leaders made a serious misjudgment - one that turned out to have catastrophic consequences.  But then consider the statement by the Assistant Attorney General that the accident was caused by “BP’s culture of privileging profit over prudence.” (WSJ article)  Are there really a few simple, direct causes of this accident, or is this an example of a highly complex system failure?  Where does culpability for culture lie?  Stay tuned.

* U.S. District Court Eastern District of Louisiana, “Superseding Indictment for Involuntary Manslaughter, Seaman's Manslaughter and Clean Water Act: United States of America v. Robert Kaluza and Donald Vidrine,” Criminal No. 12-265.

** T. Fowler and R. Gold, “Engineers Deny Charges in BP Spill,” Wall Street Journal online (Nov. 18, 2012).

Friday, February 24, 2012

More BP

We have posted numerous times on the travails of BP following the Deepwater Horizon disaster and the contribution of safety culture to these performance results.  BP is back in the news since the trial date for a variety of suits and countersuits is coming up shortly.  We thought we would take the opportunity for a quick update.

The good news is the absence of any more significant events at BP facilities.  In its presentation to investors on 4Q11 and 2012 Strategy, BP highlighted its 10 point moving forward plan, including at the top of the list, “relentless focus on safety and managing risk”.* 

It is impossible for us to assess how substantive and effective this focus has been or will be, but we’ve now heard from BP’s Board member Frank Bowman.  Bowman is head of the Board’s Safety, Ethics and Environment Assurance Committee.  He served on the panel that investigated BP’s US refineries after the Texas City explosion in 2005 and then became a member of BP’s US advisory council; and in November 2010, he joined the main board as a non-executive director.  Basically Bowman’s mission is to help transfer his U.S. nuclear navy safety philosophy to BP’s energy business.

Bowman reports that he has been impressed by the way the safety and operational risk and upstream organizations have taken decisions to suspend operations when necessary. “We’ve recently walked away from several jobs where our standards were not being met by our partners or a contractor. That sends a message heard around the world, and we should continue to do that.”**

Looking for more specifics in the 4Q11 investor presentation, we came across the following “safety performance record”. (BP 4Q11, p. 12)

The charts plot “loss of containment” issues (basically releases of hydrocarbons) and personnel injury frequency.  The presentation notes that “Aside from the exceptional activities of the Deepwater Horizon response, steady progress has been made over the last decade.”  Perhaps, but we are skeptical that these data are useful for measuring progress in the area of safety culture and management.  For one thing, both metrics show positive trends over a time period in which BP had two major disasters - the Texas City oil refinery fire in 2005 and Deepwater Horizon in 2010.  At a minimum these charts confirm that the tracked parameters do nothing to proactively predict safety health.  As Mr. Bowman notes, “Culture is set by the collective behaviour of an organisation’s leaders… The collective behaviour of BP’s leaders must consistently endorse safety as central to our very being.” (BP Magazine, p. 10)

On the subject of management behavior, the investigations and analyses of Deepwater Horizon consistently noted the contribution of business pressures and competing priorities that lead to poor decisions.  In our September 30, 2010 blog post we included a quote from the then-new BP CEO:

“Mr. Dudley said he also plans a review of how BP creates incentives for business performance, to find out how it can encourage staff to improve safety and risk management.”

The 4Q11 presentation and Mr. Bowman’s interview are noticeably silent on this subject.  The best we could come up with was the following rather cryptic statement in the 4Q11: “We’ve also evolved our approach to performance management and reward, requiring employees to set personal priorities for safety and risk management, focus more on the long term and working as one team.” (BP 4Q11, p. 15)  We’re not sure how “personal priorities” relate to the compensation incentives which were the real focus of the concerns expressed in the accident investigations.

Looking a bit further we uncovered the following in a statement by the chairwoman of BP’s Board Remuneration Committee: “For 2011 the overall policy for executive directors [compensation] will remain largely unchanged…”***  If you guessed that incentives would be based only on meeting business results, you would be right.

In closing we leave with one other comment from Mr. Bowman, one that we think has great salience in the instant situation of BP and for other high risk industries including nuclear generation: “In any business dealing with an unforgiving environment, complacency is your worst enemy. You have to be very careful about what conclusion to draw from the absence of an accident.” (BP Magazine, p. 9) [emphasis added]

*  BP 4Q11 & 2012 Strategy presentation, p. 8.

**  BP Magazine, Issue 4 2011, p. 9.

***  Letter from the chairman of the remuneration committee (Mar. 2, 2011).

Friday, June 24, 2011

Rigged Decisions?

The Wall Street Journal reported on June 23, 2011* on an internal investigation conducted by Transocean, owner of the Deepwater Horizon drill rig, that placed much of the blame for the disaster on a series of decisions made by BP.  Is this news?  No, the blame game has been in full swing almost since the time of the rig explosion.  But we did note that Transocean’s conclusion was based on a razor sharp focus on:

“...a succession of interrelated well design, construction, and temporary abandonment decisions that compromised the integrity of the well and compounded the risk of its failure…”**  (p. 10)

Note, their report did not place the focus on the “attitudes, beliefs or values” of BP personnel or rig workers, and really did not let their conclusions drift into the fuzzy answer space of “safety culture”.  In fact the only mention of safety culture in their 200+ page report is in reference to a U.S. Coast Guard (USCG) inspection of the drill rig in 2009 which found:

“outstanding safety culture, performance during drills and condition of the rig.” (p. 201)

There is no mention of how the USCG reached such a conclusion and the report does not rely on it to support its conclusions.  It would not be the first time that a favorable safety culture assessment at a high risk enterprise preceded a major disaster.***

We also found the following thread in the findings that reinforce the importance of recognizing and understanding the impact of underlying constraints on decisions:

“The decisions, many made by the operator, BP, in the two weeks leading up to the incident, were driven by BP’s knowledge that the geological window for safe drilling was becoming increasingly narrow.” (p.10)

The fact is, decisions get squeezed all the time, resulting in choices that may reduce margins but arguably are still “acceptable.”  But such decisions do not necessarily lead to unsafe, much less disastrous, results.  Most of the time the system is not challenged, nothing bad happens, and you could even say the marginal decisions are reinforced.  Are these tradeoffs to accommodate conflicting priorities the result of a weakened safety culture?  Perhaps.  But we suspect that the individuals making the decisions would say they believed safety was their priority, and culture may have appeared normal to outsiders as well (e.g., the USCG).  The paradox occurs because decisions can trend in a weaker direction before other, more distinct evidence of a degrading culture becomes apparent.  In this case, a very big explosion.

*  B. Casselman and A. Gonzalez, "Transocean Puts Blame on BP for Gulf Oil Spill," (June 23, 2011).

** "Macondo Well Incident: Transocean Investigation Report," Vol I, Transocean, Ltd. (June 2011).

*** For example, see our August 2, 2010 post.

Wednesday, February 16, 2011

BP Exec Quit Over Safety Before Deepwater Disaster

Today’s Wall Street Journal has an interesting news item about a BP Vice President who quit prior to the Deepwater Horizon disaster because he felt BP "was not adequately committed to improving its safety protocols in offshore drilling to the level of its industry peers." The full article is available here.

Monday, January 10, 2011

Pick Any Two

Last week the principal findings of the BP Oil Spill Presidential Commission were released.  Not surprisingly, it cited root causes that were “systemic,” decisions made without adequate consideration of risks, and failures of regulatory oversight.  It also cited a lack of a culture of safety at the companies involved in the Deepwater Horizon.  We came across an interesting entry in a blog tied to an article in the New York Times by John Broder on January 5, 2011, “Blunders Abounded Before Gulf Oil Spill, Panel Says.”  We thought it was worth passing on.

Comment No. 7 of 66 submitted by:
Jim S.
January 5th, 2011
7:23 pm

“A fundamental law of engineering (or maybe of the world in general) is "Cheaper, Faster, Better: Pick Any Two".  

Clearly those involved, whether deliberately or by culture, chose Cheaper and Faster.”

Thursday, November 18, 2010

Another Brick in the Wall for BP et al

Yesterday the National Academy of Engineering released their report* on the Deepwater Horizon blowout.  The report includes a critical appraisal of many decisions made during the period when the well was being prepared for temporary abandonment, decisions that in the aggregate decreased safety margins and increased risks.  This Washington Post article** provides a good summary of the report.

The report was written by engineers and scientists and has a certain “Just the facts, ma’am” tone.  It does not specifically address safety culture.  But we have to ask: What can one infer about a culture where the business practices don’t include “any standard practice . . . to guide the tradeoffs between cost and schedule and the safety implications of the many decisions (that is, a risk management approach).”  (p. 15)

We have had plenty to say about BP and the Deepwater Horizon accident.  Click on the BP label below to see all of our related blog entries.

*  Committee for the Analysis of Causes of the Deepwater Horizon Explosion, Fire, and Oil Spill to Identify Measures to Prevent Similar Accidents in the Future; National Academy of Engineering; National Research Council, “Interim Report on Causes of the Deepwater Horizon Oil Rig Blowout and Ways to Prevent Such Events” (2010).

**  D. Cappiello, “Experts: BP ignored warning signs on doomed well,” The Washington Post (Nov 17, 2010).  Given our blog’s focus on the nuclear industry, it’s worth noting that, in an interview, the committee chairman said, “the behavior leading up to the oil spill would be considered unacceptable in companies that work with nuclear power or aviation.”

Tuesday, November 9, 2010

Human Beings . . . Conscious Decisions

A New York Times article* dated November 8, 2010 carried a headline to the effect that Fred Bartlit, the independent investigator for the presidential panel on this year’s BP oil rig disaster, had not found that “cost trumped safety” in decisions leading up to the accident.  The article noted that this finding contradicted determinations by other investigators, including those sponsored by Congress.  We had previously posted on this subject, including taking notice of the earlier findings of cost trade-offs, and wanted to weigh in based on this new information.

First we should acknowledge that we have no independent knowledge of the facts associated with the blowout and are simply reacting to the published findings of current investigations.  In our prior posts we had posited that cost pressures could be part of the equation in the leadup to the spill.  On June 8, 2010 we observed:

“[It] is clear that the environment leading up to the blowout included fairly significant schedule and cost pressures. What is not clear at this time is to what extent those business pressures contributed to the outcome. There are numerous cited instances where best practices were not followed and concerns or recommendations for prudent actions were brushed aside. One wishes the reporters had pursued this issue in more depth to find out ‘Why?’ ”

And we recall one of the initial observations made by an OSHA official shortly after the accident as detailed in our April 26, 2010 post:

“In the words of an OSHA official BP still has a ‘serious, systemic safety problem’ across the company.”

So it appears we have been cautious in reaching any conclusions about BP’s safety management.  That said, we do want to put into context the finding by Mr. Bartlit.  First we would note that he is, by profession, a trial lawyer and may be both approaching the issue and articulating his finding with a decidedly legal focus.  The specific quotes attributed to him are as follows:

“. . . we have not found a situation where we can say a man had a choice between safety and dollars and put his money on dollars” and “To date we have not seen a single instance where a human being made a conscious decision to favor dollars over safety,...”

It is not surprising that a lawyer would focus on culpability in terms of individual actions.  When things go wrong, most industries, nuclear included, look to assign blame to individuals and move on.  It is also worth noting that the investigator emphasized that no one had made a “conscious” decision to favor cost over safety.  We think it is important to keep in mind that safety management and failures of safety decision making may or may not involve conscious decisions.  As we have stated many times in other posts, safety can be undermined through very subtle mechanisms such that even those involved may not appreciate the effects, e.g., the normalization of deviance.  Finally we think the OSHA investigator may have been closer to the truth with his observation about “systemic” safety problems.  It may be that Mr. Bartlit, and other investigators, will be found to have suffered from what is termed “attribution error” where simple explanations and causes are favored and the more complex system-based dynamics are not fully assessed or understood in the effort to answer “Why?”  

* J.M. Broder, "Investigator Finds No Evidence That BP Took Shortcuts to Save Money," New York Times (Nov 8, 2010).

Thursday, September 30, 2010

BP's New Safety Division

It looks like oil company BP believes that creating a new, “global” safety division is part of the answer to its ongoing safety performance issues, including most recently the explosion of the Deepwater Horizon oil rig in the Gulf of Mexico.  An article in the September 29, 2010 New York Times* quotes BP’s new CEO as stating “safety and risk management [are] our most urgent priority” but does not provide many details of how the initiative will accomplish its goal.  Without jumping to conclusions, it is hard for us to see how a separate safety organization is the answer, although BP asserts it will be “powerful”. 

Of more interest was a lesser headline in the article with the following quote from BP’s new CEO:

“Mr. Dudley said he also plans a review of how BP creates incentives for business performance, to find out how it can encourage staff to improve safety and risk management.”

We see this as one of the factors that is a lot closer to the mark for changing behaviors and priorities.  It parallels recent findings by FPL in its nuclear program (see our July 29, 2010 post) and warning flags that we had raised in our July 6 and July 9, 2010 posts regarding trends in U.S. nuclear industry compensation.  Let’s see which speaks the loudest to the organization: CEO pronouncements about safety priority or the large financial incentives that executives can realize by achieving performance goals.  If they are not aligned, the new “division of safety” will simply mean business as usual.

*  The original article is available via the iCyte below.  An updated version is available on the NY Times website.

Thursday, July 22, 2010

Transocean Safety Culture and Surveys

An article in today’s New York Times, “Workers on Doomed Rig Voiced Concern About Safety,” reports on the safety culture on the Deepwater Horizon drilling rig that exploded in the Gulf of Mexico.  The article reveals that a safety culture survey of the rig’s staff had been performed in the weeks prior to the explosion.  The survey was commissioned by Transocean and performed by Lloyd’s Register Group, a maritime and risk-management organization that conducted focus groups and one-on-one interviews with at least 40 Transocean workers.

There are two noteworthy findings from the safety culture survey.  While the headline is that workers voiced safety concerns, the survey results indicate:
“Almost everyone felt they could raise safety concerns and these issues would be acted upon if this was within the immediate control of the rig,” said the report, which also found that more than 97 percent of workers felt encouraged to raise ideas for safety improvements and more than 90 percent felt encouraged to participate in safety-improvement initiatives....But investigators also said, ‘It must be stated at this point, however, that the workforce felt that this level of influence was restricted to issues that could be resolved directly on the rig, and that they had little influence at Divisional or Corporate levels.’ “
This highlights several shortcomings of safety culture surveys.  First, the vast majority of respondents indicated they were comfortable raising safety concerns - yet subsequent events and decisions led to a major safety breakdown.  So, is there a response level that indicates how the organization is actually doing business, or do respondents tell the survey takers “what they want to hear”?  And is comfort in raising a safety concern the appropriate standard when the larger corporate environment may not be responsive to such concerns, or may bury them with resource and schedule mandates?  Second, this survey focused on the workers on the rig.  Apparently there was a reasonably good culture in that location, but it did not extend to the larger organization.  Consistent with that perception are some of the preliminary reports that corporate was pushing production over safety, which may have influenced risk taking on the rig.  This is reminiscent of the space shuttle Challenger, where political pressure seeped down into the decision making process, subtly changing the perception of risk at the operational levels of NASA.  How useful are surveys if they do not capture the dynamics higher in the organization or the insidious ability of exogenous factors to change risk perceptions?
The other aspect of the Transocean surveys came not from the survey results but from Transocean’s rationalization of its safety performance.  They “noted that the Deepwater Horizon had seven consecutive years without a single lost-time incident or major environmental event.”  This highlights two fallacies.  One, that the absence of a major accident demonstrates that safety performance is meeting its goals.  Two, that industrial accident rates correlate to safety culture and prudent safety management.  They don’t.  Also, recall our recent posts regarding nuclear compensation where we noted that the most common metric for determining safety performance incentives in the nuclear industry is industrial accident rate. 

* Link below

Thursday, July 1, 2010

Safety, Cost and Bonuses

An article in the June 29, 2010 Wall Street Journal, “As CEO Hayward Remade BP, Safety, Cost Drives Clashed”, fills in some real world examples of the dynamics of conflicting goals within BP and presumably contributing to the current spill disaster.  The presence of compensation incentives may have played a role in how safety vs. cost decisions were being made.  We will be focusing more on this issue in upcoming posts.

Saturday, June 12, 2010

HBR and BP

There’s a good essay on a Harvard Business Review blog describing how decision-making in high risk enterprises may be affected by BP’s disaster in the Gulf. Not surprisingly, the author’s observations include creating a robust safety culture “where the most stringent safety management will never be compromised for economic reasons.” However, as our Bob Cudlin points out in his comment below the article, such a state may represent a goal rather than reality because safety must co-exist in the same success space as other business and practical imperatives. The real, and arguably more difficult question is: How does safety culture ensure a calculus of safety and risk so that safety measures and management are adequate for the task at hand?

Tuesday, June 8, 2010

Toothpaste and Oil Slicks

At the end of last week came the surprise announcement from the former Dominion engineer, David Collins, that he was withdrawing his allegations regarding his former employer’s safety management and the NRC’s ability to provide effective oversight of safety culture.* The reasons for the withdrawal are still unclear though Collins cited lack of support by local politicians and environmental groups.

What is to be made of this? As we stated in a post at the time of the original allegations, we don’t have any specific insight into the bases for the allegations. We did indicate that how Dominion and the NRC would go about addressing the allegations might present some challenges.

What can be said about the allegations with more certainty is that they will not go away. Like the proverbial toothpaste, allegations can’t be put back into the tube and they will need to be addressed on their merits. We assume that Collins acted in good faith in raising the allegations. In addition, a strong safety culture at Dominion and the NRC should almost welcome the opportunity to evaluate and respond to such matters. A linchpin of any robust safety culture is the encouragement for stakeholders to raise safety concerns and for the organization to respond to them in an open and effective manner. If the allegations turn out to not have merit, it has still been an opportunity for the process to work.

In a somewhat similar vein, the fallout (I am mixing my metaphors) from the oil released into the gulf from the BP spill will remain and have to be dealt with long after the source is capped or shut off. It will serve as an ongoing reminder of the consequences of decisions where safety and business objectives try to occupy a very limited success space. In recent days there have been extensive pieces* in the Wall Street Journal and New York Times delineating in considerable detail the events and decision making leading up to the blowout. These accounts are worthy of reading and digesting by anyone involved in high risk industries. Two things made a particular impression. One, it is clear that the environment leading up to the blowout included fairly significant schedule and cost pressures. What is not clear at this time is to what extent those business pressures contributed to the outcome. There are numerous cited instances where best practices were not followed and concerns or recommendations for prudent actions were brushed aside. One wishes the reporters had pursued this issue in more depth to find out “Why?” Two, the eventual catastrophic outcome was the result of a series of many seemingly less significant decisions and developments. In other words it was a cumulative process that apparently never flashed an unmistakable warning alarm. In this respect it reminds us of the need for safety management to maintain a highly developed “systems” understanding with the ability to connect the dots of risk.

* Links below

Thursday, June 3, 2010

25 Standard Deviation Moves

A Reuters Breakingviews commentary in today’s New York Times makes some interesting arguments about the consequences of the BP oil spill on the energy industry. The commentary draws parallels between BP and the financial implosion that led to Lehman Brothers bankruptcy. ". . . flawed risk management, systemic hazard, and regulatory incompetence" are cited as the common causes, and business models that did not take account of the possibility for "25 standard deviation moves". These factors will inevitably lead to government intervention and industry consolidation as the estimated $27 billion in claims (a current estimate for the BP spill) is ". . . a liability no investor will be comfortable taking, . . ."

While much of this commentary makes sense, we think it is missing a big part of the picture by not focusing on the essential need for much more rigorous safety management. By all reports, the safety performance of BP is a significant outlier in the oil industry; maybe not 25 sigma but 2 or 3 sigma at least. We have posted previously about BP and its safety deficiencies and its apparent inability to learn from past mistakes. There has also been ample analysis of the events leading up to the spill to suggest that a greater commitment to safety could, and likely would, have avoided the blowout. Safety commitment and safety culture provide context, direction and constraints for risk calculations. The potential consequences of a deep sea accident will remain very large, but the probability of the event can and should be brought much lower. Simply configuring energy companies with vastly deep pockets seems unlikely to be a sufficient remedy. For one, money damages are at best an imperfect response to such a disaster. More important, a repeat of this type of event would likely result in a ban on deep sea drilling regardless of the financial resources of the driller.

In the nuclear industry the potentially large consequences of an incident have, so far, been assumed by the government. In this respect there is something of a parallel to the financial crisis where the government stepped in to bail out the "too large to fail" entities. Aside from the obvious lessons of the BP spill, nuclear industry participants have to ensure that their safety commitment is both reality and public perception, or there may be some collateral damage as policy makers think about how high risk industry, including nuclear, liabilities are being apportioned.

Tuesday, June 1, 2010

Underestimating Risk and Cost

Good article in today's New York Times Magazine Preview about economic decision making in general and the oil industry in particular. In summary, when an event is difficult to imagine (e.g., the current BP disaster), people tend to underestimate the probability of it occurring; when it's easier to imagine (e.g., a domestic terrorist attack after 9/11), people tend to overestimate the probability. Now add government caps on liability and decision-making can get really skewed, with unreasonable estimates of both event-related probabilities and costs.

The relevance of this decision-making model to the nuclear industry is obvious but we want to focus on something the article didn't mention: the role of safety culture. Nuclear safety culture guides planning for and reacting to unexpected, negative events. On the planning side, culture can encourage making dispassionate, fact-based decisions regarding unfavorable event probabilities and potential consequences. However, if such an event occurs, then affected personnel will respond consistent with their training and cultural expectations.