Thursday, May 26, 2011

Upper Big Branch 1

A few days ago the Governor’s Independent Investigation Panel issued its report on the Upper Big Branch coal mine explosion of April 5, 2010.  The report runs over 100 pages and contains considerable detail on the events and circumstances leading up to the disaster, as well as on coal mining technology and safety issues.  It is well worth reading for anyone in the business of assuring safety in a complex, high-risk enterprise.  We anticipate doing several blog posts on material from the report but wanted to start with a brief quote from the foreword, summarizing its main conclusions.

“A genuine commitment to safety means not just examining miners’ work practices and behaviors.  It means evaluating management decisions up the chain of command - all the way to the boardroom - about how miners’ work is organized and performed.”*

We believe this conclusion is very much on the mark for safety management and for the safety culture that supports it in a well-managed organization.  It highlights what to us has appeared to be an over-emphasis in the nuclear industry on worker practices, behaviors and “values.”  And it focuses attention on management decisions - decisions that give appropriate weight to safety in a world of competing priorities and interests - as the sine qua non of safety.  As we have discussed in many of our posts, we are concerned with the nuclear industry’s emphasis on safety culture surveys and training in safety culture principles and values as the primary tools for assuring a strong safety culture.  Rarely do culture assessments focus on the decisions that underlie the management of safety, examining the context and influence of factors such as impacts on operations, availability of resources, personnel incentives and advancement, corporate initiatives and goals, and outside pressures such as political influence.  The Upper Big Branch report delves into these issues and builds a compelling basis for the above conclusion - a conclusion that is not limited to the coal industry.


*  Governor’s Independent Investigation Panel, “Report to the Governor: Upper Big Branch,” National Technology Transfer Center, Wheeling Jesuit University (May 2011), p. 4.

Thursday, May 19, 2011

Mental Models and Learning

A recent New York Times article on teaching methods* caught our eye.  It reported an experiment by college physics professors to improve their freshman students’ understanding and retention of introductory material.  The students comprised two large (260+) classes that were usually taught via lectures.  For one week, teaching assistants used a collaborative, team-oriented approach with one of the classes.  Afterward, that group scored higher on a test of the material than the group that had received the traditional lectures.

One of the instructors reported, “. . . this class actively engages students and allows them time to synthesize new information and incorporate it into a mental model . . . . When they can incorporate things into a mental model, we find much better retention.”

We are big believers in mental models, those representations of the world that people create in their minds to make sense of information and experience.  They are a key component of our system dynamics approach to understanding and modeling safety culture.  Our NuclearSafetySim model illustrates how safety culture interacts with other variables in organizational decision-making; a primary purpose for this computer model is to create a realistic mental model in users’ minds.

Because this experiment helped the students form more useful mental models, our reaction to it is generally favorable.  On the other hand, why is the researchers’ “insight” even news?  Why wouldn’t a more engaging approach lead to a better understanding of any subject?  Don’t most of you develop a better understanding when you do the lab work, code your own programs, write the reports you sign, or practice decision-making in a simulated environment?

*  B. Carey, “Less Talk, More Action: Improving Science Learning,” New York Times (May 12, 2011).

Tuesday, May 10, 2011

Shifting the Burden

This post emanates from the ongoing investigations of the crash of Air France flight 447 from Rio de Janeiro to Paris.  In some respects it is a follow-up to our January 27, 2011 post on Air France’s safety culture.  An article in the New York Times Sunday Magazine* explores some of the mysteries surrounding the loss of the plane in the mid-Atlantic.  One possible theory for the crash involves the pitot tubes used on the Airbus plane.  Pitot tubes are instruments used on aircraft to measure airspeed: a pitot tube measures the difference between total (stagnation) and static pressure to determine dynamic pressure, and therefore the velocity of the air stream.  Care must be taken to assure that the pitot tubes do not become clogged with ice or other foreign matter, as clogging would interrupt or corrupt the airspeed signal provided to the pilots and the autopilot system.
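For the technically curious, the pitot relationship is simple enough to sketch in a few lines of code.  This is our own hedged illustration with made-up pressure values, not avionics logic: dynamic pressure is q = p_total − p_static, and for incompressible flow v = sqrt(2q/ρ) (a simplification that degrades at cruise speeds and altitudes, where compressibility matters).

```python
import math

SEA_LEVEL_AIR_DENSITY = 1.225  # kg/m^3, standard atmosphere

def airspeed_from_pitot(p_total_pa, p_static_pa, rho=SEA_LEVEL_AIR_DENSITY):
    """Airspeed (m/s) from total and static pressure (Pa), via the
    incompressible Bernoulli relation v = sqrt(2 * q / rho)."""
    q = p_total_pa - p_static_pa  # dynamic pressure
    if q < 0:
        raise ValueError("total pressure below static pressure: sensor fault?")
    return math.sqrt(2.0 * q / rho)

# Illustrative numbers only.  A pitot line blocked by ice freezes p_total,
# so the computed airspeed stops tracking the true airspeed.
print(round(airspeed_from_pitot(103450.0, 101325.0), 1))  # ≈ 58.9 m/s
```

The point of the sketch is how little the computation can do to defend itself: if the pressure inputs are corrupted by ice, the formula happily returns a plausible but wrong number.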

On the flight 447 aircraft, three Thales AA model pitot tubes were in use.  They are produced by a French company and cost approximately $3500 each.  The Times article goes on to explain:

“. . . by the summer of 2009, the problem of icing on the Thales AA was known to be especially common. . . . Between 2003 and 2008, there were at least 17 cases in which the Thales AA had problems on the Airbus A330 and its sister plane, the A340.  In September 2007, Airbus issued a ‘service bulletin’ suggesting that airlines replace the AA pitots with a newer model, the BA, which was said to work better in ice.”

Air France’s response to the service bulletin was a policy of replacing the AA tubes “only when a failure occurred.”  A year later, Air France asked Airbus for “proof” that the model BA tubes worked better in ice.  It took Airbus another six to seven months to perform tests demonstrating the superior performance of the BA tubes, after which Air France proceeded to implement the recommended change on its A330 aircraft.  Unfortunately, the new probes had not yet been installed at the time of flight 447.

Much is still unknown about whether the pitot tubes in fact played a role in the crash of flight 447, and about the details of Air France’s consideration of deploying replacements.  But there is a sufficient framework to pose some interesting questions about how safety considerations were balanced in the process, and what might be inferred about Air France’s safety culture.  Most clearly, it highlights how fundamental the decision making process is to safety culture.

What is clear is that Air France’s approach to this problem “shifted the burden” from assuring that something was safe to proving that it was unsafe.  In legal usage this means transferring the obligation to prove a fact in controversy from one party to another.  In systems thinking (which you may have noticed we strongly espouse) it denotes a classic dynamic archetype: a problem arises, and it can be ameliorated either through a short term, symptom based response or through a fundamental solution that may take additional time and/or resources to implement.  Choosing the short term fix provides relief and reinforces belief in the efficacy of the response; meanwhile the underlying problem goes unaddressed.  For Air France, the service bulletin created a problem.  Air France could have immediately replaced the pitot tubes, or undertaken its own assessment of the tubes with replacement to follow.  This would have taken time and resources.  Nor did Air France appear to address the threshold question of whether the existing AA model instruments were adequate - in nuclear industry terms, were they “operable” and able to perform their safety function?  Air France apparently did not even implement interim measures, such as retraining to improve pilots’ recognition of, and response to, pitot tube failures or incorrect readings.  Instead, Air France shifted the burden back to Airbus to “prove” its recommendation.  The difference between showing that something is not safe and showing that it is safe is as wide as, well, the Atlantic Ocean.

What we find particularly interesting about shifting the burden is that it is just the other side of the complacency coin.  Most people engaged in the study of safety culture recognize that complacency is a potential contributor to the decay and loss of effectiveness of safety culture.  Everything appears to be going OK, so there is less perceived need to pursue issues, particularly those whose safety impact is unclear.  Not pursuing root causes, not verifying the efficacy of corrective actions, loss of questioning attitude and lack of resources can all be telltale signs of complacency.  The interesting thing about shifting the burden is that it yields much the same result - but with the appearance that action is being taken.

The footnote to the story is the response of Air Caraibes to similar circumstances in this time frame.  The Times article indicates Air Caraibes experienced two “near misses” with Thales AA pitot tubes on A330 aircraft.  They immediately replaced the parts and notified regulators.


*  W.S. Hylton, "What Happened to Air France Flight 447?" New York Times Magazine (May 4, 2011).