Thursday, August 30, 2012

Failure to Learn

In this post we call your attention to a current research paper* and a Wall Street Journal summary article** that shed some light on how people make decisions to protect against risk.  The specific subject of the research is how people respond to the imminent risk of house damage from hurricanes.  As the author of the paper states, “The purpose of this paper is to attempt to resolve the question of whether there are, in fact, inherent limits to our ability to learn from experience about the value of protection against low-probability, high-consequence, events.” (p.3)  Also of interest is how the researchers used several simulations to gain insight into, and quantify, how participants’ decisions compared to optimal risk mitigation.

Are these results directly applicable to nuclear safety decisions?  We think not.  But they are far from irrelevant.  They illustrate the value of careful and thoughtful research into the how and why of decisions, the impact of the decision environment and the opportunities for learning to produce better decisions.  They also raise the question: Where is the nuclear industry on this subject?  Nuclear managers routinely make what are probably the most safety significant decisions of any industry.  But how good are those decisions, and what determines their quality?  The industry might contend that the emphasis on safety culture (meaning values and traits) is the sine qua non for assuring decisions that adequately reflect safety.  Bad decision?  Must have been bad culture.  Reiterate the culture and assume better decisions will follow.  Is this right, or is safety culture the wrong blanket, or just too small a blanket, to cover a decision process that evolves from a complex adaptive system?

The basic construct for the first simulation was a contest among participants (college students) with the potential to earn a small cash bonus based on achieving certain performance results.  Each participant was made the owner of a house in a coastal area subject to hurricanes.  During the simulation animation, a series of hurricanes would materialize in the ocean and approach land; the position, track and strength of each hurricane were continuously updated.  Prior to landfall, participants had the choice of purchasing either partial or full protection against damage for that specific storm.  The objective was to maximize total net assets, i.e., the value of the house less any uncompensated damage and less the cost of any purchased protection.
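To make the payoff structure concrete, here is a minimal Python sketch of the net-asset calculation described above.  The dollar values, protection labels and coverage fraction are hypothetical, chosen purely for illustration; they are not the parameters used in the Wharton simulation.

HOUSE_VALUE = 100_000  # hypothetical starting value of the house

def storm_outcome(damage, protection, partial_cost=500.0, full_cost=1_500.0,
                  partial_coverage=0.5):
    """Return the change in net assets from a single storm.

    protection is 'none', 'partial' or 'full' (labels assumed for illustration).
    """
    if protection == "none":
        return -damage  # absorb the full loss
    if protection == "partial":
        return -(partial_cost + (1.0 - partial_coverage) * damage)
    return -full_cost  # full protection: only the premium is lost

# Example: no protection purchased and $20,000 of damage leaves the
# participant with 100,000 - 20,000 = 80,000 in net assets.
net_assets = HOUSE_VALUE + storm_outcome(damage=20_000, protection="none")
print(net_assets)  # 80000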

While the first simulation focused on recurrent short-term mitigation decisions, in the second simulation participants had the option to purchase protection that would last at least for the full season but had to be purchased before a storm materialized.  (A comprehensive description of the simulation and test data are provided in the referenced paper.)

The results indicated that participants significantly under-protected their homes, leading to actual losses higher than would have resulted from a “rational” approach to purchasing protection.  While part of the losses was due to purchasing protection unnecessarily, most was due to under-protection.  The main driver, according to the researchers, appeared to be that participants over-relied on their most recent experience instead of an objective assessment of current risk.  In other words, if in a prior hurricane they experienced no damage, either because of the track of the hurricane or because they had purchased protection, they were less inclined to purchase protection for the next hurricane.
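The contrast can be illustrated with a simple sketch.  Below, a “rational” rule compares the protection premium against the expected loss, while a recency-driven rule of the kind the researchers describe keys off the outcome of the last storm.  The probabilities and dollar amounts are assumed for illustration only.

def rational_buy(p_hit, expected_damage, premium):
    """Buy protection when the expected loss exceeds the premium."""
    return p_hit * expected_damage > premium

def recency_buy(last_storm_damaged_me):
    """Buy protection only if the previous storm actually caused damage."""
    return last_storm_damaged_me

# A 30% chance of $20,000 in damage implies a $6,000 expected loss, so a
# $1,500 premium is worth paying ...
print(rational_buy(p_hit=0.3, expected_damage=20_000, premium=1_500))  # True
# ... but a participant whose house was spared last time declines to buy.
print(recency_buy(last_storm_damaged_me=False))  # False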

The simulations reveal limitations in the ability to achieve improved decisions in what was, in essence, a trial-and-error environment.  Feedback occurred after each storm, but participants did not necessarily use the feedback in an optimal manner “due to a tendency to excessively focus on the immediate disutility of cost outlays.” (p.10)  In any event, it is clear that the nuclear safety decision making environment is “not ideal for learning—…[since] feedback is rare and noisy…” (p.5)  In fact, most feedback in nuclear operations might appear to be affirming since decisions to take short-term risks rarely result in bad outcomes.  It is an environment susceptible to complacency more than learning.

The author concludes with a final question: can non-optimal decision making, such as that observed in the simulations, be overcome?  He writes, “This may be difficult since the psychological mechanisms that lead to the biases may be hard-wired; as long as we remain present-focused, prone to chasing short-term rewards and avoiding short term punishment, it is unlikely that individuals and institutions will learn to undertake optimal levels of protective investment by experience alone. The key, therefore, is introducing decision architectures that allow individuals to overcome these biases through, for example, creative use of defaults…” (pp. 30-31)


*  R.J. Meyer, “Failing to Learn from Experience about Catastrophes: The Case of Hurricane Preparedness,” The Wharton School, University of Pennsylvania Working Paper 2012-05 (March 2012).

** C. Shea, “Failing to Learn From Hurricane Experience, Again and Again,” Wall Street Journal (Aug. 17, 2012).

Tuesday, August 28, 2012

Confusion of Properties and Qualities

In this post we highlight a provocative and, we believe, accurate criticism of the approach taken by many management scientists in focusing on behaviors as the determinant of desired outcomes.  The source is Dave Snowden, a Welsh lecturer, consultant and researcher in the field of knowledge management.  For those of you interested in finding out more about him, the website of Cognitive Edge, the firm Snowden founded (http://cognitive-edge.com/main), contains an abundance of accessible content.

Snowden is a proponent of applying complexity science to inform managers’ decision making and actions.  He is perhaps best known for developing the Cynefin framework, which is designed to help managers understand their operational context based on four archetypes: simple, complicated, complex and chaotic.  In considering the archetypes one can see how various aspects of nuclear operations might fit within the simple or complicated domains, where tools such as best practices and root cause analysis are applicable.  But one can also see the limitations of those tools in more complex situations, particularly those involving nuanced safety decisions which are at the heart of nuclear safety culture.  Snowden describes “complex adaptive systems” as ones where the system and its participants evolve together through ongoing interaction and influence, and system behavior is “emergent” from that process.  Perhaps most provocative for nuclear managers is his contention that complex adaptive systems are “non-causal” in nature, meaning one shouldn’t think in terms of linear cause and effect and shouldn’t expect that root cause analysis will provide the needed insight into system failures.
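As a shorthand, the four archetypes are often summarized by the response pattern each calls for.  The brief sketch below is our paraphrase of those commonly cited patterns, offered only as a reader's aid; it is not material from the referenced lecture.

# Our paraphrase of the response patterns commonly associated with the
# Cynefin archetypes; wording is ours, for illustration only.
CYNEFIN_ARCHETYPES = {
    "simple":      "sense -> categorize -> respond (best practices apply)",
    "complicated": "sense -> analyze -> respond (expert analysis, e.g., root cause)",
    "complex":     "probe -> sense -> respond (behavior is emergent; experiment and adapt)",
    "chaotic":     "act -> sense -> respond (stabilize the situation first)",
}

for archetype, response in CYNEFIN_ARCHETYPES.items():
    print(f"{archetype:12s} {response}")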

With all that said, we want to focus on a quote from one of Snowden’s 2008 lectures, “Complexity Applied to Systems.”*  At approximately the 15:00 mark, he comments on a “fundamental error of logic” he calls the “confusion of properties and qualities.”  He says:

“...all of management science, they observe the behaviors of people who have desirable properties, then try to achieve those desirable properties by replicating the behaviors”.

By way of a pithy illustration Snowden says, “...if I go to France and the first ten people I see are wearing glasses, I shouldn’t conclude that all Frenchmen wear glasses.  And I certainly shouldn’t conclude if I put on glasses, I will become French.”

For us, Snowden’s observation generated an immediate connection to the approach being implemented across the nuclear enterprise.  Think about the common definitions of safety culture adopted by the NRC and industry.  The NRC definition specifies “... the core values and behaviors…” and “Experience has shown that certain personal and organizational traits are present in a positive safety culture. A trait, in this case, is a pattern of thinking, feeling, and behaving that emphasizes safety, particularly in goal conflict situations, e.g., production, schedule, and the cost of the effort versus safety.”**

INPO defines safety culture as “An organization's values and behaviors – modeled by its leaders and internalized by its members…”***

In keeping with these definitions, the NRC and industry rely heavily on the results of safety culture surveys to ascertain areas in need of improvement.  These surveys overwhelmingly focus on whether nuclear personnel are “modeling” the definitional traits, values and behaviors.  This seems to fall squarely in the realm described by Snowden: looking to replicate behaviors in hopes of achieving the desired culture and results.  Most often, identified deficiencies are addressed with retraining to reinforce the desired safety culture traits.  But what seems to be lacking is a determination of why the traits were not exhibited in the first place.  Follow-up surveys may be conducted periodically, again to measure compliance with the traits.  This recipe is considered sufficient until the next time there are suspect decisions or actions by the licensee.

Bottom Line

The nuclear enterprise - NRC and industry - appears to be locked into a simplistic and linear view of safety culture.  Values and traits produce desired behaviors; desired behaviors produce appropriate safety management.  Bad results?  Go back to values and traits and retrain.  Have management reiterate that safety is their highest priority.  Put up more posters.

But what if Snowden’s concept of complex adaptive systems is really the applicable model, and the safety management system is a much more complicated, continuously self-evolving process?  It is a question well worth pondering - and one that may have far more impact than many of the hardware-centric issues currently being pursued.

Footnote: Snowden is an immensely informative and entertaining lecturer, and a large number of his lectures are available as podcasts on the Cognitive Edge website and as YouTube videos.  They could easily provide a stimulating input to safety culture training sessions.

*  Podcast available at http://cognitive-edge.com/library/more/podcasts/agile-conference-complexity-applied-to-systems-2008/. 

**  NRC Safety Culture Policy Statement (June 14, 2011).

***  INPO Definition of Safety Culture (2004).