Tuesday, November 17, 2015

Foolproof by Greg Ip: Insights for the Nuclear Industry

This book* is primarily about systemic lessons learned from the 2008 U.S. financial crisis and, to a lesser extent, various European euro crises. Some of the author’s observations also apply to the nuclear industry.

Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability.  Stability breeds complacency.**  As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events.  But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.

He sees examples that evidence his thesis in other fields.  For automobiles, the implementation of anti-lock braking systems leads some operators to drive more recklessly.  In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries.  For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas.  For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas.  As a result, both fires and floods can cause huge financial losses when they eventually occur.  In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit.  In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)

Ip uses the nuclear industry to illustrate how society can create larger problems elsewhere in a system when it implements local responses to a perceived problem.  Closing down nuclear plants after an accident (e.g., Fukushima) or because of green politics does not remove the demand for electric energy.  To the extent the demand shortfall is made up with hydrocarbons, more people will be harmed by the associated mining, drilling and processing, and the climate will be made worse.

He cites the aviation industry as an example of a system where near-misses are documented and widely shared in an effort to improve overall system safety.  He notes that the few fatal accidents that occur in commercial aviation both serve as lessons learned and keep those responsible for operating the system (pilots and controllers) on their toes.

He also makes an observation about aviation that could be applied to the nuclear industry: “It is almost impossible to improve a system that never has an accident. . . . regulators are unlikely to know whether anything they propose now will have provable benefits; it also means that accidents will increasingly be of the truly mysterious, unimaginable variety . . .” (p. 252)

Speaking of finance, Ip says “A huge part of what the financial system does is try to create the fact—and at times the illusion—of safety.  Usually, it succeeds; . . . On those rare occasions when it fails, the result is panic.” (p. 86)  Could this description also apply to the nuclear industry? 

Our Perspective

Ip’s search for systemic, dynamic factors to explain the financial crisis echoes the type of analysis we’ve been promoting for years.  Like us, he recognizes that people hold different world views of the same system.  Ip contrasts the engineers and the ecologists:  “Engineers satisfy our desire for control, . . . civilization’s needs to act, to do something, . . .” (p. 278)  Ecologists believe “it’s the nature of risk to find the vulnerabilities we missed, to hit when least expected, to exploit the very trust in safety we so assiduously cultivate with all our protection . . .” (p. 279)

Ip’s treatment of the nuclear industry, while positive, is incomplete and somewhat simplistic.  It’s really just an example, not an industry analysis.  His argument that shutting down nuclear plants exacerbates climate harm could have come from the NEI playbook.  He ignores the impact of renewables, efficiency and conservation.

He doesn’t discuss the nuclear industry’s penchant for secrecy, but we have, and we believe it feeds the public’s uncertainty about the industry’s safety.  As Ip notes, “People who crave certainty cannot tolerate even a slight increase in uncertainty, and so they flee not just the bad banks, the bad paper, and the bad country, but everything that resembles them, . . .” (p. 261)  If a system that is assumed [or promoted] to be safe has a crisis, even a local one, the result is often panic. (p. 62)

He mentions high reliability organizations (HROs), which focus on avoiding catastrophe and on “being a little bit scared all of the time.” (p. 242)  He does not mention that some of the same systemic factors seen in the financial system are at work in the world of HROs, including exposure to the corrosive effects of complacency and system drift. (p. 242)

Bottom line: Read Foolproof if you have an interest in an intelligible assessment of the financial crisis.  And remember: “Fear serves a purpose: it keeps us out of trouble.” (p. 19)  “. . . but it can keep us from taking risks that could make us better off.” (p. 159)


*  G. Ip, Foolproof (New York: Little, Brown, 2015).  Ip is a finance and economics journalist, currently with the Wall Street Journal and previously with The Economist.

**  He quotes a great quip from Larry Summers: “Complacency is a self-denying prophecy.”  Ip adds, “If everyone worried about complacency, no one would succumb to it.” (p. 263)

Monday, November 2, 2015

Cultural Tidbits from McKinsey

We spent a little time poking around the McKinsey* website looking for items that could be related to safety culture and found a couple.  They do not provide any major insights but they do spur us to think of some questions for you to ponder about your own organization.

One article discussed organizational redesign** and provided a list of recommended rules, including establishing metrics that show if success is being achieved.  Following is one such metric.

“One utility business decided that the key metric for its efficiency-driven redesign was the cost of management labor as a proportion of total expenditures on labor.  Early on, the company realized that the root cause of its slow decision-making culture and high cost structure had been the combination of excessive management layers and small spans of control.  Reviewing the measurement across business units and at the enterprise level became a key agenda item at monthly leadership meetings.” (p. 107)
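
The quoted metric is simply a ratio: management labor cost divided by total labor spend.  As a rough, purely illustrative sketch (not from the McKinsey article), here is how such a measurement might be tabulated across business units in Python; the unit names and dollar figures below are hypothetical.

def management_labor_ratio(management_cost: float, total_labor_cost: float) -> float:
    """Return management labor cost as a fraction of total labor spend."""
    if total_labor_cost <= 0:
        raise ValueError("total labor cost must be positive")
    return management_cost / total_labor_cost

# Hypothetical business-unit figures, in millions of dollars:
# (management labor cost, total labor cost)
units = {
    "Generation": (12.0, 95.0),
    "Transmission": (8.5, 60.0),
    "Distribution": (15.0, 140.0),
}

for name, (mgmt, total) in units.items():
    print(f"{name}: {management_labor_ratio(mgmt, total):.1%} of labor spend goes to management")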

What percent of total labor dollars does your organization spend on “management”?  Could your organization’s decision making be sped up without sacrificing quality or safety?  Would your organization prefer the “right” decision (even if it takes a long time to develop), or no decision at all, rather than risk announcing a “wrong” one?

A second article discussed management actions to encourage employees to take a longer view,*** including clearly identifying and prioritizing organizational values.  Following is an example of an action related to values.

“The pilots of one Middle East–based airline frequently write incident reports that candidly raise concerns, questions, and observations about potential hazards.  The reports are anonymous and circulate internally, so that pilots can learn from one another and improve—say, in handling a particularly tricky approach at an airport or dealing with a safety procedure.  The resulting conversations reinforce the safety culture of this airline and the high value it places on collaboration.  Moreover, by making sure that the reporting structures aren’t punitive, the airline’s executives get better information and can focus their attention where it’s most needed.”

How do your operators and other professionals share experiences and learning opportunities among themselves at your site?  How about throughout your fleet?  Does documenting anything that might be construed as weakness require management review or approval?  Is management (or the overall organization) so fearful of such information being seen by regulators or the public, or discovered by lawyers, that the information is effectively suppressed?  Is your organization paranoid or just applying good business sense?  Do you have a culture that would pass muster as “just”?

Our Perspective


Useful nuggets on management or culture are where you find them.  Others’ experiences can stimulate questions; the answers can help you better understand local organizational phenomena, align your efforts with the company’s needs and build your professional career.


*  McKinsey & Company is a worldwide management consulting firm.


**  S. Aronowitz et al., “Getting organizational redesign right,” McKinsey Quarterly, no. 3 (2015), pp. 99-109.

***  T. Gibbs et al., “Encouraging your people to take the long view,” McKinsey Quarterly (Sept. 2012).