Tuesday, November 17, 2015

Foolproof by Greg Ip: Insights for the Nuclear Industry

This book* is primarily about systemic lessons learned from the 2008 U.S. financial crisis and, to a lesser extent, various European euro crises. Some of the author’s observations also apply to the nuclear industry.

Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability.  Stability breeds complacency.**  As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events.  But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.

He sees evidence for his thesis in other fields.  For automobiles, the implementation of anti-lock braking systems leads some operators to drive more recklessly.  In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries.  For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas.  For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas.  As a result, both fires and floods can cause huge financial losses when they eventually occur.  In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit.  In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)

Ip uses the nuclear industry to illustrate how society, by implementing local responses to a perceived problem, can create larger issues elsewhere in a system.  Closing down nuclear plants after an accident (e.g., Fukushima) or because of green politics does not remove the demand for electric energy.  To the extent the demand shortfall is made up with hydrocarbons, more people will be harmed by the associated mining, drilling and processing, and the climate will be made worse.

He cites the aviation industry as an example of a system where near-misses are documented and widely shared in an effort to improve overall system safety.  He notes that the few fatal accidents that occur in commercial aviation both serve as lessons learned and keep those responsible for operating the system (pilots and controllers) on their toes.

He also makes an observation about aviation that could be applied to the nuclear industry: “It is almost impossible to improve a system that never has an accident. . . . regulators are unlikely to know whether anything they propose now will have provable benefits; it also means that accidents will increasingly be of the truly mysterious, unimaginable variety . . .” (p. 252)

Speaking of finance, Ip says “A huge part of what the financial system does is try to create the fact—and at times the illusion—of safety.  Usually, it succeeds; . . . On those rare occasions when it fails, the result is panic.” (p. 86)  Could this description also apply to the nuclear industry? 

Our Perspective

Ip’s search for systemic, dynamic factors to explain the financial crisis echoes the type of analysis we’ve been promoting for years.  Like us, he recognizes that people hold different world views of the same system.  Ip contrasts the engineers and the ecologists:  “Engineers satisfy our desire for control, . . . civilization’s needs to act, to do something, . . .” (p. 278)  Ecologists believe “it’s the nature of risk to find the vulnerabilities we missed, to hit when least expected, to exploit the very trust in safety we so assiduously cultivate with all our protection . . .” (p. 279)

Ip’s treatment of the nuclear industry, while positive, is incomplete and somewhat simplistic.  It’s really just an example, not an industry analysis.  His argument that shutting down nuclear plants exacerbates climate harm could have come from the NEI playbook.  He ignores the impact of renewables, efficiency and conservation.

He doesn’t discuss the nuclear industry’s penchant for secrecy, but we have, and we believe it feeds the public’s uncertainty about the industry’s safety.  As Ip notes, “People who crave certainty cannot tolerate even a slight increase in uncertainty, and so they flee not just the bad banks, the bad paper, and the bad country, but everything that resembles them, . . .” (p. 261)  If a system that is assumed [or promoted] to be safe has a crisis, even a local one, the result is often panic. (p. 62)

He mentions high reliability organizations (HROs) and their focus on avoiding catastrophe and on “being a little bit scared all of the time.” (p. 242)  He does not mention that some of the same systemic factors at work in the financial system also operate in the world of HROs, including exposure to the corrosive effects of complacency and system drift. (p. 242)

Bottom line: Read Foolproof if you have an interest in an intelligible assessment of the financial crisis.  And remember: “Fear serves a purpose: it keeps us out of trouble.” (p. 19)  “. . . but it can keep us from taking risks that could make us better off.” (p. 159)

*  G. Ip, Foolproof (New York: Little, Brown, 2015).  Ip is a finance and economics journalist, currently with the Wall Street Journal and previously with The Economist.

**  He quotes a great quip from Larry Summers: “Complacency is a self-denying prophecy.”  Ip adds, “If everyone worried about complacency, no one would succumb to it.” (p. 263)
