Ethics* are rules for conduct or behavior. They are basically a social (as opposed to an individual or psychological) construct and part of group or organizational culture. They can be very specific do’s and don’ts or more general guidelines for behavior.
The Ethics Resource Center (ERC) conducts and publishes regular employee surveys on the degree of non-compliance with workplace ethics in the United States. The surveys focus on instances of misconduct observed by workers—what occurred, who did it, who reported it (if anyone) and what happened to the reporter.
The survey uses a random sample of employees in the for-profit sector. The 2013 survey** ended up with over 6,000 useable responses. There is no indication how many respondents, if any, work in the commercial nuclear industry.
The overall findings are interesting. On the positive side, “Observed misconduct*** is down for the third report in a row and is now at a historic low; the decline in misconduct is widespread; and the percentage of workers who said they felt pressure to compromise standards also fell substantially.” (p.12)
But problems persist. “Workers reported that 60 percent of misconduct involved someone with managerial authority from the supervisory level up to top management. Nearly a quarter (24 percent) of observed misdeeds involved senior managers. Perhaps equally troubling, workers said that 26 percent of misconduct is ongoing within their organization. About 12 percent of wrongdoing was reported to take place company-wide.” (ibid.)
The reporting of misconduct problems has both good and bad news. Lots of workers (63%) who observed misconduct reported it but 21% of those who reported misconduct said they experienced retaliation**** in return. (p. 13)
The report goes on to examine the details behind the summary results and attempts to assign some possible causes to explain observed trends. For example, the authors believe it’s probable that positive trends are related to companies’ ethics and compliance programs that create new norms for worker conduct, i.e., a stronger culture. (p. 16) And a stronger culture is desirable. Returning to the survey, “In 2013, one in five workers (20 percent) reported seeing misconduct in companies where cultures are ‘strong’ compared to 88 percent who witnessed wrongdoing in companies with the weakest cultures.” (p. 18)
The keys to building a stronger ethical culture are familiar to Safetymatters readers: top-level role models, and support by immediate supervisors and peers to do the right thing. In terms of cultural artifacts, a stronger ethical culture is visible in an organization’s processes for training, personnel evaluation and application of employee discipline.
The report goes on to analyze misconduct in depth—who is doing it, what are they doing and how long it has been going on. The authors cover how and why employees report misconduct and suggest ways to increase the reporting rate. They note that increased legal protection for whistleblowers has increased the likelihood that covered workers will report misconduct.
Our Perspective
This report is worth a read. Quite frankly, more workers are willing to report misconduct than I would have predicted. The percentage of reporters who perceive retaliation is disappointing but hardly surprising.
The survey results are more interesting than the explanatory analysis; a reader should keep in mind that this research was conducted by a group that has a vested self-interest in finding the "correct" answers.
Because specific firms and industries are not identified, it’s easy to blow off the results with a flip “Didn’t happen here and can’t happen here because we have a robust SCWE and ECP.” I suggest such parochial reviewers keep in mind that “Pride goes before destruction, and a haughty spirit before a fall.”*****
* Ethics and morals are often used interchangeably but it’s helpful to consider morals as an individual construct, a person’s inner principles of right and wrong. See diffen.com for a more detailed comparison.
** Ethics Resource Center, “2013 National Business Ethics Survey of the U.S. Workforce” (Arlington, VA: 2014). Corporate sponsors include firms familiar to nuclear industry participants, e.g., Bechtel and Edison International.
*** The survey identified 28 specific types of misconduct. Some of interest to the nuclear industry, listed in the order of frequency of occurrence in the survey responses, include abusive behavior or behavior that creates a hostile work environment, lying to employees, discriminating against employees, violations of health or safety regulations, lying to the public, retaliation against someone who has reported misconduct, abusing substances at work, sexual harassment, violation of environmental regulations and falsifying books and/or records. (pp. 41-42)
**** The survey also identified 13 specific types of retaliation experienced by whistleblowers including being ignored or treated differently by supervisors or other employees, being excluded from decisions, verbal abuse, not receiving promotions or raises, reduced hours or pay, relocation or reassignment, harassment at home or online and physical harm to one’s person or property. (p. 45)
***** Proverbs 16:18, Bible (English Standard Version).
Monday, January 25, 2016
IAEA Urges Stronger Nuclear Safety Culture in Japan
[Photo: Fukushima]
The IAEA review team praised the NRA’s progress in various areas, such as demonstrating independence and transparency, and made suggestions and recommendations for further improvement, primarily in the area of NRA staff recruiting and development.
The IAEA team also mentioned safety culture (SC), recommending “the NRA and nuclear licensees ‘continue to strengthen the promotion of safety culture, including by fostering a questioning attitude’.”*
Our Perspective
We look forward to the IAEA’s final report which is due in about three months. We are especially interested in seeing if there is comprehensive discussion and specific direction with respect to “fostering a questioning attitude.” The Japanese nuclear industry in general and TEPCO (Fukushima’s owner) in particular certainly need to cultivate employees’ willingness to develop and consider open-ended questions such as “what if?” and “what can go wrong?”
More importantly, they also need to instill the necessary backbone to stand up in front of the bosses and ask tough questions and demand straight answers. Lots of folks probably knew the Fukushima seawall wasn’t high enough and the emergency equipment in the basement was subject to flooding but everyone went along with the program. That’s what has to change to create a stronger SC.
* “IAEA praises reform of Japan's nuclear regulator,” World Nuclear News (Jan. 22, 2016).
Posted by Lewis Conner. Labels: Fukushima, IAEA, Safety Culture
Sunday, January 17, 2016
A Nuclear Safety Culture for Itinerant Workers
IAEA has published “Radiation Protection of Itinerant Workers,”* a report that describes management responsibilities and practices to protect and monitor itinerant workers who are exposed to ionizing radiation. “Itinerant workers” are people who work at different locations “and are not employees of the management of the facility where they are working. The itinerant workers may be self-employed or employed by a contractor . . .” (p. 4) In the real world, such employees have many different names including nuclear nomads, glow boys and jumpers.
The responsibility for itinerant workers’ safety and protection is shared among various organizations and the individual. “The primary responsibility for the protection of workers lies with the management of the operating organization responsible for the facilities . . . however, the employer of the worker (as well as the worker) also bear certain responsibilities.” (p. 2)
Safety culture (SC) is specifically mentioned in the IAEA report. One basic management responsibility is to promote and maintain a robust SC at all organizational levels. (p. 11) Specific responsibilities include providing general training in SC behavior and expectations (p. 131) and, where observation or problems reveal specific needs, targeted individual (or small group) SC training. (p. 93)
Our Perspective
This publication is hardly a great victory for SC; the report provides only the most basic description of the SC imperative. Its major contribution is that it recognizes that itinerant nuclear workers deserve the same safety and protection considerations as other workers at a nuclear facility.
Back in the bad old days, I was around nuclear organizations where their own employees represented the highest social class, contractors were regarded as replaceable parts, and nomadic workers were not exactly expendable but were considered more responsible for managing their own safety and exposure than permanent personnel.
One can make some judgment about a society’s worth by observing how it treats its lowest status members—the poor, the homeless, the refugee, the migrant worker. Nuclear itinerant workers deserve to be respected and treated like the other members of a facility’s team.
* International Atomic Energy Agency, “Radiation protection of itinerant workers,” Safety reports series no. 84 (Vienna, 2015).
Posted by Lewis Conner. Labels: Management, References, Safety Culture
Sunday, January 10, 2016
Targeted Safety Culture Assessment at Columbia Generating Station
[Photo: Columbia Generating Station]
Energy Northwest (CGS’s owner) opted for the NRC’s Alternative Dispute Resolution (ADR) process. The agreed-upon corrective actions and penances are typical for ADR settlements: conduct a common cause evaluation, install new cameras and increase supervision if the cameras aren’t working, revise and present training, prepare a statement on willful misconduct’s consequences and have personnel sign it, prepare a "lessons learned" presentation for plant personnel and an industry gathering (aka public atonement), revise procedures and conduct a targeted nuclear safety culture (SC) assessment of the security organization at CGS. Oh, and pay a $35K fine.
Our Perspective
The security SC assessment caught our eye because it is being conducted by a law firm, not a culture assessment specialist. Maybe that’s because the subject is security-related, therefore sensitive, and this approach will ensure the report will never be made public. It also ensures that the report will focus on the previously identified “bad apples” (who no longer work at the plant) and the agreed-upon ADR actions; the assessment will not raise any awkward management or systemic issues.
* A. Cary, “Energy Northwest pays fine over Richland nuclear security,” Tri-City Herald (Jan. 5, 2015).
** A. Vegel (NRC) to M. Reddermann (Energy NW), Columbia Generating Station – NRC Security Inspection Report 05000397/2015405 and NRC Investigation Report No. 4-2014-009 (June 25, 2015). ADAMS ML15176A599. M. Dapas (NRC) to M. Reddermann (Energy NW), Confirmatory Order – NRC Security Inspection Report 05000397/2015407 and NRC Investigation Report 4-2014-009, Columbia Generating Station (Sept. 28, 2015). ADAMS ML15271A078.
Posted by Lewis Conner. Labels: Assessment, Safety Culture
Monday, January 4, 2016
How Top Management Decisions Shape Culture
A brief article* in the December 2015 The Atlantic magazine asks “What was VW thinking?” then reviews a few classic business cases to show how top management decisions, often made by the CEO, can percolate down through an organization, sometimes with appalling results. The author also describes a couple of mechanisms by which bad decision making can be institutionalized. We’ll start with the cases.
Johnson & Johnson had a long-standing credo that outlined its responsibilities to those who used its products. In 1979, the CEO reinforced the credo’s relevance to J&J’s operations. When poisoned Tylenol showed up in stores, J&J did not hesitate to recall product, warn people against taking Tylenol and absorb a $100 million hit. This is often cited as an example of a corporation doing the right thing.
B. F. Goodrich promised an Air Force contractor an aircraft brake that was ultralight and ultracheap. The only problem was that it didn’t work; in fact, it melted. Only by massively finagling the test procedures and falsifying test results did they get the brake qualified. The Air Force discovered the truth when they reviewed the raw test data. A Goodrich whistleblower announced his resignation over the incident but was quickly fired by the company.
Ford President Lee Iacocca wanted the Pinto to be light, inexpensive and available in 25 months. The gas tank’s position made the vehicle susceptible to fire when the car was rear-ended but repositioning the gas tank would have delayed the roll-out schedule. Ford delayed addressing the problem, resulting in at least one costly lawsuit and bad publicity for the company.
With respect to institutional mechanisms, the author reviews Diane Vaughan’s normalization of deviance and how it led to the space shuttle Challenger disaster. To promote efficiency, organizations adopt scripts that tell members how to handle various situations. Scripts provide a rationale for decisions, which can sometimes be the wrong decisions. In Vaughan’s view, scripts can “expand like an elastic waistband” to accommodate more and more deviation from standards or norms. Scripts are important organizational culture artifacts. We have often referred to Vaughan’s work on Safetymatters.
The author closes with a quote: “Culture starts at the top, . . . Employees will see through empty rhetoric and will emulate the nature of top-management decision making . . . ” The speaker? Andrew Fastow, Enron’s former CFO and former federal prison inmate.
Our Perspective
I used to use these cases when I was teaching ethics to business majors at a local university. Students would say they would never do any of the bad stuff. I said they probably would, especially once they had mortgages (today it’s student debt), families and career aspirations. It’s hard to put up a fight when the organization has so accepted the script they actually believe they are doing the right thing. And don’t even think about being a whistleblower unless you’ve got money set aside and a good lawyer lined up.
Bottom line: This is worth a quick read. It illustrates the importance of senior management’s decisions as opposed to its sloganeering or other empty leadership behavior.
* J. Useem, “What Was Volkswagen Thinking? On the origins of corporate evil—and idiocy,” The Atlantic (Dec. 2015), pp.26-28.
Posted by Lewis Conner. Labels: Decisions, Normalization of Deviance, Vaughan, VW
Saturday, December 26, 2015
NRC IG Reviews DNFSB Organizational Culture and Climate
Summary of Methods and Results
The study’s methodology is familiar: Review relevant past reports, develop a survey instrument based on employee interviews and focus groups, administer the survey to all employees and interpret the results.
Themes (issues, shortcomings) brought up during the interviews included DNFSB’s handling of change management, communication, personnel development, leadership, internal procedures and performance management (aka personal recognition). (pp. 6-7)
The report compared the DNFSB survey results with three external norms: a cross-section of U.S. industry, U.S. employees working in Research and Development, and industries that have experienced significant changes with widespread employee impact. The last group consists of organizations under stress because of reorganization, bankruptcy, layoffs, etc. (p. 14)
The report’s summary is not encouraging: “the general trend shows an unfavorable comparison for the DNFSB on all three external benchmarks, . . . Also, many employees feel they do not have the right tools and resources. Along with that, 38 percent of employees say they plan to leave DNFSB in the next year.” (p. 4)
The employee survey had 14 categories; higher scores mean greater respondent agreement with positive traits. Analyzing the survey responses in three different dimensions yielded one typical and two unusual results. In our opinion, they suggest uneven DNFSB management effectiveness across the organization.
Across organizational groups, the General Manager and Admin/ Support groups scored above DNFSB averages on most categories; the Technical Director and Engineering groups scored below DNFSB averages on most categories. (p. 13) In our experience, this is no surprise; bosses and admin people are usually more satisfied (or less dissatisfied) than the folks who have to get the work done.
Looking at employee tenure, employees with the shortest tenure scored the highest (this is typical) then the scores go downhill. The longest tenured employees have the lowest scores, which is unusual; most organizations have a U-shaped curve, with newcomers and old timers the most satisfied. (p. 14)
By pay (GS or DN) level, “what is atypical is that the lowest-scoring group is not the lowest-level group, but instead the mid-level group, . . .” (p. 15)
The report identifies Sustainable Engagement (SE)** as a key category. Using regression analysis, the authors identified five drivers (other survey categories) of SE, two that had acceptable survey scores and three that are candidates for organizational improvement interventions: communication, leadership and performance management. (p. 17) This is as close as the report comes to suggesting what the DNFSB might actually do about its problems.
Our Perspective
This report recognizes that DNFSB has significant challenges but it contains zero surprises. It’s not even news. The same or similar ground was covered by a Dec. 2014 organizational study performed for the DNFSB which we reviewed on Feb. 6, 2015.
Problems mentioned in the 2014 report include board dysfunctionality, communications, performance recognition, change management, frequent disruptive organizational changes, and the lack of management and leadership competence. The 2014 report included extensive discussion of possible organizational interventions and other corrective actions.
The NRC IG already knew change management was a serious challenge facing the DNFSB; it was mentioned in an Oct. 2014 IG report.*** That report was likely the impetus for this 2015 study.
The DNFSB has been in apparent disarray for over a year. New members have been appointed to the Board this year, including a new chairman. It remains to be seen whether they can address the internal challenges and, more importantly, provide meaningful recommendations to their single client, the U.S. Department of Energy.
Bottom line: This NRC IG consultant’s report adds little value to understanding the DNFSB’s organizational issues or developing effective corrective actions.
* Towers Watson, “DNFSB 2015 Culture and Climate Survey: Executive Overview of Key Findings” (Aug. 2015). ADAMS ML15245A515. Thanks to John Hockert for publicizing this report on the LinkedIn Nuclear Safety Culture forum.
** Sustainable Engagement is defined as follows: “Assesses the level of DNFSB employees’ connection to the organization, marked by being proud to work at DNFSB, committing effort to achieve the goals (being engaged) having an environment that support productivity (being enabled) and maintaining personal well-being (feeling energized).” (p. 9)
*** H.T. Bell (NRC) to Chairman Winokur (DNFSB), “Inspector General’s Assessment of the Most Serious Management and Performance Challenges Facing the Defense Nuclear Facilities Safety Board,” DNFSB-OIG-15-A-01 (Oct. 1, 2014). ADAMS ML14274A247.
Posted by Lewis Conner
Sunday, December 20, 2015
Fukushima and Volkswagen: Systemic Similarities and Observations for the U.S. Nuclear Industry
[Photos: Fukushima; VW logo (Source: Wikipedia)]
An Accommodating Regulator
The Japanese nuclear regulator did not provide effective oversight of Tokyo Electric Power Co. One aspect of this was TEPCO’s relative power over the regulator because of TEPCO’s political influence at the national level. This was a case of complete regulatory capture.
The German auto regulator doesn’t provide effective oversight either. “[T]he regulatory agency for motor vehicles in Germany is deliberately starved for resources by political leaders eager to protect the country’s powerful automakers, . . .” (NYT 12-9-15) This looks more like regulatory impotence than capture but the outcome is the same.
In the U.S., critics have accused the NRC of being captured by industry. We disagree but have noted that the regulator and licensees working together over long periods of time, even across the table, can lead to familiarity, common language and indiscernible mutual adjustments.
Deference to Senior Managers
Traditionally in Japan, people in senior positions are treated as if they have the right answers, no matter what the facts facing a lower-ranking employee might suggest. Members of society go along to get along. As we said in an Aug. 7, 2014 post, “harmony was so valued that no one complained that Fukushima site protection was clearly inadequate and essential emergency equipment was exposed to grave hazards.”
The Volkswagen culture was different but had the same effect. The CEO managed through fear. At VW, “subordinates were fearful of contradicting their superiors and were afraid to admit failure.” A former CEO “was known for publicly dressing down subordinates . . .” (NYT 12-13-15)
In the U.S., INPO’s single-minded focus on the unrivaled importance of leadership can, if practiced by the wrong kind of people, lead to the suppression of dissent, of facts that contradict the party line, and of the questioning attitude that is vital to maintaining safe facilities.
Companies Not Responsible to All Legitimate Stakeholders
In the Fukushima plant design, TEPCO gave short shrift to local communities, their citizens, governments and first responders, ultimately exposing them to profound hazards. TEPCO’s behavior also impacted the international nuclear power community, where any significant incident at one operator is a problem for them all.
Volkswagen’s isolation from public responsibilities is facilitated by its structure. Only 12% of the company is held by independent shareholders. Like other large German companies, the labor unions hold half the seats on VW’s board. Two more seats are held by the regional government (a minority owner) which in practice cannot vote against labor. So the union effectively controls the board. (NYT 12-13-15)
We have long complained about the obsessive secrecy practiced by the U.S. nuclear industry, particularly in its relations with its self-regulator, INPO. It is not a recipe for building trust and confidence with the public, an affected and legitimate stakeholder.
Our Perspective
The TEPCO safety culture (SC) was unacceptably weak. And its management culture simply ignored inconvenient facts.
Volkswagen’s culture has valued technical competence and ambition, and apparently has lower regard for regulations (esp. foreign, i.e., U.S. ones) and other rules of the game.
We are not saying the gross problems of either company infect the U.S. nuclear industry. But the potential is there. The industry has experienced events that suggest the presence of human, technical and systemic shortcomings. For a general illustration of inadequate management effectiveness, look at Entergy’s series of SC problems. For a specific case, remember Davis-Besse, where favoring production over safety took the plant to the brink of a significant failure. Caveat nuclear.
* See, for example: J. Ewing and G. Bowley, “The Engineering of Volkswagen’s Aggressive Ambition,” New York Times (Dec. 13, 2015). J. Ewing, “Volkswagen Terms One Emissions Problem Smaller Than Expected,” New York Times (Dec. 9, 2015).
Posted by Lewis Conner. Labels: Fukushima, Regulatory Capture, VW
Tuesday, November 17, 2015
Foolproof by Greg Ip: Insights for the Nuclear Industry
This book* is primarily about systemic lessons learned from the 2008 U.S. financial crisis and, to a lesser extent, various European euro crises. Some of the author’s observations also apply to the nuclear industry.
Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability. Stability breeds complacency.** As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events. But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.
He sees examples that evidence his thesis in other fields. For automobiles, the implementation of anti-lock braking systems leads some operators to drive more recklessly. In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries. For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas. For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas. As a result, both fires and floods can have huge financial losses when they eventually occur. In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit. In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)
Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability. Stability breeds complacency.** As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events. But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.
He sees evidence for his thesis in other fields. For automobiles, the implementation of anti-lock braking systems leads some operators to drive more recklessly. In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries. For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas. For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas. As a result, both fires and floods can cause huge financial losses when they eventually occur. In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit. In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)
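The feedback loop described above can be sketched as a toy system-dynamics model (our illustration, not from the book; the parameters and update rules are purely notional). Protection raises perceived safety, perceived safety raises risk-taking, and hidden risk quietly accumulates until it discharges all at once:

```python
# Toy model of Ip's feedback loop: protection raises perceived safety,
# perceived safety raises risk-taking, and accumulated hidden risk
# eventually discharges as a "crisis." All parameters are illustrative.

def simulate(steps=100, protection=0.5, threshold=5.0):
    """Return the step at which accumulated hidden risk first exceeds
    the threshold (the 'crisis'), or None if it never does."""
    perceived_safety = 0.0
    hidden_risk = 0.0
    for t in range(steps):
        # Protection makes the system feel safer each period...
        perceived_safety += protection * (1.0 - perceived_safety)
        # ...and the safer it feels, the more risk actors take on.
        risk_taking = perceived_safety
        hidden_risk += 0.1 * risk_taking
        if hidden_risk > threshold:
            return t  # crisis: the accumulated risk discharges at once
    return None

# Stronger protection speeds up the feel-safe feedback, so the
# crisis arrives sooner -- stability breeds complacency.
heavily_protected = simulate(protection=0.9)
lightly_protected = simulate(protection=0.1)
```

In this sketch the heavily protected system reaches its crisis earlier than the lightly protected one, which is the counterintuitive point of Ip's examples: the safety measure itself accelerates the risk-taking that undoes it.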
Ip uses the nuclear industry to illustrate how society can create larger issues elsewhere in a system when it implements local responses to a perceived problem. Closing down nuclear plants after an accident (e.g., Fukushima) or because of green politics does not remove the demand for electric energy. To the extent the demand shortfall is made up with hydrocarbons, additional people will suffer from doing the mining, drilling, processing, etc., and the climate will be made worse.
He cites the aviation industry as an example of a system where near-misses are documented and widely shared in an effort to improve overall system safety. He notes that the few fatal accidents that occur in commercial aviation serve both as lessons learned and keep those responsible for operating the system (pilots and controllers) on their toes.
He also makes an observation about aviation that could be applied to the nuclear industry: “It is almost impossible to improve a system that never has an accident. . . . regulators are unlikely to know whether anything they propose now will have provable benefits; it also means that accidents will increasingly be of the truly mysterious, unimaginable variety . . .” (p. 252)
Speaking of finance, Ip says “A huge part of what the financial system does is try to create the fact—and at times the illusion—of safety. Usually, it succeeds; . . . On those rare occasions when it fails, the result is panic.” (p. 86) Could this description also apply to the nuclear industry?
Our Perspective
Ip’s search for systemic, dynamic factors to explain the financial crisis echoes the type of analysis we’ve been promoting for years. Like us, he recognizes that people hold different world views of the same system. Ip contrasts the engineers and the ecologists: “Engineers satisfy our desire for control, . . . civilization’s needs to act, to do something, . . .” (p. 278) Ecologists believe “it’s the nature of risk to find the vulnerabilities we missed, to hit when least expected, to exploit the very trust in safety we so assiduously cultivate with all our protection . . .” (p. 279)
Ip’s treatment of the nuclear industry, while positive, is incomplete and somewhat simplistic. It’s really just an example, not an industry analysis. His argument that shutting down nuclear plants exacerbates climate harm could have come from the NEI playbook. He ignores the impact of renewables, efficiency and conservation.
He doesn’t discuss the nuclear industry’s penchant for secrecy, but we have and believe it feeds the public’s uncertainty about the industry's safety. As Ip notes, “People who crave certainty cannot tolerate even a slight increase in uncertainty, and so they flee not just the bad banks, the bad paper, and the bad country, but everything that resembles them, . . .” (p. 261) If a system that is assumed [or promoted] to be safe has a crisis, even a local one, the result is often panic. (p. 62)
He mentions high reliability organizations (HROs) and their focus on avoiding catastrophe and on “being a little bit scared all of the time.” (p. 242) He does not mention that some of the same systemic factors seen in the financial system are at work in the world of HROs, including exposure to the corrosive effects of complacency and system drift.
Bottom line: Read Foolproof if you have an interest in an intelligible assessment of the financial crisis. And remember: “Fear serves a purpose: it keeps us out of trouble.” (p. 19) “. . . but it can keep us from taking risks that could make us better off.” (p. 159)
* G. Ip, Foolproof (New York: Little, Brown, 2015). Ip is a finance and economics journalist, currently with the Wall Street Journal and previously with The Economist.
** He quotes a great quip from Larry Summers: “Complacency is a self-denying prophecy.” Ip adds, “If everyone worried about complacency, no one would succumb to it.” (p. 263)
Posted by Lewis Conner

Labels: Mental Model, References, System Dynamics, Systems View