Monday, January 25, 2016

IAEA Urges Stronger Nuclear Safety Culture in Japan

The International Atomic Energy Agency (IAEA) recently completed a peer review of Japan's Nuclear Regulation Authority (NRA), a regulatory agency established in the aftermath of the 2011 Fukushima disaster.  Highlights of the review were discussed at an IAEA press conference.*

The IAEA review team praised the NRA’s progress in various areas, such as demonstrating independence and transparency, and made suggestions and recommendations for further improvement, primarily in the area of NRA staff recruiting and development.

The IAEA team also mentioned safety culture (SC), recommending “the NRA and nuclear licensees ‘continue to strengthen the promotion of safety culture, including by fostering a questioning attitude’.”

Our Perspective

We look forward to the IAEA’s final report, which is due in about three months.  We are especially interested in seeing whether it contains comprehensive discussion and specific direction with respect to “fostering a questioning attitude.”  The Japanese nuclear industry in general, and TEPCO (Fukushima’s owner) in particular, certainly need to cultivate employees’ willingness to develop and consider open-ended questions such as “What if?” and “What can go wrong?”

More importantly, they also need to instill the necessary backbone to stand up in front of the bosses and ask tough questions and demand straight answers.  Lots of folks probably knew the Fukushima seawall wasn’t high enough and the emergency equipment in the basement was subject to flooding but everyone went along with the program.  That’s what has to change to create a stronger SC.


*  “IAEA praises reform of Japan's nuclear regulator,” World Nuclear News (Jan. 22, 2016).

Sunday, January 17, 2016

A Nuclear Safety Culture for Itinerant Workers

IAEA has published “Radiation Protection of Itinerant Workers,”* a report that describes management responsibilities and practices to protect and monitor itinerant workers who are exposed to ionizing radiation.  “Itinerant workers” are people who work at different locations “and are not employees of the management of the facility where they are working. The itinerant workers may be self-employed or employed by a contractor . . .” (p. 4)  In the real world, such employees go by many different names, including nuclear nomads, glow boys and jumpers.

The responsibility for itinerant workers’ safety and protection is shared among various organizations and the individual.  “The primary responsibility for the protection of workers lies with the management of the operating organization responsible for the facilities . . . however, the employer of the worker (as well as the worker) also bear certain responsibilities.” (p. 2)

Safety culture (SC) is specifically mentioned in the IAEA report.  One basic management responsibility is to promote and maintain a robust SC at all organizational levels. (p. 11)  Specific responsibilities include providing general training in SC behavior and expectations (p. 131) and, where observation or problems reveal specific needs, targeted individual (or small group) SC training. (p. 93)

Our Perspective

This publication is hardly a great victory for SC; the report provides only the most basic description of the SC imperative.  Its major contribution is that it recognizes that itinerant nuclear workers deserve the same safety and protection considerations as other workers at a nuclear facility. 

Back in the bad old days, I was around nuclear organizations where direct employees represented the highest social class, contractors were regarded as replaceable parts, and nomadic workers were not exactly expendable but were considered more responsible for managing their own safety and exposure than permanent personnel.

One can make some judgment about a society’s worth by observing how it treats its lowest status members—the poor, the homeless, the refugee, the migrant worker.  Nuclear itinerant workers deserve to be respected and treated like the other members of a facility’s team.


*  International Atomic Energy Agency, “Radiation protection of itinerant workers,” Safety reports series no. 84 (Vienna, 2015).

Sunday, January 10, 2016

Targeted Safety Culture Assessment at Columbia Generating Station

The Columbia Generating Station (CGS) got into trouble with the NRC when two members of the security department were found to have been willfully inattentive to their duties on multiple occasions over three years (2012-2014).  What they were doing was not disclosed because it was a security-related matter.  The situation was summarized in a recent newspaper article,* and the relevant NRC documents** provide some additional details.

Energy Northwest (CGS’s owner) opted for the NRC’s Alternative Dispute Resolution (ADR) process.  The agreed-upon corrective actions and penances are typical for ADR settlements: conduct a common cause evaluation, install new cameras and increase supervision if the cameras aren’t working, revise and present training, prepare a statement on the consequences of willful misconduct and have personnel sign it, prepare a "lessons learned" presentation for plant personnel and an industry gathering (aka public atonement), revise procedures, and conduct a targeted nuclear safety culture (SC) assessment of the security organization at CGS.  Oh, and pay a $35K fine.

Our Perspective

The security SC assessment caught our eye because it is being conducted by a law firm, not a culture assessment specialist.  Maybe that’s because the subject is security-related, therefore sensitive, and this approach will ensure the report will never be made public.  It also ensures that the report will focus on the previously identified “bad apples” (who no longer work at the plant) and the agreed-upon ADR actions; the assessment will not raise any awkward management or systemic issues.


*  A. Cary, “Energy Northwest pays fine over Richland nuclear security,” Tri-City Herald (Jan. 5, 2016).

**  A. Vegel (NRC) to M. Reddermann (Energy NW), Columbia Generating Station – NRC Security Inspection Report 05000397/2015405 and NRC Investigation Report No. 4-2014-009 (June 25, 2015).  ADAMS ML15176A599.  M. Dapas (NRC) to M. Reddermann (Energy NW), Confirmatory Order – NRC Security Inspection Report 05000397/2015407 and NRC Investigation Report 4-2014-009, Columbia Generating Station (Sept. 28, 2015).  ADAMS ML15271A078.

Monday, January 4, 2016

How Top Management Decisions Shape Culture

A brief article* in the December 2015 issue of The Atlantic asks “What was VW thinking?” and then reviews a few classic business cases to show how top management decisions, often made by the CEO, can percolate down through an organization, sometimes with appalling results.  The author also describes a couple of mechanisms by which bad decision making can be institutionalized.  We’ll start with the cases.

Johnson & Johnson had a long-standing credo that outlined its responsibilities to those who used its products.  In 1979, the CEO reinforced the credo’s relevance to J&J’s operations.  When poisoned Tylenol showed up in stores, J&J did not hesitate to recall product, warn people against taking Tylenol and absorb a $100 million hit.  This is often cited as an example of a corporation doing the right thing. 

B. F. Goodrich promised an Air Force contractor an aircraft brake that was ultralight and ultracheap.  The only problem was that it didn’t work; in fact, it melted.  Only by massively finagling the test procedures and falsifying test results did the company get the brake qualified.  The Air Force discovered the truth when it reviewed the raw test data.  A Goodrich whistleblower announced his resignation over the incident but was quickly fired by the company.

Ford President Lee Iacocca wanted the Pinto to be light, inexpensive and available in 25 months.  The gas tank’s position made the vehicle susceptible to fire when the car was rear-ended, but repositioning the tank would have delayed the roll-out schedule.  Ford delayed addressing the problem, resulting in at least one costly lawsuit and bad publicity for the company.

With respect to institutional mechanisms, the author reviews Diane Vaughan’s normalization of deviance and how it led to the space shuttle Challenger disaster.  To promote efficiency, organizations adopt scripts that tell members how to handle various situations.  Scripts provide a rationale for decisions, which can sometimes be the wrong decisions.  In Vaughan’s view, scripts can “expand like an elastic waistband” to accommodate more and more deviation from standards or norms.  Scripts are important organizational culture artifacts.  We have often referred to Vaughan’s work on Safetymatters.

The author closes with a quote: “Culture starts at the top, . . . Employees will see through empty rhetoric and will emulate the nature of top-management decision making . . . ”  The speaker?  Andrew Fastow, Enron’s former CFO and former federal prison inmate.

Our Perspective

I used to use these cases when I was teaching ethics to business majors at a local university.  Students would say they would never do any of the bad stuff.  I said they probably would, especially once they had mortgages (today it’s student debt), families and career aspirations.  It’s hard to put up a fight when the organization has so thoroughly accepted the script that people actually believe they are doing the right thing.  And don’t even think about being a whistleblower unless you’ve got money set aside and a good lawyer lined up.

Bottom line: This is worth a quick read.  It illustrates the importance of senior management’s decisions as opposed to its sloganeering or other empty leadership behavior.


*  J. Useem, “What Was Volkswagen Thinking?  On the origins of corporate evil—and idiocy,” The Atlantic (Dec. 2015), pp. 26-28.

Saturday, December 26, 2015

NRC IG Reviews DNFSB Organizational Culture and Climate

The Nuclear Regulatory Commission Inspector General (IG) provides IG services to the Defense Nuclear Facilities Safety Board (DNFSB), an independent government agency.  The DNFSB organizational culture and climate study* reviewed here was performed for the NRC IG by an outside consultant.

Summary of Methods and Results

The study’s methodology is familiar: Review relevant past reports, develop a survey instrument based on employee interviews and focus groups, administer the survey to all employees and interpret the results.

Themes (issues, shortcomings) brought up during the interviews included DNFSB’s handling of change management, communication, personnel development, leadership, internal procedures and performance management (aka personal recognition). (pp. 6-7)

The report compared the DNFSB survey results with three external norms: a cross-section of U.S. industry, U.S. employees working in Research and Development, and industries that have experienced significant changes with widespread employee impact.  The last group consists of organizations under stress because of reorganization, bankruptcy, layoffs, etc. (p. 14)

The report’s summary is not encouraging: “the general trend shows an unfavorable comparison for the DNFSB on all three external benchmarks, . . . Also, many employees feel they do not have the right tools and resources.  Along with that, 38 percent of employees say they plan to leave DNFSB in the next year.” (p. 4)

The employee survey had 14 categories; higher scores mean greater respondent agreement with positive traits.  Analyzing the survey responses along three different dimensions yielded one typical and two unusual results.  In our opinion, they suggest uneven DNFSB management effectiveness across the organization.

Across organizational groups, the General Manager and Admin/Support groups scored above DNFSB averages on most categories; the Technical Director and Engineering groups scored below DNFSB averages on most categories. (p. 13)  In our experience, this is no surprise; bosses and admin people are usually more satisfied (or less dissatisfied) than the folks who have to get the work done.

Looking at employee tenure, employees with the shortest tenure scored the highest (this is typical), then scores decline steadily.  The longest-tenured employees have the lowest scores, which is unusual; most organizations have a U-shaped curve, with newcomers and old-timers the most satisfied. (p. 14)

By pay (GS or DN) level, “what is atypical is that the lowest-scoring group is not the lowest-level group, but instead the mid-level group, . . .” (p. 15)

The report identifies Sustainable Engagement (SE)** as a key category.  Using regression analysis, the authors identified five drivers (other survey categories) of SE: two with acceptable survey scores and three that are candidates for organizational improvement interventions, namely communication, leadership and performance management. (p. 17)  This is as close as the report comes to suggesting what the DNFSB might actually do about its problems.
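For readers who want to see the mechanics, below is a minimal sketch of how such a driver analysis typically works: regress each respondent’s SE score on his or her scores in the other survey categories, then rank the categories by the size of their coefficients.  The data and weights in the sketch are invented for illustration; the category names merely echo ones mentioned in the report and the numbers are not taken from it.

```python
# Hypothetical sketch of a "driver" analysis for Sustainable Engagement (SE).
# Invented data: each row is one respondent's standardized category scores.
import numpy as np

rng = np.random.default_rng(0)
categories = ["communication", "leadership", "performance_mgmt",
              "tools_resources", "supervision"]

n = 200                                          # hypothetical respondents
X = rng.normal(size=(n, len(categories)))        # category scores
true_weights = np.array([0.6, 0.5, 0.4, 0.2, 0.1])
se = X @ true_weights + rng.normal(scale=0.5, size=n)  # SE scores

# Ordinary least squares fit with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, se, rcond=None)

# The categories with the largest coefficients are the "drivers" of SE.
for name, b in sorted(zip(categories, coef[1:]), key=lambda t: -abs(t[1])):
    print(f"{name:16s} coefficient = {b:+.2f}")
```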

Our Perspective 


This report recognizes that DNFSB has significant challenges, but it contains zero surprises.  It’s not even news.  The same or similar ground was covered by a Dec. 2014 organizational study performed for the DNFSB, which we reviewed on Feb. 6, 2015.

Problems mentioned in the 2014 report include board dysfunctionality, communications, performance recognition, change management, frequent disruptive organizational changes, and the lack of management and leadership competence.  The 2014 report  included extensive discussion of possible organizational interventions and other corrective actions.

The NRC IG already knew change management was a serious challenge facing the DNFSB; it was mentioned in an Oct. 2014 IG report.***  That report was likely the impetus for this 2015 study.

The DNFSB has been in apparent disarray for over a year.  New members have been appointed to the Board this year, including a new chairman.  It remains to be seen whether they can address the internal challenges and, more importantly, provide meaningful recommendations to their single client, the U.S. Department of Energy.

Bottom line: This NRC IG consultant’s report adds little value to understanding the DNFSB’s organizational issues or developing effective corrective actions. 


*  Towers Watson, “DNFSB 2015 Culture and Climate Survey: Executive Overview of Key Findings” (Aug. 2015).  ADAMS ML15245A515.  Thanks to John Hockert for publicizing this report on the LinkedIn Nuclear Safety Culture forum.

**  Sustainable Engagement is defined as follows: “Assesses the level of DNFSB employees’ connection to the organization, marked by being proud to work at DNFSB, committing effort to achieve the goals (being engaged) having an environment that support productivity (being enabled) and maintaining personal well-being (feeling energized).” (p. 9)

***  H.T. Bell (NRC) to Chairman Winokur (DNFSB), “Inspector General’s Assessment of the Most Serious Management and Performance Challenges Facing the Defense Nuclear Facilities Safety Board,” DNFSB-OIG-15-A-01 (Oct. 1, 2014).  ADAMS ML14274A247.

Sunday, December 20, 2015

Fukushima and Volkswagen: Systemic Similarities and Observations for the U.S. Nuclear Industry

Recent New York Times articles* have described the activities, culture and context of Volkswagen, currently mired in scandal.  The series inspired a Yogi Berra moment: “It’s deja vu all over again.”  Let’s look at some of the circumstances that affected Fukushima and Volkswagen and see if they give us any additional insights into the risk profile of the U.S. commercial nuclear industry.

An Accommodating Regulator

The Japanese nuclear regulator did not provide effective oversight of Tokyo Electric Power Co.  One aspect of this was TEPCO’s relative power over the regulator because of TEPCO’s political influence at the national level.  This was a case of complete regulatory capture.

The German auto regulator doesn’t provide effective oversight either.  “[T]he regulatory agency for motor vehicles in Germany is deliberately starved for resources by political leaders eager to protect the country’s powerful automakers, . . .” (NYT 12-9-15)  This looks more like regulatory impotence than capture but the outcome is the same.

In the U.S., critics have accused the NRC of being captured by industry.  We disagree but have noted that the regulator and licensees working together over long periods of time, even across the table, can lead to familiarity, common language and indiscernible mutual adjustments. 

Deference to Senior Managers

Traditionally in Japan, people in senior positions are treated as if they have the right answers, no matter what the facts facing a lower-ranking employee might suggest.  Members of society go along to get along.  As we said in an Aug. 7, 2014 post, “harmony was so valued that no one complained that Fukushima site protection was clearly inadequate and essential emergency equipment was exposed to grave hazards.” 

The Volkswagen culture was different but had the same effect.  The CEO managed through fear.  At VW, “subordinates were fearful of contradicting their superiors and were afraid to admit failure.”  A former CEO “was known for publicly dressing down subordinates . . .”  (NYT 12-13-15)

In the U.S., INPO’s single-minded focus on the unrivaled importance of leadership can, if practiced by the wrong kind of people, lead to the suppression of dissent, of facts that contradict the party line, and of the questioning attitude that is vital to maintaining safe facilities.

Companies Not Responsible to All Legitimate Stakeholders

In the Fukushima plant design, TEPCO gave short shrift to local communities, their citizens, governments and first responders, ultimately exposing them to profound hazards.  TEPCO’s behavior also impacted the international nuclear power community, where any significant incident at one operator is a problem for them all.

Volkswagen’s isolation from public responsibilities is facilitated by its structure.  Only 12% of the company is held by independent shareholders.  As at other large German companies, labor unions hold half the seats on VW’s board.  Two more seats are held by the regional government (a minority owner), which in practice cannot vote against labor.  So the union effectively controls the board. (NYT 12-13-15)

We have long complained about the obsessive secrecy practiced by the U.S. nuclear industry, particularly in its relations with its self-regulator, INPO.  It is not a recipe for building trust and confidence with the public, an affected and legitimate stakeholder.

Our Perspective

The TEPCO safety culture (SC) was unacceptably weak.  And its management culture simply ignored inconvenient facts.

Volkswagen’s culture has valued technical competence and ambition, and apparently has lower regard for regulations (esp. foreign, i.e., U.S. ones) and other rules of the game.

We are not saying the gross problems of either company infect the U.S. nuclear industry.  But the potential is there.  The industry has experienced events that suggest the presence of human, technical and systemic shortcomings.  For a general illustration of inadequate management effectiveness, look at Entergy’s series of SC problems.  For a specific case, remember Davis-Besse, where favoring production over safety took the plant to the brink of a significant failure.  Caveat nuclear.


*  See, for example: J. Ewing and G. Bowley, “The Engineering of Volkswagen’s Aggressive Ambition,” New York Times (Dec. 13, 2015).  J. Ewing, “Volkswagen Terms One Emissions Problem Smaller Than Expected,” New York Times (Dec. 9, 2015).

Tuesday, November 17, 2015

Foolproof by Greg Ip: Insights for the Nuclear Industry

This book* is primarily about systemic lessons learned from the 2008 U.S. financial crisis and, to a lesser extent, various European euro crises. Some of the author’s observations also apply to the nuclear industry.

Ip’s overarching thesis is that steps intended to protect a system, e.g., a national or global financial system, may over time lead to over-confidence, increased risk-taking and eventual instability.  Stability breeds complacency.**  As we know, a well-functioning system creates a series of successful outcomes, a line of dynamic non-events.  But that dynamic includes gradual changes to the system, e.g., innovation or adaptation to the environment, that may increase systemic risk and result in a new crisis or unintended consequences.

He sees evidence for his thesis in other fields.  For automobiles, the implementation of anti-lock braking systems leads some drivers to operate more recklessly.  In football, better helmets mean increased use of the head as a weapon and more concussions and spinal injuries.  For forest fires, a century of fire suppression has led to massive fuel build-ups and more people moving into forested areas.  For flood control, building more and higher levees has led to increased economic development in historically flood-prone areas.  As a result, both fires and floods can cause huge financial losses when they eventually occur.  In all cases, well-intentioned system “improvements” lead to increased confidence (aka loss of fear) and risk-taking, both obvious and implicit.  In short, “If the surroundings seem safer, the systems tolerate more risk.” (p. 18)

Ip uses the nuclear industry to illustrate how society can create larger issues elsewhere in a system when it implements local responses to a perceived problem.  Closing down nuclear plants after an accident (e.g., Fukushima) or because of green politics does not remove the demand for electric energy.  To the extent the demand shortfall is made up with hydrocarbons, additional people will suffer from the mining, drilling, processing, etc., and the climate will be made worse.

He cites the aviation industry as an example of a system where near-misses are documented and widely shared in an effort to improve overall system safety.  He notes that the few fatal accidents that do occur in commercial aviation both serve as lessons learned and keep those responsible for operating the system (pilots and controllers) on their toes.

He also makes an observation about aviation that could be applied to the nuclear industry: “It is almost impossible to improve a system that never has an accident. . . . regulators are unlikely to know whether anything they propose now will have provable benefits; it also means that accidents will increasingly be of the truly mysterious, unimaginable variety . . .” (p. 252)

Speaking of finance, Ip says “A huge part of what the financial system does is try to create the fact—and at times the illusion—of safety.  Usually, it succeeds; . . . On those rare occasions when it fails, the result is panic.” (p. 86)  Could this description also apply to the nuclear industry? 

Our Perspective

Ip’s search for systemic, dynamic factors to explain the financial crisis echoes the type of analysis we’ve been promoting for years.  Like us, he recognizes that people hold different world views of the same system.  Ip contrasts the engineers and the ecologists:  “Engineers satisfy our desire for control, . . . civilization’s needs to act, to do something, . . .” (p. 278)  Ecologists believe “it’s the nature of risk to find the vulnerabilities we missed, to hit when least expected, to exploit the very trust in safety we so assiduously cultivate with all our protection . . .” (p. 279)

Ip’s treatment of the nuclear industry, while positive, is incomplete and somewhat simplistic.  It’s really just an example, not an industry analysis.  His argument that shutting down nuclear plants exacerbates climate harm could have come from the NEI playbook.  He ignores the impact of renewables, efficiency and conservation.

He doesn’t discuss the nuclear industry’s penchant for secrecy, but we have and believe it feeds the public’s uncertainty about the industry's safety.  As Ip notes, “People who crave certainty cannot tolerate even a slight increase in uncertainty, and so they flee not just the bad banks, the bad paper, and the bad country, but everything that resembles them, . . .” (p. 261)  If a system that is assumed [or promoted] to be safe has a crisis, even a local one, the result is often panic. (p. 62)

He mentions high reliability organizations (HROs) and their focus on avoiding catastrophe and “being a little bit scared all of the time.” (p. 242)  He does not mention that some of the same systemic factors present in the financial system are at work in the world of HROs, including exposure to the corrosive effects of complacency and system drift. (p. 242)

Bottom line: Read Foolproof if you have an interest in an intelligible assessment of the financial crisis.  And remember: “Fear serves a purpose: it keeps us out of trouble.” (p. 19)  “. . . but it can keep us from taking risks that could make us better off.” (p. 159)


*  G. Ip, Foolproof (New York: Little, Brown, 2015).  Ip is a finance and economics journalist, currently with the Wall Street Journal and previously with The Economist.

**  He quotes a great quip from Larry Summers: “Complacency is a self-denying prophecy.”  Ip adds, “If everyone worried about complacency, no one would succumb to it.” (p. 263)

Monday, November 2, 2015

Cultural Tidbits from McKinsey

We spent a little time poking around the McKinsey* website looking for items that could be related to safety culture and found a couple.  They do not provide any major insights but they do spur us to think of some questions for you to ponder about your own organization.

One article discussed organizational redesign** and provided a list of recommended rules, including establishing metrics that show if success is being achieved.  Following is one such metric.

“One utility business decided that the key metric for its efficiency-driven redesign was the cost of management labor as a proportion of total expenditures on labor.  Early on, the company realized that the root cause of its slow decision-making culture and high cost structure had been the combination of excessive management layers and small spans of control.  Reviewing the measurement across business units and at the enterprise level became a key agenda item at monthly leadership meetings.” (p. 107)

What percent of total labor dollars does your organization spend on “management”?  Could your organization’s decision making be speeded up without sacrificing quality or safety?  Would your organization rather have the “right” decision (even if it takes a long time to develop) or no decision at all rather than risk announcing a “wrong” one?
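As a rough illustration (not from the McKinsey article; the roster and numbers below are invented), here is a minimal sketch of how easy the utility’s metric is to compute once roles and labor costs are known, along with a crude span-of-control figure.

```python
# Hypothetical roster: (role, annual labor cost, is_manager, direct_reports).
employees = [
    ("plant manager",         220_000, True,  6),
    ("operations manager",    180_000, True,  8),
    ("shift supervisor",      140_000, True,  7),
    ("control room operator", 120_000, False, 0),
    ("mechanic",               95_000, False, 0),
    ("admin assistant",        60_000, False, 0),
]

total_cost = sum(cost for _, cost, _, _ in employees)
mgmt_cost = sum(cost for _, cost, is_mgr, _ in employees if is_mgr)
spans = [reports for _, _, is_mgr, reports in employees if is_mgr]

print(f"Management share of labor cost: {mgmt_cost / total_cost:.1%}")
print(f"Average span of control: {sum(spans) / len(spans):.1f}")
```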

A second article discussed management actions to create a longer view among employees,*** including clearly identifying and prioritizing organizational values.  Following is an example of an action related to values.

“The pilots of one Middle East–based airline frequently write incident reports that candidly raise concerns, questions, and observations about potential hazards.  The reports are anonymous and circulate internally, so that pilots can learn from one another and improve—say, in handling a particularly tricky approach at an airport or dealing with a safety procedure.  The resulting conversations reinforce the safety culture of this airline and the high value it places on collaboration.  Moreover, by making sure that the reporting structures aren’t punitive, the airline’s executives get better information and can focus their attention where it’s most needed.”

How do your operators and other professionals share experiences and learning opportunities among themselves at your site?  How about throughout your fleet?  Does documenting anything that might be construed as weakness require management review or approval?  Is management (or the overall organization) so fearful of such information being seen by regulators or the public, or discovered by lawyers, that the information is effectively suppressed?  Is your organization paranoid or just applying good business sense?  Do you have a culture that would pass muster as “just”?

Our Perspective


Useful nuggets on management or culture are where you find them.  Others’ experiences can stimulate questions; the answers can help you better understand local organizational phenomena, align your efforts with the company’s needs and build your professional career.


*  McKinsey & Company is a worldwide management consulting firm.


**  S. Aronowitz et al., “Getting organizational redesign right,” McKinsey Quarterly, no. 3 (2015), pp. 99-109.

***  T. Gibbs et al., “Encouraging your people to take the long view,” McKinsey Quarterly (Sept. 2012).