
Friday, October 6, 2023

A Straightforward Recipe for Changing Culture

Center for Open Science
Source: COS website

We recently came across a clear, easily communicated road map for implementing cultural change.*  We’ll provide some background information on the author’s motivation for developing the road map, a summary of it, and our perspective on it.

The author, Brian Nosek, is executive director of the Center for Open Science (COS).  The mission of COS is to increase the openness, integrity, and reproducibility of scientific research.  Specifically, they propose that researchers publish the initial description of their studies so that original plans can be compared with actual results.  In addition, researchers should “share the materials, protocols, and data that they produced in the research so that others could confirm, challenge, extend, or reuse the work.”  Overall, the COS proposes a major change from the way much research is presently conducted.

Currently, a lot of research is done in private, i.e., more or less in secret, usually with the objective of getting results published, preferably in a prestigious journal.  Frequent publishing is fundamental to getting and keeping a job, being promoted, and obtaining future funding for more research, in other words, having a successful career.  Researchers know that publishers generally prefer findings that are novel, positive (e.g., a treatment is effective), and tidy (the evidence fits together).

Getting from the present to the future requires a significant change in the culture of scientific research.  Nosek describes the steps to implement such change using a pyramid, shown below, as his visual model.  Similar to Abraham Maslow’s Hierarchy of Needs, a higher level of the pyramid can only be achieved if the lower levels are adequately satisfied.

Source: "Strategy for Culture Change"

Each level represents a different step for changing a culture:

•    Infrastructure refers to an open source database where researchers can register their projects, share their data, and show their work.
•    The User Interface of the infrastructure must be easy to use and compatible with researchers' existing workflows.
•    New research Communities will be built around new norms (e.g., openness and sharing) and behavior, supported and publicized by the infrastructure.
•    Incentives refer to redesigned reward and recognition systems (e.g., research funding and prizes, and institutional hiring and promotion schemes) that motivate desired behaviors.
•    Public and private Policy changes codify and normalize the new system, i.e., specify the new requirements for conducting research.

Our Perspective

As long-time consultants to senior managers, we applaud Nosek’s change model.  It is straightforward and adequately complete, and can be easily visualized.  We used to spend a lot of time distilling complicated situations into simple graphics that communicated strategically important points.

We also totally support his call to change the reward system to motivate the new, desirable behaviors.  We have been promoting this viewpoint for years with respect to safety culture: If an organization or other entity values safety and wants safe activities and outcomes, then it should compensate its senior leadership accordingly, i.e., pay for safety performance, and stop promoting the nonsense that safety is intrinsic to the entity’s functioning and that leaders should provide it basically for free.

All that said, implementing major cultural change is not as simple as Nosek makes it sound.

First off, the status quo can have enormous sticking power.  Nosek acknowledges it is defined by strong norms, incentives, and policies.  Participants know the rules and how the system works; in particular, they know what they must do to obtain the rewards and recognition.  Open research is anathema to many researchers and their sponsors; this is especially true when a project is aimed at creating some kind of competitive advantage for the researcher or the institution.  Secrecy is also valued when researchers may (or do) come up with the “wrong answer” – findings that show a product is not effective or has dangerous side effects, or an entire industry’s functioning is hazardous for society.

Second, the research industry exists in a larger environment of social, political and legal factors.  Many elected officials, corporate and non-profit bosses, and other thought leaders may say they want and value a world of open research but in private, and in their actions, believe they are better served (and supported) by the existing regime.  The legal system in particular is set up to reinforce the current way of doing business, e.g., through patents.

Finally, systemic change means fiddling with the system dynamics, the physical and information flows, inter-component interfaces, and feedback loops that create system outcomes.  To the extent such outcomes are emergent properties, they are created by the functioning of the system itself and cannot be predicted by examining or adjusting separate system components.  Large-scale system change can be a minefield of unexpected or unintended consequences.

Bottom line: A clear model for change is essential but system redesigners need to tread carefully.  

*  B. Nosek, “Strategy for Culture Change,” blog post (June 11th, 2019).

Wednesday, February 2, 2022

A Massive Mental Model: Lessons from Principles for Dealing with the Changing World Order by Ray Dalio

At Safetymatters, we have emphasized several themes over the years, including the importance of developing complete and realistic mental models of systems, often large, complicated, socio-technical organizations, to facilitate their analysis.  A mental model includes the significant factors that comprise the system, their interrelationships, system dynamics (how the system functions over time), and system outputs and their associated metrics.

This post outlines an ambitious and grand mental model: the recurring historical arc exhibited by all the world’s great empires as described in Ray Dalio’s new book.* Dalio examined empires from ancient China through the 20th century United States.  He identified 18 factors that establish and demonstrate a great society’s rise and fall: 3 “Big Cycles,” 8 different types of power an empire can exhibit, and 7 other determinants.

Three Big Cycles 

The big cycles have a natural progression and are influenced by human innovation, technological development, and acts of nature.  They occur over an empire’s 250-year lifetime of emergence, rise, topping out, decline, and replacement by a new dominant power.

The financial cycle initially supports prosperity but debt builds over time, then governments accommodate it by printing more money,** which eventually leads to a currency devaluation, debt restructuring (including defaults), and the cycle starts over.  These cycles typically last about 50 to 100 years, so they can occur repeatedly over an empire’s lifetime.

The political cycle starts with a new order and leadership, then resource allocation systems are built, productivity and prosperity grow, but lead to excessive spending and widening wealth gaps, then bad financial conditions (e.g., depressions), civil war or revolution, and the cycle starts over.

The international cycle is dominated by raw power dynamics.  Empires build power and, over time, have conflicts with other countries over trade, technology, geopolitics, and finances.  Some conflicts lead to wars.  Eventually, the competition becomes too costly, the empire weakens, and the cycle starts over.

Dimensions and measures of power

An empire can develop and exercise power in many ways; these are manifestations and measures of the empire’s competitive advantages relative to other countries.  The 8 areas are education, cost competitiveness, innovation and technology, economic output, share of world trade, military strength, financial center strength, and reserve currency status.

Other determinants

These include natural attributes and events, internal financial/political/legal practices, and measures of social success and satisfaction.  Specific dimensions are geology, resource allocation efficiency, acts of nature, infrastructure and investment, character/civility/determination, governance/rule of law, gaps in wealth, opportunity and cultural values.

The 18 factors interact, typically reinforcing one another, with some leading others, e.g., a society must establish a strong education base to support innovation and technology development.  Existing conditions and determinants propel changes that create new conditions and determinants.

System dynamics

Evolution is the macro driving force that creates the system dynamic over time.  In Dalio’s view “Evolution is the biggest and only permanent force in the universe . . .” (p. 27)  He also considers other factors that shape an empire’s performance.  The most important of these are self-interest, the drive for wealth and power, the ability to learn from history, multi-generational differences, time frames for decision making, and human inventiveness.  Others include culture, leadership competence, and class relationships.  Each of these factors can wax and/or wane over the course of an empire’s lifetime, leading to changes in system performance.

Dalio uses his model to describe (and share) his version of the economic-political history of the world, and the never-ending struggles of civilizations over the accumulation and distribution of wealth and power.  Importantly, he also uses it to inform his worldwide investment strategies.  His archetype models are converted into algorithms to monitor conditions and inform investment decisions.  He believes all financial markets are driven by growth, inflation, risk premiums (e.g., to compensate for the risk of devaluation), and discount rates.

Our Perspective

Dalio’s model is ambitious, extensive, and complicated.  We offer it up as an extreme example of mental modeling, i.e., identifying all the important factors in a system of interest and defining how they work together to produce something.  Your scope of interest may be more limited – a power plant, a hospital, a major corporation – but the concept is the same.

Dalio is the billionaire founder of hedge fund Bridgewater Associates.  He has no shortage of ego or self-confidence.  He name-drops prominent politicians and thinkers from around the world to add weight to his beliefs.  We reviewed his 2017 book Principles on April 17, 2018 to show an example of a hard-nosed, high-performance business culture.

He is basically a deterministic thinker who views the world as a large, complex machine.  His modeling emphasizes cause-effect relationships that evolve and repeat over time.  He believes a perfect model would perfectly forecast the future, so we assume he views the probabilistic events that occur at network branching nodes as consequences of an incomplete, i.e., imperfect, model.  In contrast, we believe that some paths are created by events that are essentially probabilistic (e.g., “surprising acts of nature”) or the result of human choices.  We agree that human adaptation, learning, and inventiveness are keys to productivity improvements and social progress, but we don’t think they can be completely described in mechanical cause-effect terms.  Some system conditions are emergent, i.e., the consequence of a system’s functioning, and other things occur simply by chance.

This book is over 500 pages, full of data and tables.  Individual chapters detail the history of the Dutch, British, American, and Chinese empires over the last 500 years.  The book has no index so referring back to specific topics is challenging. Dalio is not a scholar and gives scant or no credit to thinkers who used some of the same archetypes long before him.

We offer no opinion on the accuracy or completeness of Dalio’s version of world history, or his prognostications about the future, especially U.S.-China relations.

Bottom line: This is an extensive model of world history, full of data; the analyses of the U.S. and China*** are worth reading.


*  R. Dalio, Principles for Dealing with the Changing World Order (New York: Avid Reader Press) 2021.

**  If the new money and credit goes into economic productivity, it can be good for the society.  But the new supply of money can also cheapen it, i.e., drive its value down, reducing the desire of people to hold it and pushing up asset prices.

***  Dalio summarizes the Chinese political-financial model as “Confucian values with capitalist practices . . .” (p. 364)

Tuesday, February 2, 2021

Organizational Change and System Dynamics Insights from The Tipping Point by Malcolm Gladwell

The Tipping Point* is a 2002 book by Malcolm Gladwell (who also wrote Blink) that uses the metaphor of a viral epidemic to explain how some phenomenon, e.g., a product**, an idea, or a social norm, can suddenly reach a critical mass and propagate rapidly through society.  Following is a summary of his key concepts.  Some of his ideas can inform strategies for implementing organizational change, especially cultural change, and reflect attributes of system dynamics that we have promoted on Safetymatters.

In brief, epidemics spread when they have the right sort of people to transmit the infectious agent, the agent itself has an attribute of stickiness, and the environment supports the agent and facilitates transmission. 


An epidemic thrives on three different types of people: people who connect with lots of other people, people who learn about a new product or idea and are driven to tell others, and persuasive people who sell the idea to others.  All these messengers drive contagiousness, although not all three types are required for every kind of epidemic.


A virus needs to attach itself to a host; a new product promotion needs to be memorable, i.e., stick in people’s minds and spur them to action – for example, Wendy’s “Where’s the beef?” campaign or the old “Winston tastes good . . .” jingle.  Information about the new product or idea needs to be packaged in a way that makes it attractive and difficult to resist.


General and specific environmental characteristics can encourage or discourage the spread of a phenomenon.  For a general example in the social environment, consider the Broken Windows theory, which holds that intolerance of the smallest infractions can lead to overall reductions in crime rates.

At the more specific level, humans possess a set of tendencies that can be affected by the particular circumstances of their immediate environment.  For example, we are more likely to comply with someone in a uniform (a doctor, say, or a police officer) than a scruffy person in jeans.  If people believe there are many witnesses to a crime, it’s less likely that anyone will try to stop or report the criminal activity; individual responsibility is diffused to the point of inaction.      

Our Perspective

We will expand some of Gladwell’s notions to emphasize how they can be applied to create organizational changes, including cultural change.  In addition, we’ll discuss how the dynamics he describes square with some aspects of system dynamics we have promoted on Safetymatters.

Organizational change

Small close-knit groups have the potential to magnify the epidemic potential of a message or idea.  “Close-knit” means people know each other well and even store information with specific individuals (the subject matter experts) to create a kind of overall group memory.  These bonds of memory and peer pressure can facilitate the movement of new ideas into and around the group, affecting the group’s shared mental models of everything from the immediate task environment to the larger outside world.  Many small movements can create larger movements that manifest as new or modified group norms.

In a product market, diffusion moves from innovators to early adopters to the majority and finally the laggards.  A similar model of diffusion can be applied in a formal organization.  Organizational managers trying to implement cultural changes should consider this diffusion model when they are deciding who to appoint to initiate, promote, and promulgate new or different cultural values or behaviors.  Ideally, they should start with well-connected, respected people who buy into the new attributes, can explain them to others, and influence others to try the new behaviors.
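The innovator-to-laggard diffusion sequence can be sketched with a simple Bass-style adoption model.  This is our own minimal illustration, not anything from Gladwell's book; the function name and all coefficient values are invented assumptions.

```python
# Minimal Bass-style diffusion sketch: adoption spreads through a population
# via external influence ("innovators," coefficient p) and imitation of
# existing adopters (coefficient q).  All parameter values are illustrative.

def diffuse(population=100, p=0.03, q=0.4, steps=30):
    """Return the cumulative number of adopters after each time step."""
    adopters = 0.0
    history = []
    for _ in range(steps):
        remaining = population - adopters
        # New adopters: spontaneous innovation (p) plus imitation of the
        # current adopter share (q * adopters / population)
        new = (p + q * adopters / population) * remaining
        adopters += new
        history.append(adopters)
    return history

curve = diffuse()
# The curve is S-shaped: a slow start (innovators and early adopters), a
# rapid middle (the majority), and a flattening tail (the laggards).
```

Raising q, the imitation effect, steepens the middle of the curve, which is the mathematical counterpart of appointing well-connected, respected people to spread the new behaviors.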

System dynamics

This whole book is about how intrusions can disrupt an existing social system, for good or bad, and result in epidemic, i.e., nonlinear, effects.  This nonlinearity helps explain why systems can be operating more or less normally, then suddenly veer into failure.  Active management deliberately tries to create such changes to veer into success.  Just think about how social media has upset the loci of power in our society: elected leaders and experts now have larger megaphones but so does the mob.

That said, Gladwell presents a linear, cause-and-effect model for change.  He does not consider more complex system features such as feedback loops or deliberate attempts to modify, deflect, co-opt or counteract the novel input.  For example, a manager can try to establish new behaviors by creating a reinforcing loop of rewards and recognition in a small group, and then recreating it on an ever-larger scale.
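The reinforcing loop of rewards and recognition can be sketched in system-dynamics style.  The gain and decay values below are invented for illustration; the point is the loop structure, not the numbers.

```python
# A reinforcing loop sketch: rewards track the visible level of a desired
# behavior, rewards recruit more of the group into the behavior, and old
# habits erode it.  Gain and decay values are illustrative assumptions.

def reinforcing_loop(behavior=0.05, reward_gain=0.5, decay=0.1, steps=40):
    """Fraction of a group exhibiting the behavior (0..1) over time."""
    history = [behavior]
    for _ in range(steps):
        reward = reward_gain * behavior           # rewards follow behavior
        growth = reward * (1.0 - behavior)        # rewards recruit non-adopters
        behavior += growth - decay * behavior     # old habits erode the behavior
        behavior = max(0.0, min(1.0, behavior))
        history.append(behavior)
    return history

# With reward_gain > decay the loop is self-amplifying and the behavior
# spreads; with reward_gain < decay the identical structure damps back
# toward zero.
```

The same loop can then be recreated on an ever-larger scale, as described above, by treating the small group's end state as the seed level for the next, bigger group.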

Bottom line: This is easy reading with lots of interesting case studies and quotes from talking head PhDs.  The book comes across as a long magazine article. 


*  M. Gladwell, The Tipping Point (New York: Back Bay Books/Little, Brown and Co.) 2000 and 2002.

**  “Product” is used in its broadest sense; it can mean something physical like a washing machine, a political campaign, a celebrity wannabe, etc.

Monday, December 14, 2020

Implications of Randomness: Lessons from Nassim Taleb

Most of us know Nassim Nicholas Taleb from his bestseller The Black Swan. However, he wrote an earlier book, Fooled by Randomness*, in which he laid out one of his seminal propositions: a lot of things in life that we believe have identifiable, deterministic causes such as prescient decision making or exceptional skills, are actually the result of more random processes. Taleb focuses on financial markets but we believe his observations can refine our thinking about organizational decision making, mental models, and culture.

We'll begin with an example of how Taleb believes we misperceive reality. Consider a group of stockbrokers with successful 5-year track records. Most of us will assume they must be unusually skilled. However, we fail to consider how many other people started out as stockbrokers 5 years ago and fell by the wayside because of poor performance. Even if all the stockbrokers were less skilled than a simple coin flipper, some would still be successful over a 5-year period. The survivors are the result of an essentially random process and their track records mean very little going forward.
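Taleb's point is easy to verify numerically.  The simulation below is our sketch, with an assumed cohort size; it gives every broker a fair coin, no skill at all, and still produces hundreds of five-year "stars."

```python
# Coin-flip brokers: a cohort where beating the market in any year is pure
# 50/50 chance.  A "survivor" has an unbroken five-year winning record.
import random

random.seed(42)  # fixed seed so the run is repeatable

def survivors(cohort=10_000, years=5, p_win=0.5):
    """Count brokers whose every year was a winning year, by chance alone."""
    count = 0
    for _ in range(cohort):
        if all(random.random() < p_win for _ in range(years)):
            count += 1
    return count

# Expected survivors: cohort * p_win**years = 10,000 * 0.5**5 = 312.5.
# Their five-year track records look impressive yet predict nothing.
```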

Taleb ascribes our failure to correctly see things (our inadequate mental models) to several biases. First is the hindsight bias where the past is always seen as deterministic and feeds our willingness to backfit theories or models to experience after it occurs. Causality can be very complex but we prefer to simplify it. Second, because of survivorship bias, we see and consider only the current survivors from an initial cohort; the losers do not show up in our assessment of the probability of success going forward. Our attribution bias tells us that successes are due to skills, and failures to randomness.

Taleb describes other factors that prevent us from being the rational thinkers postulated by classical economics or Cartesian philosophy. One set of factors arises from how our brains are hardwired and another set from the way we incorrectly process data presented to us.

The brain-wiring issues include those described by Daniel Kahneman, who shows how we use and rely on heuristics (mental shortcuts that we invoke automatically) to make day-to-day decisions. Thus, we make many decisions without really thinking or applying reason, and we are subject to other built-in biases, including our overconfidence in small samples and the role of emotions in driving our decisions. We reviewed Kahneman's work at length in our Dec. 18, 2013 post. Taleb notes that we also have a hard time recognizing and dealing with risk. Risk detection and risk avoidance are mediated in the emotional part of the brain, not the thinking part, so rational thinking has little to do with risk avoidance.

We also make errors when handling data in a more formal setting. For example, we ignore the mathematical truth that initial sample sizes matter greatly, much more than the sample size as a percentage of the overall population. We also ignore regression to the mean, which says that absent systemic changes, performance will eventually return to its average value. More perniciously, ignorant or unethical researchers will direct their computers to look for any significant relationship in a data set, a practice that can often produce a spurious relationship because all the individual tests have their own error rates. “Data snoops” will define some rule, then go looking for data that supports it. Why are researchers inclined to fudge their analyses? Because research with no significant result does not get published.
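The "data snooping" problem is easy to demonstrate.  The sketch below is our illustration, using a rough z-style comparison: it runs many tests on pure noise, so every "significant" result is spurious by construction, yet about 5% of them clear the bar anyway.

```python
# Data snooping on pure noise: run many two-sample comparisons at the 5%
# significance level and count how many look "significant" by chance alone.
import random

random.seed(0)

def snoop(n_tests=100, n=30, z_crit=1.96):
    """Count spurious 'significant' differences between pure-noise samples."""
    hits = 0
    for _ in range(n_tests):
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        diff = sum(a) / n - sum(b) / n
        se = (2 / n) ** 0.5           # std. error of the difference in means
        if abs(diff / se) > z_crit:   # ~5% false-positive rate per test
            hits += 1
    return hits

# Roughly 5 of every 100 pure-noise comparisons will clear the bar; a
# determined snoop who runs enough tests will always "find" something.
```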

Our Perspective

We'll start with the obvious: Taleb has a large ego and is not shy about calling out people with whom he disagrees or does not respect. That said, his observations have useful implications for how we conceptualize the socio-technical systems in which we operate, i.e., our mental models, and present specific challenges for the culture of our organizations.

In our view, the three driving functions for any system's performance over time are determinism (cause and effect), choice (decision making), and probability. At heart, Taleb's world view is that the world functions more probabilistically than most people realize. A method he employs to illustrate alternative futures is Monte Carlo simulation, which we used to forecast nuclear power plant performance back in the 1990s. We wanted plant operators to see that certain low-probability events, i.e., Black Swans**, could occur in spite of the best efforts to eliminate them via plant design, improved equipment and procedures, and other means. Some unfortunate outcomes could occur because they were baked into the system from the get-go and eventually manifested. This is what Charles Perrow meant by “normal accidents” where normal system performance excursions go beyond system boundaries. For more on Perrow, see our Aug. 29, 2013 post.
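As a hedged sketch of that kind of exercise, the code below simulates many operating lifetimes of a hypothetical plant where a serious upset has a small, constant per-year probability.  The 1% figure is invented for illustration and drawn from no real plant data.

```python
# Monte Carlo sketch of a rare event "baked into" a system: each operating
# year carries the same small chance of a serious upset.  The per-year
# probability (1%) is an invented, illustrative number.
import random

random.seed(7)

def years_until_event(p_event=0.01):
    """Simulated operating years before a low-probability event occurs."""
    years = 0
    while random.random() >= p_event:
        years += 1
    return years

trials = [years_until_event() for _ in range(5000)]
mean_wait = sum(trials) / len(trials)   # geometric distribution: mean ~ 99
early = sum(1 for t in trials if t < 30) / len(trials)
# About a quarter of runs (1 - 0.99**30, roughly 0.26) see the event within
# 30 years, despite a mean time between events of roughly a century.
```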

Of course, the probability distribution of system performance may not be stationary over time. In the most extreme case, when all system attributes change, it's called regime change. In addition, system performance may be nonlinear, where small inputs may lead to a disproportionate response, or poor performance can build slowly and suddenly cascade into failure. For some systems, no matter how specifically they are described, there will inherently be some possibility of errors, e.g., consider healthcare tests and diagnoses where both false positives and false negatives can be non-trivial occurrences.
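The healthcare-test point can be made concrete with Bayes' rule.  The numbers below are invented for illustration: a condition affecting 1% of patients, a test with 95% sensitivity, and a 5% false-positive rate.

```python
# Positive predictive value via Bayes' rule: even a "good" test applied to
# a rare condition yields mostly false alarms.  All rates are illustrative.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(condition | positive test result)."""
    true_pos = prevalence * sensitivity                 # sick and flagged
    false_pos = (1 - prevalence) * false_positive_rate  # healthy but flagged
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value(0.01, 0.95, 0.05)
# ppv is about 0.16: roughly five of every six positive results are false.
```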

What does this mean for organizational culture? For starters, the organization must acknowledge that many of its members are inherently somewhat irrational. It can try to force greater rationality on its members through policies, procedures, and practices, instilled by training and enforced by supervision, but there will always be leaks. A better approach would be to develop defense in depth designs, error-tolerant sub-systems with error correction capabilities, and a “just culture” that recognizes that honest mistakes will occur.

Bottom line: You should think awhile about how many aspects of your work environment have probabilistic attributes.


* N.N. Taleb, Fooled by Randomness, 2nd ed. (New York: Random House) 2004.

** Black swans are not always bad. For example, an actor can have one breakthrough role that leads to fame and fortune; far more actors will always be waiting tables and parking cars.

Friday, January 6, 2017

Reflections on Nuclear Safety Culture for the New Year

The start of a new year is an opportunity to take stock of the current situation in the U.S. nuclear industry and reiterate what we believe with respect to nuclear safety culture (NSC).

For us, the big news at the end of 2016 was Entergy’s announcement that Palisades will be shutting down on Oct. 1, 2018.*  Palisades has been our poster child for a couple of things: (1) Entergy’s unwillingness or inability to keep its nose clean on NSC issues and (2) the NRC’s inscrutable decision making on when the plant’s NSC was either unsatisfactory or apparently “good enough.”

We will have to find someone else to pick on but don’t worry, there’s always some new issue popping up in NSC space.  Perhaps we will go to France and focus on the current AREVA and Électricité de France imbroglio, which was cogently summarized in a Power magazine editorial: “At the heart of France’s nuclear crisis are two problems.  One concerns the carbon content of critical steel parts . . . manufactured or supplied by AREVA . . . The second problem concerns forged, falsified, or incomplete quality control reports about the critical components themselves.”**  Anytime the adjectives “forged” or “falsified” appear alongside nuclear records, the NSC police will soon be on the scene.

Why do NSC issues keep arising in the nuclear industry?  If NSC is so important, why do organizations still fail to fix known problems or create new problems for themselves?  One possible answer is that such issues are the occasional result of the natural functioning of a low-tolerance, complex socio-technical system.  In other words, performance may drift out of bounds in the normal course of events.  We may not be able to predict where such issues will arise (although the missed warning signals will be obvious in retrospect) but we cannot reasonably expect they can be permanently eliminated from the system.  In this view, an NSC can be acceptably strong but not 100% effective.

If they are intellectually honest, most NSC practitioners and “experts” will admit this is the implicit mental model they utilize, even though they continue to espouse the dogma that more engineering, management, leadership, oversight, training and sanctions can and will create an actual NSC that matches some ideal NSC.  But we’ve known for years what an ideal NSC should look like, i.e., its attributes, and how responsibilities for creating and maintaining such a culture should be spread across a nuclear organization.***  And we’re still playing Whac-A-Mole.

At Safetymatters, we have promoted a systems view of NSC, a view that we believe provides a more nuanced and realistic view of how NSC actually works.  Where does NSC live in our nuclear socio-technical system?  Well, it doesn’t “live” anywhere.  NSC is, to some degree, an emergent property of the system, i.e., it is visible because of the ongoing functioning of other system components.  But that does not mean that NSC is only an effect or consequence.  NSC is both a consequence and a cause of system behavior.  NSC is a cause through the way it affects the processes that create hard artifacts, such as management decisions or the corrective action program (CAP), softer artifacts like the leadership exhibited throughout an organization, and squishy organizational attributes like the quality of hierarchical and interpersonal trust that permeates the organization like an ether or miasma. 

Interrelationships and feedback loops tie NSC to other organizational variables.  For example, if an organization fixes its problems, its NSC will appear stronger and the perception of a strong NSC will influence other organizational dynamics.  This particular feedback loop is generally reinforcing but it’s not some superpower, as can be seen in a couple of problems nuclear organizations may face: 

Why is a CAP ineffective?  The NSC establishes the boundaries between the desirable, acceptable, tolerable and unacceptable in terms of problem recognition, analysis and resolution.  But the strongest SC cannot compensate for inadequate resources from a plant owner, a systemic bias in favor of continued production****, a myopic focus on programmatic aspects (following the rules instead of searching for a true answer) or incompetence in plant staff. 

Why are plant records falsified?  An organization’s party line usually pledges that the staff will always be truthful with customers, regulators and each other.  The local culture, including its NSC, should reinforce that view.  But fear is always trying to slip in through the cracks—fear of angering the boss, fear of missing performance targets, fear of appearing weak or incompetent, or fear of endangering a plant’s future in an environment that includes the plant’s perceived enemies.  Fear can overcome even a strong NSC.

Our Perspective

NSC is real and complicated but it is not mysterious.  Most importantly, NSC is not some red herring that keeps us from seeing the true causes of underlying organizational performance problems.  Safetymatters will continue to offer you the information and insights you need to be more successful in your efforts to understand NSC and use it as a force for better performance in your organization.

Your organization will not increase its performance in the safety dimension if it continues to apply and reprocess the same thinking that the nuclear industry has been promoting for years.  NSC is not something that can be directly managed or even influenced independent of other organizational variables.  “Leadership” alone will not fix your organization’s problems.  You may protect your career by parroting the industry’s adages but you will not move the ball down the field without exercising some critical and independent thought.

We wish you a safe and prosperous 2017.

*  “Palisades Power Purchase Agreement to End Early,” Entergy press release (Dec. 8, 2016).

**  L. Buchsbaum, “France’s Nuclear Storm: Many Power Plants Down Due to Quality Concerns,” Power (Dec. 1, 2016).  Retrieved Jan. 4, 2017.

***  For example, take a look back at INSAG-4 and NUREG-1756 (which we reviewed on May 26, 2015).

****  We can call that the Nuclear Production Culture (NPC).

Thursday, March 17, 2016

IAEA Nuclear Safety Culture Conference

The International Atomic Energy Agency (IAEA) recently sponsored a week-long conference* to celebrate 30 years of interest and work in safety culture (SC).  By our reckoning, there were about 75 individual presentations in plenary sessions and smaller groups; dialog sessions with presenters and subject matter experts; speeches and panels; and over 30 posters.  It must have been quite a circus.

We cannot justly summarize the entire conference in this space but we can highlight material related to SC factors we’ve emphasized or people we’ve discussed on Safetymatters, or interesting items that merit your consideration.

Topics We Care About

A Systems Viewpoint

Given that the IAEA has promoted a systemic approach to safety, and that it was a major conference topic, it’s no surprise that many participants addressed it.  But we were still pleased to see over 30 presentations, posters and dialogues that included mention of systems, system dynamics, and systemic and/or holistic viewpoints or analyses.  Specific topics covered a broad range including complexity, coupling, Fukushima, the Interaction between Human, Technical and Organizational Factors (HTOF), error/incident analysis, regulator-licensee relationships, SC assessment, situational adaptability and system dynamics.

Role of Leadership

Leadership and Management for Safety was another major conference topic.  Leadership in a substantive context was mentioned in about 20 presentations and posters, usually as one of multiple success factors in creating and maintaining a strong SC.  Topics included leader/leadership commitment, skills, specific competences, attributes, obligations and responsibilities; leadership’s general importance, relationship to performance and role in accidents; and the importance of leadership in nuclear regulatory agencies. 

Decision Making

This was mentioned about 10 times, with multiple discussions of decisions made during the early stages of the Fukushima disaster.  Other presenters described how specific techniques, such as Probabilistic Risk Assessment and Human Reliability Analysis, or general approaches, such as risk control and risk-informed thinking, can support decision making, which was seen as an important component of SC.

Compensation and Rewards

We’ve always been clear: If SC and safety performance are important, then people from top executives to individual workers should be rewarded (by which we mean paid money) for doing them well.  But, as usual, there was zero mention of compensation in the conference materials.  Rewards were mentioned a few times, mostly by regulators, but with no hint they were referring to monetary rewards.  Overall, a continuing disappointment.

Participants Who Have Been Featured in Safetymatters

Over the years we have presented the work of many conference participants to Safetymatters readers.  Following are some familiar names that caught our eye.  (Page numbers refer to the conference “Programme and Abstracts” document.)

We have to begin with Edgar Schein, the architect of the cultural construct used by almost everyone in the SC space.  His discussion paper (p. 47) argued that the SC components in a nuclear plant depend on whether the executives actually create the climate of trust and openness that the other attributes hinge on.  We’ve referred to Schein so often he has his own label on Safetymatters.

Mats Alvesson’s presentation (p. 46) discussed “hyper culture,” the vague and idealistic terms executives often promote that look good in policy documents but seldom work well in practice.  This presentation is consistent with his article on Functional Stupidity, which we reviewed on Feb. 23, 2016.

Sonja Haber’s paper (p. 55) outlined a road map for the nuclear community to move forward in the way it thinks about SC.  Dr. Haber has conducted many SC assessments for the Department of Energy that we have reviewed on Safetymatters. 

Ken Koves of INPO led or participated in three dialogue sessions.  He was a principal researcher in a project that correlated SC survey data with safety performance measures which we reviewed on Oct. 22, 2010 and Oct. 5, 2014.

Najmedin Meshkati discussed (p. 60) how organizations react when their control systems start to run behind environmental demands using Fukushima as an illustrative case.  His presentation draws on an article he coauthored comparing the cultures at TEPCO’s Fukushima Daiichi plant and Tohoku Electric’s Onagawa plant which we reviewed on Mar. 19, 2014.

Jean-Marie Rousseau co-authored a paper (p. 139) on the transfer of lessons learned from accidents in one industry to another industry.  We reviewed his paper on the effects of competitive pressures on nuclear safety management issues on May 8, 2013.

Carlo Rusconi discussed (p. 167) how the over-specialization of knowledge required by decision makers can result in pools of knowledge rather than a stream accessible to all members of an organization.  A systemic approach to training can address this issue.  We reviewed Rusconi’s earlier papers on training on June 26, 2013 and Jan. 9, 2014.

Richard Taylor’s presentation (p. 68) covered major event precursors and organizations’ failure to learn from previous events.  We reviewed his keynote address at a previous IAEA conference where he discussed using system dynamics to model organizational archetypes on July 31, 2012.

Madalina Tronea talked about (p. 114) the active oversight of nuclear plant SC by the National Commission for Nuclear Activities Control (CNCAN), the Romanian regulatory authority.  CNCAN has developed its own model of organizational culture and uses multiple methods to collect information for SC assessment.  We reviewed her initial evaluation guidelines on Mar. 23, 2012.

Our Perspective

Many of the presentations were program descriptions or status reports related to the presenter’s employer, usually a utility or regulatory agency.  Fukushima was analyzed or mentioned in 40 different papers or posters.  Overall, there were relatively few efforts to promote new ideas, insights or information.  Having said that, following are some materials you should consider reviewing.

From the conference participants mentioned above, Haber’s abstract (p. 55) and Rusconi’s abstract (p. 167) are worth reading.  Taylor’s abstract (p. 68) and slides are also worth reviewing.  He advocates using system dynamics to analyze complicated issues like the effectiveness of organizational learning and how events can percolate through a supply chain.
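To make the system dynamics idea concrete, here is a minimal stock-and-flow sketch of organizational learning.  This is our own illustration, not Taylor’s actual model: the variable names, parameter values, and the single-stock structure are all hypothetical, chosen only to show how a system dynamics view frames learning as an inflow of lessons competing against an outflow from forgetting and staff turnover.

```python
# A minimal system dynamics sketch (illustrative only; not Taylor's model).
# One stock, "organizational safety knowledge," is fed by lessons learned
# from analyzed events and drained by forgetting/turnover.  All parameters
# are hypothetical.

def simulate(years=20, dt=0.25, knowledge=50.0,
             event_rate=2.0,         # events per year yielding lessons
             lessons_per_event=1.5,  # knowledge gained per analyzed event
             decay_rate=0.10):       # fraction of knowledge lost per year
    """Euler-integrate the stock over time; return its trajectory."""
    trajectory = [knowledge]
    for _ in range(int(years / dt)):
        inflow = event_rate * lessons_per_event   # learning from events
        outflow = decay_rate * knowledge          # forgetting / turnover
        knowledge += (inflow - outflow) * dt
        trajectory.append(knowledge)
    return trajectory

traj = simulate()
```

With these made-up numbers the stock settles toward its equilibrium (inflow divided by decay rate, here 30), so an organization that stops analyzing events sees its knowledge erode toward a lower level no matter how well it once learned.  Real system dynamics models of learning, like those Taylor describes, chain many such stocks and feedback loops together.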

Benoît Bernard described the Belgian regulator’s five years of experience assessing nuclear plant SC.  Note that lessons learned are described in his abstract (p. 113) but are somewhat buried in his presentation slides.

If you’re interested in a systems view of SC, check out Francisco de Lemos’ presentation (p. 63), which gives a concise depiction of a complex system plus a Systems-Theoretic Accident Model and Processes (STAMP) analysis.  His paper is based on Nancy Leveson’s work, which we reviewed on Nov. 11, 2013.

Diana Engström argued that nuclear personnel can put more faith in reported numbers than is justified by the underlying information, e.g., CAP trending data, and thus actually add risk to the overall system.  We’d call this practice an example of functional stupidity, although she doesn’t use that term in her provocative paper.  Both her abstract (p. 126) and slides are worth reviewing.

Jean Paries gave a talk on the need for resilience in the management of nuclear operations.  The abstract (p. 228) is clear and concise; there is additional information in his slides but they are a bit messy.

And that’s it for this installment.  Be safe.  Please don’t drink and text.

*  International Atomic Energy Agency, International Conference on Human and Organizational Aspects of Assuring Nuclear Safety: Exploring 30 years of Safety Culture (Feb. 22–26, 2016).  This page shows the published conference materials.  Thanks to Madalina Tronea for publicizing them.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.