Monday, August 17, 2009
Safety Culture Assessment
One question that frequently comes to mind is: can safety culture be separated from the manifestation of culture in the specific actions and decisions taken by an organization? For example, if an organization makes decisions that are clearly at odds with “safety being the overriding priority,” does that not suggest the culture is deficient? Yet if an assessment of the culture is performed, and the espoused beliefs and priorities are generally supportive of safety, what is to be made of those responses?
The reference material for this post comes from work led by the late Bernhard Wilpert of the Berlin University of Technology. (We will sample a variety of his work in the safety management area in future posts.) It is a brief slide presentation titled “Challenges and Opportunities of Assessing Safety Culture.” Slide 3, for example, revisits E. H. Schein’s multi-dimensional formulation of safety culture, which suggests that assessments must be able to expose all levels of culture and their integrated effect.
Two observations from these slides seem of particular note. They are both under Item 4, Methodological Challenges. The first observation is that culture is not a quantifiable phenomenon and does not lend itself easily to benchmarking. This bears consideration as most assessment methods being used today employ some statistical comparisons to assessments at other plants, including percentile type ranking. The other observation in the slide is that culture results from the learning experience of its members. This is of particular interest to us as it supports some of the thinking associated with a systems dynamics approach. A systems view involves the development of shared “mental models” of how safety management “works”; the goal being that individual actions and decisions can be understood within a commonly understood framework. The systems process becomes, in essence, the mechanism for translating beliefs into actions.
Link to slide presentation.
Thursday, August 13, 2009
Primer on System Dynamics
System Dynamics is an approach for seeing the world in terms of inputs and outputs, in which internal feedback loops and time delays can affect system behavior and lead to complex, non-linear changes in system performance.
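To make the idea concrete, here is a minimal stock-and-flow sketch in Python: a single backlog stock with a constant inflow and an outflow driven by a feedback loop that observes the stock with a time delay. All parameter values are invented for illustration; the delay alone is enough to produce the complex, oscillatory behavior described above.

```python
# Minimal stock-and-flow sketch: one stock (a backlog), a constant
# inflow, and an outflow set by a feedback loop that observes the
# stock with a time delay. Parameters are illustrative only.

DELAY = 8      # outflow responds to the backlog as seen DELAY steps ago
INFLOW = 5.0   # new items arriving per time step
GAIN = 0.2     # fraction of the (delayed) observed backlog resolved per step

def simulate(steps=120):
    backlog = 50.0
    history = [backlog]
    for t in range(steps):
        observed = history[max(0, t - DELAY)]   # stale observation
        outflow = GAIN * observed               # feedback response
        backlog = max(0.0, backlog + INFLOW - outflow)
        history.append(backlog)
    return history

history = simulate()
# The backlog does not settle at its equilibrium (INFLOW / GAIN = 25);
# the observation delay makes it repeatedly overshoot and undershoot.
```

Remove the delay (observe `history[t]` instead) and the same loop settles smoothly, which is the essential System Dynamics point: structure, not intent, drives behavior.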
The System Dynamics worldview was originally developed by Prof. Jay Forrester at MIT. Later work by other thinkers, e.g., Peter Senge, author of The Fifth Discipline, expanded the original concepts and made them available to a broader audience. An overview of System Dynamics can be found on Wikipedia.
Our NuclearSafetySim program uses System Dynamics to model managerial behavior in an environment where maintaining the nuclear safety culture is a critical element. NuclearSafetySim is built using isee Systems’ iThink software. isee Systems has educational materials available on its website that explain some basic concepts.
Thursday, August 6, 2009
Signs of a Reactive Organization (MIT #6)
The figure below illustrates how the number of problems/issues (we use the generic term "challenges" in NuclearSafetySim) might vary with time when the response is reactive. The blue line indicates the total number of issues, the pink line the number of new issues being identified, and the green line the resolution rate for issues, e.g., through a corrective action program. Note that the blue line initially increases and then oscillates while the pink line remains relatively constant. The oscillation derives from the management response, reflected in the green line: there is an initial delay in responding to an increased number of issues, then resolution rates are greatly increased to address the higher backlog, then reduced (due to budgetary pressures and other priorities) when the backlog starts to fall, precipitating another cycle of increasing issues.
Compare the oscillatory response above to the next figure where an increase in issues results immediately in higher resolution rates that are maintained over a period sufficient to return the system to a lower level of backlogs. In parallel, budgets are increased to address the underlying causes of issues, driving down the occurrence rate of new issues and ultimately bringing backlog down to a long-term sustainable level.
The last figure shows some of the ramifications of system management on safety culture and employee trust. The significant increase in the issues backlog initially leads to a degradation of employee trust (the pink line) and an erosion of safety culture (blue line). However, the nature and effectiveness of the management response in bringing down backlogs and reducing new issues reverses the trust trend line and rebuilds safety culture over time. Note that the red line, representing plant performance, is relatively unchanged over the same period, indicating that performance issues may exist under the cover of a consistently operating plant.
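The contrast between the reactive and responsive patterns described above can be sketched in a few lines of Python. This is an illustrative toy, not the NuclearSafetySim model itself; all parameters are invented, and the only difference between the two policies is how stale their view of the backlog is.

```python
# Sketch of the two management responses described above. A "reactive"
# policy acts on old backlog information; a "responsive" policy acts on
# the current backlog. Parameters are invented for illustration.

NEW_ISSUES = 5.0   # roughly constant identification rate (the pink line)
GAIN = 0.25        # resolution effort applied per unit of perceived backlog

def run(delay, steps=150):
    """Simulate the issues backlog when resolution effort is based on
    the backlog as it stood `delay` steps ago."""
    history = [40.0]
    for t in range(steps):
        perceived = history[max(0, t - delay)]
        resolved = GAIN * perceived
        history.append(max(0.0, history[-1] + NEW_ISSUES - resolved))
    return history

reactive = run(delay=10)    # stale information: backlog oscillates
responsive = run(delay=0)   # current information: backlog settles smoothly
# The responsive policy approaches its equilibrium (NEW_ISSUES / GAIN = 20);
# the reactive policy repeatedly overshoots and undershoots it.
```

Even in this toy form, the lesson matches the figures: the oscillation is produced by the structure of the response (the delay), not by the rate at which new issues arise.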
Tuesday, August 4, 2009
The Economist on Computer Simulation
The Economist has occasional articles on the practical applications of computer simulation. Following are a couple of items that have appeared in the last year.
Agent-based simulation is used to model the behavior of crowds. "Agent-based" means that each individual has some capacity to ascertain what is going on in the environment and act accordingly. This approach is being used to simulate the movement of people in a railroad station or during a building fire. On a much larger scale, each of the computer-generated orcs in the "Lord of the Rings" battle scenes moved independently based on its immediate surroundings.
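As a toy illustration of the agent-based idea (the rules and names are invented here, not taken from the models the article describes), each "agent" below knows only its own grid position and the exit location; it moves one cell toward the exit per tick and waits if the cell it wants is occupied.

```python
# Toy agent-based evacuation: agents act only on local information
# (own position, exit location, whether the next cell is free).
import random

EXIT = (0, 0)  # exit location on the grid

def toward_exit(pos):
    """One step toward the exit: decrement each positive coordinate."""
    x, y = pos
    return (x - (x > 0), y - (y > 0))

def evacuate(n_agents=20, size=10, max_ticks=200):
    """Return the number of ticks until every agent reaches the exit."""
    random.seed(1)
    agents = [(random.randrange(size), random.randrange(size))
              for _ in range(n_agents)]
    for tick in range(1, max_ticks + 1):
        occupied = set(agents)
        remaining = []
        for pos in agents:
            nxt = toward_exit(pos)
            if nxt == EXIT:                # agent leaves through the exit
                occupied.discard(pos)
                continue
            if nxt not in occupied:        # local rule: move only if free
                occupied.discard(pos)
                occupied.add(nxt)
                pos = nxt
            remaining.append(pos)
        agents = remaining
        if not agents:
            return tick
    return max_ticks
```

No agent has a global plan, yet collective patterns such as queuing near the exit emerge from the simple local rule, which is the essence of the approach.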
Link to article.
The second article is a brief review of simulation's use in business applications, including large-scale systems (e.g., an airline), financial planning, forecasting, process mapping and Monte Carlo analysis. This is a quick read on the ways simulation is used to illustrate and analyze a variety of complex situations.
Link to article.
Other informational resources that discuss simulation are included on our References page.
Monday, August 3, 2009
Reading List: Just Culture by Sidney Dekker
Question for nuclear professionals: Does your organization maintain a library of resources such as Just Culture or Diane Vaughan’s book, The Challenger Launch Decision, that provide deep insights into organizational performance and culture? Are materials like this routinely the subject of discussions in training sessions and topical meetings?
Thursday, July 30, 2009
“Reliability is a Dynamic Non-Event” (MIT #5)
What does this imply about the nuclear industry? Certainly we are in a period where the reliability of the plants is at a very high level and the NRC ROP indicator board is very green. Is this positive for maintaining high safety culture levels or does it represent a potential threat? It could be the latter since the biggest problem in addressing the safety implications of complacency in an organization is, well, complacency.
Wednesday, July 29, 2009
Self Preservation (MIT #4)
My sense is that the self-preservation effect is deeply embedded within the larger safety climate of the organization. In that climate, how strictly is rule adherence observed? Are procedures and processes of sufficient quality to encourage observance? If procedures and processes are ambiguous or even incorrect, and left uncorrected, is there tacit approval of alternate methods? The reality is that self-preservation can act in several directions – it may impel compliance, if that is truly the organizational ethic, or it may rationalize non-compliance, if that is the organizational expectation. Life is difficult.