Friday, August 28, 2009
Professor Wilpert emphasized the interaction of human, technological, and organizational dynamics. His tools for human factors event analysis have become standard practice in German and Swiss nuclear plants. He is the author of several leading books including Safety Culture in Nuclear Power Operations; System Safety: Challenges and Pitfalls of Intervention; Emerging Demands for the Safety of Nuclear Power Operations: Challenge and Response; and Nuclear Safety: A Human Factors Perspective.
Professor Wilpert was also a principal contributor to the LearnSafe project conducted in Europe from 2001 to 2004. See the following link for information about the project team and its results, and look for future posts here on the LearnSafe research.
Link to LearnSafe project.
Wednesday, August 26, 2009
First, what can or should one conclude about the overall state of safety culture in this organization given these results? One wonders whether, if these results were shown to a number of experts, their interpretations would be consistent, or whether they would even be willing to associate the results with a finding. As discussed in a prior post, this issue is fundamental to the nature of safety culture: whether it is amenable to direct measurement, and whether assessment results really say anything about the safety health of the organization.
But the more particular question for this post is whether an assessment can detect complacency in an organization and its potential for latent risk to the organization’s safety performance. In a post dated July 30, 2009 I referred to the problems presented by complacency, particularly in organizations experiencing few operational challenges. That environment can be ripe for a weak culture to develop or be sustained. Could that environment also bias the responses to assessment questions, reinforcing the incorrect perception that safety culture is healthy? This type of situation may be most relevant in today’s nuclear industry, where the vast majority of plants are operating at high capacity factors and experiencing few significant operational events. It is not clear to this commentator that assessments can be designed to detect complacency explicitly, and even using assessment results in conjunction with other data (data likely to look normal when overall performance is good) may not credibly raise an alarm.
Link to NEI presentation.
Monday, August 24, 2009
To illustrate some of the issues I will use an NEI presentation made to the NRC on February 3, 2009. On Slide 2 there is a statement that the USA methodology (for safety culture surveys and assessments) has been used successfully for five years. One question is what it means for an assessment to have been successful. The intent is not to pick on this particular methodology but to open the question of exactly what the expected result of performing an assessment is.
It may be that “successful” means that the organizations being assessed have found the process and results to be useful or interesting, e.g., by stimulating discussion or furthering exploration of issues associated with the results. There are many, myself included, who believe anything that stimulates an organization to discuss and contemplate safety management issues is beneficial. On the other hand, it may be that organizations (and regulators?) believe assessments are successful because they can use the results to determine that a safety culture is “acceptable” or “strong” or “needs improvement”. Can assessments really carry the weight of this expectation? Or is a rose just a rose?
Slide 11 highlights these questions by indicating a validation of the assessment methodology is to be carried out. “Validation” seems to suggest that assessments mean something beyond their immediate results. It may also suggest that assessment results can be compared to some “known” value to determine whether the assessment accurately measured or predicted that value. We will have to wait and see what is intended and how the validation is performed. At the same time we will be keeping in mind the observation of Professor Wilpert in my post of August 17, 2009 that “culture is not a quantifiable phenomenon”.
Link to presentation.
Monday, August 17, 2009
One question that frequently comes to mind is, can safety culture be separated from the manifestation of culture in terms of the specific actions and decisions taken by an organization? For example, if an organization makes some decisions that are clearly at odds with “safety being the overriding priority”, can the culture of the organization not be deficient? But if an assessment of the culture is performed, and the espoused beliefs and priorities are generally supportive of safety, what is to be made of those responses?
The reference material for this post comes from some work led by the late Bernhard Wilpert of the Berlin University of Technology. (We will sample a variety of his work in the safety management area in future posts.) It is a brief slide presentation titled, “Challenges and Opportunities of Assessing Safety Culture”. Slide 3, for example, revisits E. H. Schein’s multi-dimensional formulation of safety culture, which suggests that assessments must be able to expose all levels of culture and their integrated effect.
Two observations from these slides seem of particular note. They are both under Item 4, Methodological Challenges. The first observation is that culture is not a quantifiable phenomenon and does not lend itself easily to benchmarking. This bears consideration as most assessment methods in use today employ statistical comparisons to assessments at other plants, including percentile-type rankings. The other observation on the slide is that culture results from the learning experience of its members. This is of particular interest to us as it supports some of the thinking associated with a systems dynamics approach. A systems view involves the development of shared “mental models” of how safety management “works”; the goal is that individual actions and decisions can be understood within a commonly understood framework. The systems process becomes, in essence, the mechanism for translating beliefs into actions.
Link to slide presentation.
Thursday, August 13, 2009
System Dynamics is an approach to understanding how complex systems change over time, in which internal feedback loops and time delays can affect system behavior and lead to complex, non-linear changes in system performance.
The System Dynamics worldview was originally developed by Prof. Jay Forrester at MIT. Later work by other thinkers, e.g., Peter Senge, author of The Fifth Discipline, expanded the original concepts and made them available to a broader audience. An overview of System Dynamics can be found on Wikipedia.
Our NuclearSafetySim program uses System Dynamics to model managerial behavior in an environment where maintaining the nuclear safety culture is a critical element. NuclearSafetySim is built using isee Systems’ iThink software. isee Systems has educational materials available on its website that explain some basic concepts.
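To make the feedback-and-delay idea concrete, here is a minimal sketch in Python (with illustrative names and parameters of our own choosing, not taken from NuclearSafetySim or iThink) of a goal-seeking stock whose corrective flow responds to a delayed perception of the stock. The delay alone is enough to produce overshoot and oscillation:

```python
# Minimal System Dynamics sketch: a goal-seeking stock with a perception
# delay. All names and parameter values are illustrative assumptions.

def simulate(goal=100.0, delay=8.0, gain=0.5, steps=200, dt=0.25):
    stock = 0.0
    perceived = 0.0  # delayed perception of the stock
    history = []
    for _ in range(steps):
        # first-order information delay: perception lags the true stock
        perceived += (stock - perceived) / delay * dt
        # corrective flow responds to the *perceived* gap, not the true one
        flow = gain * (goal - perceived)
        stock += flow * dt
        history.append(stock)
    return history

trajectory = simulate()  # overshoots the goal, then oscillates around it
```

Shortening the perception delay damps the oscillation; lengthening it makes the swings larger. That is the classic System Dynamics point: the structure of the loop, not an external shock, drives the behavior.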
Thursday, August 6, 2009
The figure below illustrates how the number of problems/issues (we use the generic term "challenges" in NuclearSafetySim) might vary with time when the response is reactive. The blue line indicates the total number of issues, the pink line the number of new issues being identified and the green line, the resolution rate for issues, e.g., through a corrective action program. Note that the blue line initially increases and then oscillates, while the pink line is relatively constant. The oscillation derives from the management response, reflected in the green line: there is an initial delay in responding to an increased number of issues; resolution rates are then greatly increased to address the higher backlog; and they are reduced again (due to budgetary pressures and other priorities) when the backlog starts to fall, precipitating another cycle of increasing issues.
Compare the oscillatory response above to the next figure where an increase in issues results immediately in higher resolution rates that are maintained over a period sufficient to return the system to a lower level of backlogs. In parallel, budgets are increased to address the underlying causes of issues, driving down the occurrence rate of new issues and ultimately bringing backlog down to a long-term sustainable level.
The last figure shows some of the ramifications of system management on safety culture and employee trust. The significant increase in the issue backlog initially leads to a degradation of employee trust (the pink line) and an erosion of safety culture (blue line). However, the nature and effectiveness of the management response in bringing down backlogs and reducing new issues reverses the trust trend line and rebuilds safety culture over time. Note that the red line, representing plant performance, is relatively unchanged over the same period, indicating that performance issues may exist under the cover of a consistently operating plant.
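The reactive-versus-proactive contrast described above can be sketched in a few lines of simulation code. This is a deliberately simplified model with assumed parameters, not NuclearSafetySim itself: new issues arrive at a constant rate, the resolution rate adjusts toward a level proportional to the backlog, and the only difference between the two cases is how quickly management makes that adjustment:

```python
# Simplified sketch of the backlog dynamic described above (assumed
# parameters; not the NuclearSafetySim model itself).

def simulate_backlog(response_delay, new_rate=10.0, gain=0.3,
                     steps=400, dt=0.25):
    backlog = 50.0      # open issues at the start
    resolution = 0.0    # issues resolved per unit time
    history = []
    for _ in range(steps):
        target = gain * backlog  # resolution effort management aims for
        # the actual resolution rate lags its target by response_delay
        resolution += (target - resolution) / response_delay * dt
        backlog = max(backlog + (new_rate - resolution) * dt, 0.0)
        history.append(backlog)
    return history

reactive = simulate_backlog(response_delay=10.0)  # slow management response
proactive = simulate_backlog(response_delay=1.0)  # prompt response
```

With the slow response the backlog climbs well above its starting point and then cycles, as in the first figure; with the prompt response it peaks early and settles smoothly toward a sustainable level, as in the second.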
Tuesday, August 4, 2009
The Economist has occasional articles on the practical applications of computer simulation. Following are a couple of items that have appeared in the last year.
Agent-based simulation is used to model the behavior of crowds. "Agent-based" means that each individual has some capacity to ascertain what is going on in the environment and act accordingly. This approach is being used to simulate the movement of people in a railroad station or during a building fire. On a much larger scale, each of the computer-generated orcs in the "Lord of the Rings" battle scenes moved independently based on its immediate surroundings.
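A toy version of the idea is easy to sketch (an illustrative Python example of our own, not drawn from the article): each agent heads for an exit but senses only its immediate surroundings, slowing down when neighbors crowd it.

```python
# Toy agent-based crowd sketch. All names and parameters are illustrative
# assumptions; real crowd models are far more elaborate.
import random

def evacuate(n_agents=30, sense_radius=1.0, base_speed=1.0,
             dt=0.1, max_steps=20000):
    """Agents start spread along a corridor and walk toward the exit at x = 0."""
    rng = random.Random(42)  # fixed seed for a repeatable run
    positions = [rng.uniform(5.0, 25.0) for _ in range(n_agents)]
    for step in range(max_steps):
        remaining = [p for p in positions if p > 0.0]
        if not remaining:
            return step  # everyone has reached the exit
        moved = []
        for i, p in enumerate(remaining):
            # each agent "sees" only neighbors within its sensing radius
            neighbors = sum(1 for j, q in enumerate(remaining)
                            if j != i and abs(q - p) < sense_radius)
            speed = base_speed / (1 + neighbors)  # crowding slows movement
            moved.append(p - speed * dt)
        positions = moved
    return None  # did not finish within max_steps

steps_taken = evacuate()
```

Even this crude rule reproduces a qualitative crowd effect: where agents bunch up, local movement slows, so total evacuation time depends on the emerging pattern of congestion rather than on any single agent's plan.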
Link to article.
The second article is a brief review of simulation's use in business applications, including large-scale systems (e.g., an airline), financial planning, forecasting, process mapping and Monte Carlo analysis. This is a quick read on the ways simulation is used to illustrate and analyze a variety of complex situations.
Link to article.
Other informational resources that discuss simulation are included on our References page.
Monday, August 3, 2009
Question for nuclear professionals: Does your organization maintain a library of resources such as Just Culture or Dianne Vaughan’s book, The Challenger Launch Decision, that provide deep insights into organizational performance and culture? Are materials like this routinely the subject of discussions in training sessions and topical meetings?