Tuesday, February 23, 2016

The Dark Side of Culture Management: Functional Stupidity

Culture is the collection of values and assumptions that underlie organizational decisions and other actions.  We have long encouraged organizations to develop strong safety cultures (SC).  The methods available to do this are widely known, including visible leadership and role models; safety-related policies, practices and procedures; supportive structures like an Employee Concerns Program; the reward and recognition system; training and oversight; and regulatory carrots and sticks.

Because safety performance alone does not pay the bills, organizations also need to achieve their intended economic goals (i.e., be effective) and operate efficiently.  Most of the methods that can be used to promote SC can also be used to promote the overall performance culture.

What happens when the organization goes too far in shaping its culture to optimize performance?  One possibility, according to a 2012 Journal of Management Studies article*, is a culture of Functional Stupidity.  The Functional part means the organization meets its goals and operates efficiently and Stupidity “is an organizationally supported inability or unwillingness to mobilize one’s cognitive capacities.” (p. 1199)**

More specifically, to the extent that management, through its power and/or leadership, willfully shapes an organization’s value structure to achieve greater functionality (conformity, focus, efficiency, etc.), it may be, consciously or unconsciously, creating an environment where employees ask fewer questions (and no hard ones), seek fewer justifications for the organization’s decisions or actions, focus their intelligence only on the organization’s defined areas, do not reflect on their roles in the organization’s undertakings, and essentially go along with the program.  Strong leaders set the agenda and the true followers, well, they follow.

In the name of increased functionality, such actions can create a Weltanschauung that is narrowly focused and self-justifying.  It may result in soft biases, e.g., production over safety, or ignoring problematic aspects of a situation, e.g., Davis-Besse test and inspection reports.

Fortunately, as the authors explain, a self-correcting dynamic may occur.  Initially, improved functionality contributes to a sense of certainty about the organization’s and individuals’ places in the world, thus creating positive feedback.  But eventually the organization’s view of the world may increasingly clash with reality, creating dissonance (a loss of certainty) for the organization and the individuals who inhabit it.  As the gap between perception and reality grows, the overall system becomes less stable.  When people realize that description and reality are far apart, the organization’s, i.e., management’s, legitimacy collapses.

However, in the worst case “increasingly yawning gaps between shared assumptions and reality may eventually produce accidents or disasters.” (p. 1213)  Fukushima anyone?

Our Perspective

Management is always under the gun to “do better” when things are going well or “do something” when problems occur.  In the latter case, one popular initiative is to “improve” the culture, especially if a regulator is involved.  Although management’s intentions may be beneficent, there is an opportunity for invidious elements to be introduced and/or unintended consequences to occur.

Environmental factors can encourage stupidity.  For example, quarterly financial reporting, an ever-shortening media cycle and the global reach of the Internet (especially its most intellectually challenged component, the Twitterverse) pressure executives to project command of their circumstances and certainty about their comprehension, even if they lack adequate (or any) relevant data.

The nuclear industry is not immune to functional stupidity.  Not to put too fine a point on it, but the industry's penchant for secrecy creates an ideal Petri dish for the cultivation of stupidity management.

The authors close by saying “we hope to prompt wider debate about why it is that smart organizations can be so stupid at times.” (p. 1216)  For a long time we have wondered about that ourselves.

*  M. Alvesson and A. Spicer, “A Stupidity-Based Theory of Organizations,” Journal of Management Studies 49:7 (Nov. 2012), pp. 1194-1220.  Thanks to Madalina Tronea for publicizing this document.  Dr. Tronea is the founder/moderator of the LinkedIn Nuclear Safety Culture discussion group.

**  Following are additional definitions, with italics added, “functional stupidity is inability and/or unwillingness to use cognitive and reflective capacities in anything other than narrow and circumspect ways.” (p. 1201)  “stupidity management . . . involves the management of consciousness, clues about how to understand and relate to the world, . . .” (p. 1204)  “stupidity self-management  [consists of] the individual putting aside doubts, critique, and other reflexive concerns and focusing on the more positive aspects of organizational life which are more clearly aligned with understandings and interpretations that are officially sanctioned and actively promoted.” (p. 1207)


  1. Elementary Failures

    An inescapable fact is that the competent investigation of every harmful event reveals that the causation of the harm includes the failure to apply elementary principles of design, human factors, human behavior technology, engineering, science, operations, communications, administration, quality, regulatory compliance, and/or management. Often there is a contemptuous/dismissive/arrogant/ignorant disregard for even needing to know what these elementary principles are, much less flowing them down to where they might need to be applied.

    “They were all so busy cutting wood that they never had time to sharpen the saw.”

    “Their degrees did not reflect their knowledge.”

    “Real men do not need elementary principles.” - Overheard before the fiasco.

  2. Every nuclear enterprise institution responsible for complex, high-consequence, high-capital-cost built artifacts faces a problem: naturally occurring forces of obsolescence, occasioned by continually evolving circumstances, are eroding the basis for confidence in the adequacy of protection margins.

    For most of the commercial nuclear energy era there has existed a substantial commitment to Quality Assurance. QA and its subordinate element QC have served admirably to ensure that new capital plant entering service is highly fit for service.

    But what we learned the hard way, starting even before TMI, is that Human Capital capacity contains more degrees of variability and demonstrates more rapid decay paths than the highly engineered and carefully realized bases of the physical plant.

    Put simply, engineered Reliability is more easily gained and obsolescence forestalled than is the case with encultured Resilience. Continual Betterment of current performance is a more formidable challenge than QA of the as-built plant and its technical basis.

    The industry would not be on the ropes were there any acceptance of the need for two distinct (and conceptually orthogonal) modes of governance and institutional sustainment.

  3. Thanks, Lewis.

    Functional Stupidity
    March 19, 2016

    The term “functional stupidity” is seen more and more. It can have more than one plain English meaning.

    One: Functional stupidity can be stupidity that is functional, i.e., stupidity that is or is capable of functioning, accomplishing a function.

    I used to deal with an effective manager who used functional stupidity as a tactic. In a contentious discussion he would say something obviously stupid to prompt others to counteract his statement with intelligent criticism and options.

    Two: Functional stupidity can be stupidity regarding function, functioning, and/or functions. This would include being stupid about how one or more functions are accomplished.

    The Three Mile Island Accident could be attributed to functional stupidity by noting that the operator who shut off the safety injection pumps had a flawed understanding of the function of the pumps.

    A problem with the term “functional stupidity” is that the word “stupid” is always pejorative. It is vague. It is subjective. Many professionals seem to avoid the term in favor of less pejorative, more specific, more objective terms.

    The term “stupidity” does not appear frequently in official documentation of adverse event investigations. I don’t recall seeing it ever. I do not use the term.

    I am inclined to go by the “principle of rationality” that attributes actions and inactions, in part, to people’s perceptions of how to achieve goals. I’m also fond of the notion of “local rationality” that recognizes the importance of the details of the local situation.

    The “principle of rationality” is sometimes associated with the name Karl R. Popper. “Local rationality” is associated with the names James Reason, David Woods, and others.

  4. My guess is Prof. Alvesson deliberately chose the term "stupidity" because it is pejorative. His article does not describe internal organizational conditions of which either the organization or its members should be proud.

    If one refers to the source article, it appears that Alvesson and Spicer are well aware of local or bounded rationality concepts. But I think they are trying to describe something more insidious, viz., deliberate dumbing down in an organization.

    I recoiled when I first saw "stupidity" and almost rejected the article out of hand. But after reading it, I believe they are attempting to describe some new conditions where "smart people do dumb things." It would probably go down easier if they had chosen a more neutral label.

  5. One problem is that the functionality of “functional stupidity” becomes dysfunctionality, as in the case of Fukushima, in which much money was saved by not designing the site for the tsunamis that would accompany the earthquakes to which the site was designed. This is as stupid as not designing a site for the flooding that is known to accompany a hurricane to which the site is designed. It is as stupid as designing for a certain fire, but not for the atmospheric pollutants that are known to be produced by it.

  6. At San Onofre the plant staff used thermography to verify the integrity of unenergized junctions. All unenergized junctions will look cold to thermography. Which type of functional stupidity is this?

    At Davis-Besse the plant staff “inspected” the reactor vessel head without cleaning it to bare metal thus 1) meeting ALARA goals, 2) not extending the outage, and 3) not seeing the boric acid leaks. Which type of functional stupidity is this?

    Even when functional stupidity is functional the functionality is often bounded, i.e., beyond some spatial, topical, or time application it becomes dysfunctional.


Thanks for your comments. We read them all. We would like to display them under their respective posts on our main page but that is not how Blogger works.