Friday, December 1, 2017

Nuclear Safety Culture: Focus on Decision Making



McKinsey Five Fifty cover
We have long held that decision making (DM) is a key artifact reflecting the state of a nuclear organization’s safety culture.

The McKinsey Quarterly (MQ) has packaged a trio of articles* on DM.  Their first purpose is identifying and countering the different biases that lead to sub-optimal, even disastrous decisions.  (When specific biases are widely spread in an organization, they are part of its culture.)  A second purpose is to describe the attributes of more fair, robust and effective DM processes.  The articles’ specific topics are (1) the behavioral science that underlies DM, (2) a method for categorizing and processing decisions and (3) a case study of a major utility that changed its decision culture. 

“The case for behavioral strategy” (MQ, March 2010)

This article covers the insights from psychology that can be used to fashion a robust DM process.  The authors support the need for process improvement with their survey research, which showed that over 50 percent of the variability in decision results (i.e., performance) was determined by the quality of the DM process, while less than 10 percent was attributable to the quality of the underlying analysis. 

There are plenty of cognitive biases that can affect human DM.  The authors discuss several of them, and strategies for counteracting them, as summarized below.

  • False pattern recognition (e.g., saliency (overweighting recent or memorable events), confirmation, inaccurate analogies): Require alternative explanations for the data in the analysis, articulate participants’ relevant experiences (which can reveal the basis for their biases), identify similar situations for comparative analysis.
  • Bias for action: Explicitly consider uncertainty in the input data and the possible outcomes.
  • Stability (anchoring to an initial value, loss aversion): Establish stretch targets that can’t be achieved by business as usual.
  • Silo thinking: Involve a diverse group in the DM process and define specific decision criteria before discussions begin.
  • Social (conformance to group views): Create genuine debate through a diverse set of decision makers, a climate of trust and depersonalized discussions.


The greatest problem arises from biases that create repeatable patterns that become undesirable cultural traits.  DM process designers must identify the types of biases that arise in their organization’s DM, specify debiasing techniques that will work in their organization, and embed those techniques in formal DM procedures.
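One way to embed debiasing techniques in a formal DM procedure is a pre-decision checklist keyed to the bias families described above.  The sketch below is purely illustrative; the prompts paraphrase the authors' countermeasures and are not official McKinsey text.

```python
# Hypothetical pre-decision checklist keyed to the bias families discussed above.
# The prompts paraphrase the countermeasures from the article; none are verbatim.

DEBIAS_CHECKLIST = {
    "false pattern recognition": "Have we required alternative explanations for the data?",
    "bias for action": "Have we made uncertainty in inputs and outcomes explicit?",
    "stability": "Do our targets exceed what business-as-usual would deliver?",
    "silo thinking": "Were decision criteria fixed, with a diverse group, before debate began?",
    "social": "Did dissenting views get a depersonalized hearing?",
}

def checklist_for(biases: list[str]) -> list[str]:
    """Return the prompts a DM procedure would force before a decision is finalized."""
    return [DEBIAS_CHECKLIST[b] for b in biases if b in DEBIAS_CHECKLIST]

print(len(checklist_for(["silo thinking", "social"])))  # 2
```

The point of making the mapping explicit is that a decision can't be approved until each applicable prompt has a documented answer, which is exactly the kind of formal procedure the authors have in mind.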

An attachment to the article identifies and defines 17 specific biases.  Much of the seminal research on DM biases was performed by Daniel Kahneman who received a Nobel prize for his efforts.  We have reviewed Prof. Kahneman’s work on Safetymatters; see our Nov. 4, 2011 and Dec. 18, 2013 posts or click on the Kahneman label. 

“Untangling your organization’s decision making” (MQ, June 2017)

While this article is aimed at complex, global organizations, there are lessons here for nuclear organizations (typically large bureaucracies) because all organizations have become victims of communications overload, with too many meetings and low-value e-mail threads distracting members from making good decisions.

The authors posit four types of decisions an organization faces, plotted on a 2x2 matrix (the consultant’s best friend) with scope and impact (broad or narrow) on one axis and level of familiarity (infrequent or frequent) on the other.  A different DM approach is proposed for each quadrant. 

Big-bet decisions are infrequent and have broad impact.  Recommendations include (1) ensure there’s an executive sponsor, (2) break down the mega-decision into manageable parts for analysis (and reassemble them later), (3) use a standard DM approach for all the parts and (4) establish a mechanism to track effectiveness during decision implementation.

The authors observe that some decisions turn out to be “bet the company” ones without being recognized as such.  There are examples of this in the nuclear industry.  For details, see our June 18, 2013 post on Kewaunee (had only an 8-year PPA), Crystal River (tried to cut through the containment using in-house expertise) and SONGS (installed replacement steam generators with an unproven design). 

Cross-cutting decisions are more frequent and have broad impact.  Some decisions at a nuclear power plant fall into this category.  They need to have the concurrence and support of the Big 3 stakeholders (Operations, Engineering and Maintenance).  Silo attitudes are an omnipresent threat to success in making these kinds of decisions.  The key is to get the stakeholders to agree on the main process steps and capture them in a plain-English procedure that specifies the calendar, handoffs and decisions.  Governing policy should establish the DM bodies and their authority, and define shared performance metrics to measure success. 

Delegated decisions are frequent and low-risk.  They can be effectively handled by an individual or working team, with limited input from others.  The authors note “The role-modeling of senior leaders is invaluable, but they may be reluctant” to delegate.  We agree.  In our experience, many nuclear managers were hesitant to delegate as many decisions as they could have to subordinates.  Their fear of being held accountable for a screw-up was just too great.  However, their goal should have been to delegate all decisions except those for which they alone had the capabilities and accountability.  Subordinates need appropriate training and explicit authority to make their decisions and they need to be held accountable by higher-level managers.  The organization needs to establish a clear policy defining when and how a decision should be elevated to a more senior decision maker. 

Ad hoc decisions are infrequent and low-risk; they were deliberately omitted from the article. 
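The four quadrants above amount to a simple classifier over two binary attributes.  The sketch below is our own illustration of that taxonomy, not code from the article; the routing comments paraphrase the authors' recommendations.

```python
# Hypothetical sketch of the McKinsey 2x2 decision taxonomy described above.
# The quadrant names follow the article; the class and field names are our invention.

from dataclasses import dataclass

@dataclass
class Decision:
    broad_impact: bool   # broad vs. narrow scope and impact
    frequent: bool       # familiar/frequent vs. infrequent

def classify(d: Decision) -> str:
    """Map a decision to its quadrant, which determines the DM approach."""
    if d.broad_impact and not d.frequent:
        return "big-bet"        # executive sponsor, decompose, track implementation
    if d.broad_impact and d.frequent:
        return "cross-cutting"  # stakeholder concurrence, plain-English process
    if not d.broad_impact and d.frequent:
        return "delegated"      # individual or team, with a clear escalation policy
    return "ad hoc"             # infrequent, low-risk

print(classify(Decision(broad_impact=True, frequent=False)))  # big-bet
```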

“A case study in combating bias” (MQ, May 2017)

This is an interview with a senior executive of a German utility that invested €10 billion in conventional power projects, investments that failed when the political-economic environment evolved in a direction opposite to their assumptions.  In their postmortem, they realized they had succumbed to several cognitive biases, including status quo, confirmation, champion and sunflower.  The sunflower bias (groups aligning with their leaders) stretched far down the organizational hierarchy so lower-level analysts didn’t dare to suggest contrary assumptions or outcomes.

The article describes how the utility made changes to their DM practices to promote awareness of biases and implement debiasing techniques, e.g., one key element is officially designated “devil’s advocates” in DM groups.  Importantly, training emphasizes that biases are not some personal defect but “just there,” i.e., part of the culture.  The interviewee noted that the revised process is very time-intensive so it is utilized only for the most important decisions facing each user group. 

Our Perspective 

The McKinsey content describes executive level, strategic DM but many of the takeaways are equally applicable to decisions made at the individual, department and inter-department level, where a consistent approach is perhaps even more important in maintaining or improving organizational performance.

The McKinsey articles come in one of their Five Fifty packages, with a summary you can review in five minutes and the complete articles that may take fifty minutes total.  You should invest at least the smaller amount.


*  “Better Decisions,” McKinsey Quarterly Five Fifty.  Retrieved Nov. 28, 2017.

Tuesday, November 21, 2017

Any Lessons for Nuclear Safety Culture from VW’s Initiative to Improve Its Compliance Culture?

VW Logo (Source: Wikipedia)
The Wall Street Journal (WSJ) recently published an interview* with the head of the new compliance department in Volkswagen’s U.S. subsidiary.  The new executive outlined the department’s goals and immediate actions related to improving VW’s compliance culture.  They will all look familiar to you, including a new organization (headed by a former consultant) reporting directly to the CEO and with independent access to the board; mandatory compliance training; a new code of conduct; and developing a questioning attitude among employees.  One additional attribute deserves a brief expansion.  VW aims to improve employees’ decision making skills.  We’re not exactly sure what that means but if it includes providing more information about corporate policies and legal, social and regulatory expectations (in other words, the context of decisions) then we approve.

Our Perspective 


These interventions could be from a first generation nuclear safety culture (NSC) handbook on efforts to demonstrate management interest and action when a weak culture is recognized.  Such activities are necessary but definitely not sufficient to strengthen culture.  Some specific shortcomings follow.

First, the lack of reflection.  When asked about the causes of VW’s compliance failures, the executive said “I can’t speculate on the failures . . .”  Well, she should have had something to say on the matter, even party line bromides.  We’re left with the impression she doesn’t know, or care, about the specific and systemic causes of VW’s “Dieselgate” problems that are costing the company tens of billions of dollars.  After all, this interview was in the WSJ, available to millions of critical readers, not some trade rag.

Second, the trust issue.  VW wants employees who can be trusted by the organization, presumably to do “the right thing” as they go about their business.  That’s OK but it’s even more important to have senior managers who can be trusted to do the right thing.  This is especially relevant for VW because it’s pretty clear the cheating problems were tolerated, if not explicitly promoted, by senior management; in other words, there was a top-down issue in addition to lower-level employee malfeasance.

Next, the local nature of the announced interventions.  The new compliance department is for VW-USA only.  The Volkswagen Group of America includes one assembly plant, sales and maintenance support functions, test centers and VW’s consumer finance entity.  It’s probably safe to say that VW’s most important decisions regarding corporate practices and product engineering are made in Wolfsburg, Lower Saxony and not Herndon, Virginia.

Finally, the elephant in the room.  There is no mention of VW’s employee reward and recognition system or the senior management compensation program.  We have long argued that employees focus on actions that will secure their jobs (and perhaps lead to promotions) while senior managers focus on what they’re being paid to accomplish.  For the latter group in the nuclear industry, that’s usually production with safety as a should-do but with little, if any, money attached.  We don’t believe VW is significantly different.

Bottom line: If this WSJ interview is representative of the auto industry’s understanding of culture, then once again nuclear industry thought leaders have a more sophisticated and complete grasp of cultural dynamics and nuances.

We have commented before on the VW imbroglio.  See our Dec. 20, 2015 and May 31, 2016 posts or click on the VW label.


*B. DiPietro, “Working to Change Compliance Culture at Volkswagen,” Wall Street Journal (Nov. 16, 2017).

Monday, October 30, 2017

Nuclear Safety Culture Under Assault: DNFSB Chairman Proposes Eliminating the Board


DNFSB headquarters
The Center for Public Integrity (CPI) recently published a report* that disclosed a private letter** from Sean Sullivan, the Chairman of the Defense Nuclear Facilities Safety Board (DNFSB) to the Director of the Office of Management and Budget in which the chairman proposed abolishing or downsizing the DNFSB.  The CPI is highly critical of the chairman’s proposals; support for their position includes a list of the safety improvements in the Department of Energy (DOE) complex that have resulted from DNFSB recommendations and the safety challenges that DOE facilities continue to face.

The CPI also cites a 2014 National Nuclear Security Administration (NNSA, the DOE sub-organization that oversees the nuclear weapons facilities) internal report that describes NNSA’s own safety culture weaknesses, e.g., lack of a questioning attitude toward contractor management’s performance claims, with respect to its oversight of the Los Alamos National Laboratory.

The CPI believes the chairman is responding to pressure from the private contractors who actually manage DOE facilities to reduce outside interference in, and oversight of, contractor activities.  That’s certainly plausible.  The contractors get paid regardless of their level of performance, and very little of that pay is tied to safety performance.  DNFSB recommendations and reports can be thorns in the sides of contractor management.

The Sullivan Letter

The primary proposal in the Sullivan letter is to abolish the DNFSB because the DOE has developed its own “robust regulatory structure” and oversight capabilities via the Office of Enterprise Assessments.  That’s a hollow rationale; the CPI report discusses the insufficiency of DOE’s own assessments.  If outright elimination is not politically doable then DNFSB personnel could be transferred to DOE, sustaining the appearance of independent oversight, and then be slowly absorbed into the larger DOE organization.  That is not a path to increased public confidence and looks like being assimilated by the Borg.***  The savings that could be realized from abolishing the DNFSB are estimated at $31 million, a number lost in the decimal dust of DOE’s $30+ billion budget.

Sullivan mentions but opposes transferring the DNFSB’s oversight responsibilities to the Nuclear Regulatory Commission.  Why?  Because the NRC is not only independent, it has enforcement powers which would be inappropriate for defense nuclear facilities and might compromise national security.  That’s a red herring but we’ll let it go; we don’t think oversight of defense facilities really meshes with the NRC’s mission.

His secondary proposal is to downsize the DNFSB workforce, especially its management structure, and transfer most of the survivors to specific defense facilities.  While we think DNFSB needs more resources, not fewer, it would be better if more DNFSB personnel were located in the field, keeping track of and reporting on DOE and contractor activities.

Our Perspective

Safetymatters first became interested in the DNFSB when we saw the growing mess at the Waste Treatment Plant (WTP, aka the Vit Plant) in Hanford, WA.  It was the DNFSB who forced the DOE and its WTP contractors to confront and remediate serious nuclear safety culture (NSC) problems.  We have published multiple reports on the resultant foot-dragging by DOE in its responses to DNFSB Recommendation 2011-1 which addressed safety conscious work environment (SCWE) problems at Hanford and other DOE facilities.  Click on the DOE label to see our offerings on WTP, other DOE facilities and the overall DOE complex.
 
We have reported on the NSC problems at the Waste Isolation Pilot Plant (WIPP) in New Mexico.  The DNFSB has played an important role in attempting to get DOE and the WIPP contractor to strengthen their safety practices.  Click the WIPP label to see our WIPP-related posts. 

We have also covered a report on the DNFSB’s own organizational issues, including board members’ meddling in day-to-day activities, weak leadership and too-frequent organizational changes.  See our Feb. 6, 2015 post for details.

DNFSB’s internal issues notwithstanding, the board plays an indispensable role in strengthening NSC and safety practices throughout the DOE complex.  They should be given greater authority (which won’t happen), stronger leadership and additional resources.

Bottom line: Sullivan’s proposal is just plain nuts.  He’s a Republican appointee so maybe he’s simply offering homage to his ultimate overlord.
  

*  P. Malone and R.J. Smith, “GOP chair of nuclear safety agency secretly urges Trump to abolish it,” The Center for Public Integrity (Oct. 19, 2017).  Retrieved Oct. 26, 2017.

**  S. Sullivan (DNFSB) to J.M. Mulvaney (Management and Budget), no subject specified but described as an “initial high-level draft of [an] Agency Reform Plan” (June 29, 2017).  Available from the CPI in html and pdf format.  Retrieved Oct. 26, 2017.

***  The Borg is an alien group entity in Star Trek that forcibly assimilates other beings.  See Wikipedia for more information.

Monday, October 16, 2017

Nuclear Safety Culture: A Suggestion for Integrating “Just Culture” Concepts

All of you have heard of “Just Culture” (JC).  At heart, it is an attitude toward investigating and explaining errors that occur in organizations in terms of “why” an error occurred, including systemic reasons, rather than focusing on identifying someone to blame.  How might JC be applied in practice?  A paper* by Shem Malmquist describes how JC concepts could be used in the early phases of an investigation to mitigate cognitive bias on the part of the investigators.

The author asserts that “cognitive bias has a high probability of occurring, and becoming integrated into the investigators subconscious during the early stages of an accident investigation.” 

He recommends that, from the get-go, investigators categorize each pertinent action that preceded the error as an error (unintentional act), at-risk behavior (intentional but for a good reason) or reckless (conscious disregard of a substantial risk or intentional rule violation). (p. 5)  For errors or at-risk actions, the investigator should analyze the system, e.g., policies, procedures, training or equipment, for deficiencies; for reckless behavior, the investigator should determine what system components, if any, broke down and allowed the behavior to occur. (p. 12)  Individuals should still be held responsible for deliberate actions that resulted in negative consequences.
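The triage step just described is essentially a small decision rule.  The sketch below is our own rendering of it, with function and parameter names of our invention; only the three category labels come from Malmquist's paper.

```python
# Hypothetical sketch of the Just Culture triage step described above.
# Category names follow the paper; parameter names and wording are illustrative.

def jc_triage(intentional: bool, good_faith_reason: bool, disregarded_risk: bool) -> tuple[str, str]:
    """Return (category, investigative focus) for one pertinent action."""
    if not intentional:
        return ("error", "analyze the system (policies, procedures, training, equipment) for deficiencies")
    if good_faith_reason and not disregarded_risk:
        return ("at-risk behavior", "analyze the system: why did the workaround make sense to the actor?")
    return ("reckless", "determine what system components, if any, broke down and allowed the behavior")

# An unintentional slip points the investigation at the system, not the person.
print(jc_triage(intentional=False, good_faith_reason=False, disregarded_risk=False)[0])  # error
```

Forcing each action through this categorization before chain-building is what keeps the investigator's initial hunches from steering the whole inquiry.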

Adding this step to a traditional event chain model will enrich the investigation and help keep investigators from going down the rabbit hole of following chains suggested by their own initial biases.

Because JC is added to traditional investigation techniques, Malmquist believes it might be more readily accepted than other approaches for conducting more systemic investigations, e.g., Leveson’s System Theoretic Accident Model and Processes (STAMP).  Such approaches are complex, require lots of data and implementing them can be daunting for even experienced investigators.  In our opinion, these models usually necessitate hiring model experts who may be the only ones who can interpret the ultimate findings—sort of like an ancient priest reading the entrails of a sacrificial animal.  Snide comment aside, we admire Leveson’s work and reviewed it in our Nov. 11, 2013 post.

Our Perspective

This paper is not some great new insight into accident investigation but it does describe an incremental step that could make traditional investigation methods more expansive in outlook and robust in their findings.

The paper also provides a simple introduction to the works of authors who cover JC or decision-making biases.  The former category includes Reason and Dekker and the latter one Kahneman, all of whom we have reviewed here at Safetymatters.  For Reason, see our Nov. 3, 2014 post; for Dekker, see our Aug. 3, 2009 and Dec. 5, 2012 posts; for Kahneman, see our Nov. 4, 2011 and Dec. 18, 2013 posts.

Bottom line: The parts describing and justifying the author’s proposed approach are worth reading.  You are already familiar with much of the contextual material he includes.  


*  S. Malmquist, “Just Culture Accident Model – JCAM” (June 2017).

Friday, October 6, 2017

WANO and NEA to Cooperate on Nuclear Safety Culture

World Nuclear News Oct. 4, 2017
According to an item* in World Nuclear News, the World Association of Nuclear Operators (WANO) and the Organisation for Economic Co-operation and Development’s Nuclear Energy Agency (NEA) signed a memorandum of understanding to cooperate on "the further development of approaches, practices and methods in order to proactively strengthen global nuclear safety."

One objective is to “enhance the common understanding of nuclear safety culture challenges . . .”  In addition, the parties have identified safety culture (SC) as a "fundamental subject of common interest" and plan to launch a series of "country-specific discussions to explore the influence of national culture on the safety culture".

Our Perspective

As usual, the press release touts all the benefits that are going to flow from the new relationship.  We predict the flow will be at best a trickle based on what we’ve seen from the principals over the years.  Following is our take on the two entities.

WANO is an association of the world's nuclear power operators.  Their objective is to exchange safety knowledge and operating experience among its members.  We have mentioned WANO in several Safetymatters posts, including Jan. 23, 2015, Jan. 7, 2015, Jan. 21, 2014 and May 1, 2010.  Their public contributions are generally shallow and insipid.  WANO may be effective at facilitating information sharing but it has no real authority over operators.  It is, however, an overhead cost for the economically uncompetitive commercial nuclear industry. 

NEA is an intergovernmental agency that facilitates cooperation among countries with nuclear technology infrastructures.  In our March 3, 2016 post we characterized NEA as an “empty suit” that produces cheerleading and blather.  We stand by that assessment.  In Safetymatters’ history, we have come across only one example of NEA adding value—when they published a document that encouraged regulators to take a systems view of SC.  See our Feb. 10, 2016 post for details.

No one should expect this new arrangement to lead to any breakthroughs in SC theory or insights into SC practice.  It will lead to meetings, conferences, workshops and boondoggles.  One hopes it doesn’t indirectly raise the industry’s costs or, more importantly, distract WANO from its core mission of sharing safety information and operating experience across the international nuclear industry. 


*  “WANO, NEA enhance cooperation in nuclear safety,” World Nuclear News (Oct. 4, 2017).

Tuesday, September 26, 2017

“New” IAEA Nuclear Safety Culture Self-Assessment Methodology

IAEA report cover
The International Atomic Energy Agency (IAEA) touted its safety culture (SC) self-assessment methodology at the Regulatory Cooperation Forum held during the recent IAEA 61st General Conference.  Their press release* refers to the methodology as “new” but it’s not exactly fresh from the factory.  We assume the IAEA presentation was based on a publication titled “Performing Safety Culture Self-assessments,”** which was published in June 2016 and which we reviewed on Aug. 1, 2016.  We encourage you to read our full review; it is too lengthy to reasonably summarize in this post.  Suffice to say the publication includes some worthwhile SC information and descriptions of relevant SC assessment practices but it also exhibits some execrable shortcomings.


*  IAEA, “New IAEA Self-Assessment Methodology and Enhancing SMR Licensing Discussed at Regulatory Cooperation Forum” (Sept. 22, 2017).

**  IAEA, “Performing Safety Culture Self-assessments,” Safety Reports Series no. 83 (Vienna: IAEA, 2016).

Thursday, August 10, 2017

Nuclear Safety Culture: The Threat of Bureaucratization

We recently read Sidney Dekker’s 2014 paper* on the bureaucratization of safety in organizations.  It’s interesting because it describes a very common evolution of organizational practices, including those that affect safety, as an organization or industry becomes more complicated and formal over time.  Such evolution can affect many types of organizations, including nuclear ones.  Dekker’s paper is summarized below, followed by our perspective on it. 

The process of bureaucratization is straightforward; it involves hierarchy (creating additional layers of organizational structure), specialized roles focusing on “safety related” activities, and the application of rules for defining safety requirements and the programs to meet them.  In the safety space, the process has been driven by multiple factors, including legislation and regulation, contracting and the need for a uniform approach to managing large groups of organizations, and increased technological capabilities for collection and analysis of data.

In a nutshell, bureaucracy means greater control over the context and content of work by people who don’t actually have to perform it.  The risk is that as bureaucracy grows, technical expertise and operational experience may be held in less value.

This doesn’t mean bureaucracy is a bad thing.  In many environments, bureaucratization has led to visible benefits, primarily a reduction in harmful incidents.  But it can lead to unintended, negative consequences including:

  • Myopic focus on formal performance measures (often quantitative) and “numbers games” to achieve the metrics and, in some cases, earn financial bonuses,
  • An increasing inability to imagine, much less plan for, truly novel events because of the assumption that everything bad that might happen has already been considered in the PRA or the emergency plan.  (Of course, these analyses/documents are created by siloed specialists who may lack a complete understanding of how the socio-technical system works or what might actually be required in an emergency.  Fukushima anyone?),
  • Constraints on organizational members’ creativity and innovation, and a lack of freedom that can erode problem ownership, and
  • Interest, effort and investment in sustaining, growing and protecting the bureaucracy itself.

Our Perspective

We realize reading about bureaucracy is about as exciting as watching a frog get boiled.  However, Dekker does a good job of explaining how the process of bureaucratization takes root and grows and the benefits that can result.  He also spells out the shortcomings and unintended consequences that can accompany it.

The commercial nuclear world is not immune to this process.  Consider all the actors who have their fingers in the safety pot and realize how few of them are actually responsible for designing, maintaining or operating a plant.  Think about the NRC’s Reactor Oversight Process (ROP) and the licensees’ myopic focus on keeping a green scorecard.  Importantly, because the Safety Culture Policy Statement (SCPS) is an “expectation,” it resists the bureaucratic imperative to over-specify.  Instead, the SCPS is an adjustable cudgel the NRC uses to tap or bludgeon wayward licensees into compliance.  Foreign interest in regulating nuclear safety culture will almost certainly lead to its increased bureaucratization.  

Bureaucratization is clearly evident in the public nuclear sector (looking at you, Department of Energy) where contractors perform the work and government overseers attempt to steer the contractors toward meeting production goals and safety standards.  As Dekker points out, managing, monitoring and controlling operations across an organizational network of contractors and sub-contractors tends to be so difficult that bureaucratized accountability becomes the accepted means to do so.

We have presented Dekker’s work before, primarily his discussion of a “just culture” (reviewed Aug. 3, 2009) that tries to learn from mishaps rather than simply isolating and perhaps punishing the human actor(s) and “drift into failure” (reviewed Dec. 5, 2012) where a socio-technical system can experience unacceptable performance caused by systemic interactions while functioning normally.  Stakeholders can mistakenly believe the system is completely safe because no errors have occurred while in reality the system can be slipping toward an incident.  Both of these attributes should be considered in your mental model of how your organization operates.

Bottom line: This is an academic paper in a somewhat scholarly journal, in other words, not a quick and easy read.  But it’s worth a look to get a sense of how the tentacles of formality can wrap themselves around an organization.  In the worst case, they can stifle the capabilities the organization needs to successfully react to unexpected events and environmental changes.


*  S.W.A. Dekker, “The bureaucratization of safety,” Safety Science 70 (2014), pp. 348–357.  We saw this paper on Safety Differently, a website that publishes essays on safety.  Most of the site’s content appears related to industries with major industrial safety challenges, e.g., mining.

Thursday, July 27, 2017

Nuclear Safety Culture: Another Incident at Pilgrim: Tailgate Party

Pilgrim
The Cape Cod Times recently reported* on a security violation at the Pilgrim nuclear plant: one employee entering a secure area facilitated “tailgating” by a second employee who had forgotten his badge.  He didn’t want to go to Security to obtain clearance for entry because that would make him late for work.

The NRC determined the pair were deliberately taking a shortcut but were not attempting to do something malicious.  The NRC investigation also revealed that other personnel, including security, had utilized the same shortcut in the past to allow workers to exit the plant.  The result of the investigation was a Level IV violation for the plant.

Of course, the plant’s enemies are on this like a duck on a June bug, calling the incident alarming and further evidence for immediate shutdown of the plant.  Entergy, the plant’s owner, is characterized as indifferent to such activities. 

The article’s high point was reporting that the employee who buzzed in his fellow worker told investigators “he did not know he was not allowed to do that”.

Our Perspective 


The incident itself was a smallish deal, not a big one.  But it does score a twofer because it reflects on both safety culture and security culture.  Whichever category it goes in, the incident is a symptom of a poorly managed plant and a culture that has long tolerated shortcuts.  It is one more drop in the bucket as Pilgrim shuffles** toward the exit.

This case raises many questions: What kind of training, including refresher training, does staff receive about security procedures?  What kind of oversight, reminders, reinforcement and role modeling do they get from their supervisors and higher-level managers?  Why was the second employee reluctant to take the time to follow the correct procedure?  Would he have been disciplined, or even fired, for being late?  We would hope Pilgrim management doesn’t put everyone who forgets his badge in the stocks, or worse.

Bottom line: Feel bad for the people who have to work in the Pilgrim environment, be glad it’s not you or your workplace.


*  C. Legere, “NRC: Pilgrim workers ‘deliberately’ broke rules,” Cape Cod Times (July 24, 2017).  Retrieved July 26, 2017.

**  In this instance, “shuffle” has both its familiar meaning of “dragging one's feet” and a less-used definition of “avoid a responsibility or obligation.”  Google dictionary retrieved July 27, 2017.