When Things Go Wrong in Healthcare

We Punish the Hands Holding the Swiss Cheese and Ignore the Factory That Makes It Full of Holes

Healthcare, for all its technology, is primitive in one key area: it cannot see its own systems. We measure people in detail but see systems only in outline. We have sharp tools for judging individual performance but almost nothing to monitor the health of the system itself.

This isn’t a small oversight; it’s a fundamental design flaw. It means that when harm occurs, we instinctively blame the person closest to the event, while the systemic failures that made the error inevitable remain hidden. This is best understood through the lens of the Swiss Cheese Model, which shows how disasters happen when holes in successive layers of defence line up.

In healthcare, however, we are conditioned to see only the final person holding the last slice of cheese. This article explains how this happens, why our focus is so dangerously narrow, and what it would take to finally see the whole system.

The Anatomy of a System Failure

Imagine a fatal medication error. A nurse is suspended after administering the wrong dose. The investigation concludes with “human error,” and the case is closed. This provides a sense of resolution, but it ignores the real story.

The story our system fails to record is that the high-concentration vial was stored next to the standard one (a latent error), the electronic warning was overridden due to alert fatigue (a design flaw), and the nurse was covering two units due to short staffing (an operational pressure). These are the holes in the Swiss cheese. The nurse didn’t cause the failure; they were the final person to inherit a chain of system-level defects.

The Loop of Blame

This is the loop of blame. Because we look for individual failings, we find them. Each investigation that ends by blaming a person strengthens the belief that error is a personal, moral failing. Hindsight bias makes it worse, turning a decision made under pressure and with incomplete information into an obvious mistake. We retrain the surgeon after a wrong-site operation but leave the rushed, interruption-prone “time-out” process unchanged, ready to trap the next person.

Our system is wired with blinded sensors. It can detect the pharmacist or the nurse, but it is blind to the holes they are forced to work around: confusing packaging, similar drug names, and missing decision support. We punish the hands holding the cheese and ignore the factory that makes it full of holes.

A System Designed to Blame

This systemic blindness isn’t accidental; it’s embedded in our culture, structures, and tools.

  • Narrative Bias: Simple stories with villains are easier to tell and understand than complex stories of systemic failure. “The doctor forgot” is a simpler narrative than “The diagnostic pathway is fragmented across three different IT systems.”
  • Measurement Bias: It’s easy to count an individual’s errors. It’s far harder to measure team resilience, the usability of an electronic record, or the psychological safety of a ward. We manage what we measure, and we only measure people.
  • Accountability Structures: Our systems are built around individual responsibility.
    • The name above the bed assigns responsibility to a single consultant, even if care is delivered by a sprawling, often disconnected team.
    • Appraisals and job plans focus on personal activities and portfolios, not on the health of the teams or processes that the individual works within.
    • When a poor referral pathway leads to a delayed diagnosis, there is no clear owner to hold accountable, so we default to blaming the last clinician who saw the patient.

These forces relentlessly pull our attention toward the individual, pushing the system and its flaws into the background.

Developing System Sight

To stop blaming individuals, we must learn to see our systems. This requires a deliberate shift in focus, from punishing people to improving the conditions they work in. This isn’t about removing accountability; it’s about redefining it.

Embrace a Just Culture

First, we must adopt a Just Culture — a framework for differentiating between honest human error (which requires system redesign), at-risk behaviour driven by broken processes (which requires coaching), and genuine recklessness (which requires accountability). This gives us a way to be both compassionate and responsible, ensuring we only punish a conscious disregard for safety, not the predictable errors our own systems create.

A New Dashboard for Safety

Second, we need to track the vital signs of the organisation itself. The following are a few examples of metrics, among the many other system-level measures available in healthcare, that can serve as early warning indicators of a system under stress, allowing us to diagnose problems before they cause harm:

  • System Stress & Staff Wellbeing: Regular, anonymous measures of burnout and fatigue are essential. A burned-out workforce isn’t a collection of personal failings; it’s a symptom of an overloaded system.
  • Improvised Routines and Variations in Practice: When staff bypass a procedure, often out of necessity, it may signal a broken process. Workarounds should be logged and analysed, not ignored—and certainly not criticised by default. We must also check how often critical safety processes, such as surgical time-outs or patient handovers, are carried out as designed. This reveals the gap between “work as imagined” in protocols and “work as done” on the front line. The task is then to identify and address the underlying system-level causes of this difference.
  • Psychological Safety: Teams must feel safe to speak up and report errors without fear of punishment. Without this, data on safety is incomplete, because staff will only share good news.
  • Rapid and Accurate Differentiation of Error: We must be able to distinguish quickly and reliably between system-caused errors (the majority) and deliberate harm (a very small minority). Low specificity in this process leads to unnecessary and damaging investigations, adding further psychological harm for both patients and staff.

Conclusion

Making the system visible is a duty of leadership. Boards and senior executives must review system health with the same seriousness they apply to finance. Their first question should not be, “Who was at fault?” but, “How did our system fail this patient and our staff?”

We have built a powerful machine for finding fault in people but remain blind to the systemic faults that produce that “human error.” This guarantees we will continue to find a final person to blame—the nurse who gave the drug, the surgeon who operated on the wrong site, the doctor who missed the critical detail. These people are not the cause of harm but its inheritors.

Punishing them does nothing to fix the chain of failure. It only ensures it will form again for the next patient, with a different person at its end. We see our people in fine detail, but our systems only in outline. Until we bring our systems into sharp focus, the next tragedy is not a matter of if, but of who.

From Blame to Systems Thinking: Tackling the Fundamental Attribution Error

Core Concept

The Fundamental Attribution Error (FAE) is a common but often overlooked cognitive bias with significant implications in surgical settings. It arises when we attribute colleagues’ errors or behaviours to personal traits or competence (e.g. “the surgeon is careless” or “the anaesthetist is incompetent”) while attributing our own errors to external circumstances (e.g. “The list was delayed because the first case was a complex obese patient and I lacked adequate assistance” or “I didn’t write in the notes because I was too busy with other patients”). Nowhere is this more evident than in poorly run morbidity and mortality (M&M) discussions, where the clinician under review often highlights external factors, while others focus on individual human errors.

Why It Happens in Surgery

The key issue is a failure to recognise that most imperfections or errors stem from the systems that govern the environment, rather than from inherent abilities or personal traits. In the operating theatre, pressures such as time constraints, hierarchy, ad-hoc teams, and limited awareness of colleagues’ workload, fatigue, or competing demands provide fertile ground for this bias.

Impact in the Clinical Environment

Unchecked, the FAE can directly undermine patient safety and team dynamics by:

  • Creating a blame culture: Focusing on individual fault instead of system flaws deters openness, learning, and improvement.
  • Damaging team cohesion: Even thinking of colleagues as “slow,” “inattentive,” “unreliable,” or “incompetent” undermines trust and erodes psychological safety.

Strategies to Mitigate FAE

  1. Assume good intent – begin with the presumption that colleagues are competent and working in good faith.
  2. Ask yourself structured, system-based questions, such as:
    • People: Was the team appropriately staffed and rested? Were roles clear?
    • Tasks: Were protocols, checklists, or guidelines available and followed?
    • Environment: Was equipment functional and accessible? Were there distractions, interruptions, or time pressures?
    • Communication: Was the handover complete? Was information shared clearly and consistently?
    • Organisation: Were scheduling, workload distribution, and support systems adequate?
  3. Share context – structured communication in regular face-to-face team meetings helps share context and reduce assumptions. Safe informal spaces, such as staff rooms where clinical teams can meet regularly over coffee, also foster better understanding among colleagues.
  4. Debrief with empathy – essential to an effective M&M process. Focus on what happened rather than who failed, ensuring all perspectives are heard.

Key Takeaway

When reflecting on our own shortcomings, the responsibility is twofold: to acknowledge our part honestly and to identify the system issues that contributed, so they can be addressed. Equally, when considering the errors of others, we have a duty to extend the same systems-based lens rather than defaulting to personal blame. By holding ourselves and our colleagues to this standard, we move from a culture of excuses or blame to one of shared learning and collective responsibility.

Embedding Safety in Action: Why Surgeons Should Embrace ‘Pointing and Calling’ in the Operating Theatre

In high-risk environments like the operating theatre, situational awareness and anticipatory thinking are essential. While standard safety protocols such as checklists and briefings are now routine, some surgeons have adopted an additional layer of safety: verbalising anticipated errors at key stages of surgery. This proactive technique, though informal and inconsistently adopted, has been observed to sharpen team focus, reduce the likelihood of mishaps, and promote collective awareness during complex procedures.

It is therefore affirming to note that this very practice—known in Japan as Pointing and Calling (Shisa Kanko)—is already embedded and widely employed in high-stakes industries such as rail and aviation, where it has demonstrated striking effectiveness in reducing human error (https://youtu.be/RZun7IvqMvE?si=3TbWsamePGl9IFza).

What is Pointing and Calling?

Pointing and Calling involves physically pointing at a critical object or parameter and simultaneously verbalising its name or state before taking action. For example, a Japanese train driver will point at the speedometer and say the speed aloud before confirming a control input. This ‘ritualistic confirmation’ creates a multisensory engagement—visual, auditory, and motor—that improves attention and reduces oversights.

Evidence of Impact

Research from Japan’s Railway Technical Research Institute shows that Pointing and Calling can reduce errors by up to 85% in repetitive task settings. Additional studies confirm its utility even when tasks are routine or unchanged, due to its role in reinforcing attentional control and reducing cognitive slips. Notably, despite the extra physical actions involved, users do not report an increased sense of workload.

Parallels in Surgery

In surgery, a comparable practice already exists among some surgeons: deliberately verbalising the potential pitfalls or risks during distinct procedural phases. For example, a surgeon may say aloud, “This is the common bile duct—risk of injury here,” or, “Risk of bleeding as we divide this plane.” These utterances serve not only as personal prompts but also cue the entire team to heighten vigilance and anticipate the next steps.

Such verbalisation operates in the same cognitive space as Pointing and Calling—bringing intention and risk into the open, using the body and voice to anchor thought in action. It externalises awareness, reducing dependence on silent internal memory and instead engaging the collective focus of the operating team.

Why Broader Use Matters

Given the mounting complexity of modern surgery—minimally invasive techniques, robotic platforms, multidisciplinary teams—anything that enhances real-time safety awareness is worth pursuing. Pointing and Calling offers a structured, embodied tool for precisely that. Its broader use in surgical theatres could standardise the informal yet effective habits already in place among some surgeons.

The challenge may lie in cultural barriers: fear of seeming overly cautious, reluctance to appear theatrical, or simply unfamiliarity with the technique. Yet these can be addressed through training, modelling, and integration into existing safety culture.

Conclusion

Verbalising potential surgical mishaps is already practised with benefit by some surgeons. Recognising its alignment with the well-established method of Pointing and Calling in other high-stakes domains reinforces its value. Far from being superfluous, this embodied rehearsal of intention and caution should be viewed as a cognitive safety net—one we would do well to adopt more widely in the operating theatre.

Acute Limb Ischaemia

https://www.podbean.com/media/share/pb-wchij-aaf247

In this episode, Mr Afshin Alijani will be asking Mr Stuart Suttie, consultant vascular surgeon in Dundee, to describe the management of an acutely ischaemic limb. The topics covered include how to diagnose acute limb ischaemia from the history and examination, how to differentiate a salvageable from a non-salvageable ischaemic limb, and the technique of arterial embolectomy.

Abdominal Wall Closure

https://www.podbean.com/media/share/pb-dxb9a-aa14ac

The lead article of the British Journal of Surgery February 2019 issue was on Abdominal Wall Closure. The author subsequently participated in a lively debate on the #bjsconnect Twitter chat on 27.02.19. This podcast is a summary of that debate, with a number of additions and expansions, including Jenkins’ rule of wound closure and an appraisal of some of the current evidence. Text and key references.

Bile Duct Injury

https://www.podbean.com/media/share/pb-bmk6v-a8a1d1

In this episode, Mr Afshin Alijani, a consultant surgeon at Ninewells Hospital, Dundee, will be talking about bile duct injury. He will describe the two main root causes of severe injuries, namely disorientation and confirmation bias. Four surgical strategies are then offered to avoid such errors. The talk concludes with a description of what to do intra-operatively in case of an injury to the bile duct.

Chest Trauma – Clamshell Incision, a Step-by-Step Description of the Technique

Introduction

The aim of this post is to give an account of the steps involved in performing a resuscitative clamshell thoracotomy for chest trauma. I will highlight some of the relevant technical pitfalls, as well as a few non-technical ones. Please refer to my previous post on the indications for a clamshell incision.

It is exceptional for any one civilian general surgeon, working outwith a large trauma centre, to perform more than a handful of resuscitative trauma thoracotomies in the course of her/his career. As a result, this limited personal experience needs to be supplemented by simulation training on anaesthetised animals and/or human cadavers.

The less experienced surgeon has a significant psychological barrier to overcome before deciding to perform a resuscitative thoracotomy. The surgeon’s reluctance to act, and the inevitable delay in decision making that follows, can be further reinforced by cognitive biases such as “diagnostic momentum”. In diagnostic momentum, the surgeon continues a clinical course of action already instigated by the team without considering a change of plan, particularly when the original plan was commenced by a senior clinician. Diagnostic momentum is a potential trap for the unwary junior member of any multidisciplinary team.

It is worth remembering that the clamshell incision is only a means to an end, and the main focus should be on the surgical manoeuvres potentially needed for damage control following gaining access. Resuscitative thoracotomy should ideally be done in a well-equipped environment and only by those who have the necessary experience in dealing with the range of injuries commonly encountered.

I hope it is clear from the above that the challenges of performing a resuscitative thoracotomy include deciding when to operate, possessing the technical skills to perform the thoracotomy incision, and having the surgical experience to know what to do once in the chest.

Clamshell incision

  1. If the patient is stable enough, transfer them to the operating theatre. This will delay the procedure, but the trade-off is a far superior surgical environment. If the patient is unstable, proceed to an emergency room thoracotomy.
  2. Use an antiseptic solution to rapidly sterilise the anterior and lateral thoracic wall.
  3. Wear sterile gloves, eye protection and a gown.
  4. It is essential to remain calm; give clear instructions.
  5. Think one step ahead at all times and verbalise your thoughts for the team.
  6. Don’t injure yourself or others!
  7. Work in a well-lit area.
  8. Privacy from the gaze of the other patients/relatives is essential.
  9. The anaesthetist should intubate the patient with an endotracheal tube. Position both arms above the patient’s head; this helps to maximise the intercostal spaces.
  10. Stand to the patient’s left side.
  11. Having an assistant is a luxury.
  12. Use any large (No. 21) blade.
  13. Diathermy has no role here.
  14. Feel for the xiphoid process and locate the inframammary folds (in females you may have to lift the breasts up).
  15. Make an incision along the left 4th or 5th intercostal space over the inferior breast crease, starting an inch lateral to the edge of the sternum and finishing at the mid axillary line. The incision should be well above the costal angle and above the xiphisternum.
  16. Make confident, deep, and long strokes with the scalpel. Avoid making light, multiple strokes; it slows you down and makes the wound bleed more.
  17. Ignore any bleeding points from the skin, fat, serratus anterior and intercostal muscles.
  18. Use heavy Mayo-type scissors to complete the division of the intercostal muscles and pleura.
  19. Use a Finochietto rib spreader to open the intercostal gap. The spreader should be placed at the most inferolateral aspect of the incision so that it does not obscure access. This completes the left anterolateral thoracotomy.
  20. If required, extend this left-sided incision onto the right side by making a symmetrical incision on the right chest (clamshell).
  21. Sternotomy is best done after completion of both left and right anterolateral thoracotomies. Divide the sternum using a Lebsche knife and a mallet. You can use a Gigli saw, a bone cutter, plaster cast scissors or even your hands to crack open the sternum if you have to. You need to be making rapid progress.
  22. The internal mammary vessels may not bleed due to hypotension. However, these will bleed postoperatively if not secured. Identify them carefully and tie them securely during the thoracotomy closure.
  23. Lift the sternum and run your fingers gently along the fibrous plane between the posterior surface of the sternum and the heart. The sternum lifts up, like opening a car bonnet.
  24. A second Finochietto may be applied to the right thoracotomy if available. This completes the clamshell thoracotomy.