Thinking Fast and Slow in Medicine: The Cognitive Basis of Errors and Tools for Prevention

by Joyce A. Wahr, MD, FAHA

When surveyed, most anesthesiologists (85%) acknowledge committing at least one medication error.1 Clearly, the vast majority of these errors are of little consequence, but some, such as the recent spate of ampule swaps of tranexamic acid (TXA) for bupivacaine, can be deadly.2 Often, the difference between “of little consequence” and “lethal” is pure luck: your syringe swap was vecuronium for neostigmine (a relatively common syringe swap) rather than vincristine for methotrexate, or heparin 10,000 units per mL for heparin flush.3 When such a syringe swap occurs and a patient is harmed, reviewers and even the clinician involved are often perplexed as to how such an error could have been made. The intent of this article is to discuss some of the known cognitive processes that can lead to such an error.

SYSTEM 1 VS. SYSTEM 2 THINKING

The science of cognition, the study of how we think, has a long history. The knowledge that humans think and act both unconsciously and consciously, and that these modes of thinking are linked to specific types of error, was described previously by James Reason,4 but a deeper understanding has come through the work of Amos Tversky and Daniel Kahneman over a collaboration of about 15 years beginning in 1970.5 This work on what Kahneman calls “bounded rationality” earned him the 2002 Nobel Prize in Economics, an award he would have shared with Tversky had the latter not died at a young age.6 In his summative book, Thinking, Fast and Slow, Kahneman delves deeply into what he terms System 1 and System 2 thinking.5 System 1 is the incredibly fast, unconscious, effortless, and automatic process by which humans perceive the ever-changing world around them, fit these perceptions into mental models, and then, again unconsciously and effortlessly, determine how to act. When driving home from work, for example, you are not conscious that your System 1 has recognized the gas station on the left and determined that a right turn is required to continue home.

System 1 quickly and effortlessly supplies the answer to 2 + 2 or 2 × 2 (a mental model exists), but System 1 cannot supply the answer to 27 × 14 (no prior mental model exists). That calculation requires System 2: an effortful, slow, deliberate, and conscious process that works through the principles of multiplication to reach the answer. Humans flit between these two systems of thinking throughout the day, always preferring to have System 1 perceiving and acting, but pulling in System 2 when System 1 has no mental model that fits the current situation. We are endlessly creating new System 1 mental models: every time we pick up a new hobby or learn a new skill (e.g., placing an arterial line), we begin with a System 2 process that effortfully lays out the steps. With repetition, the skill moves into what James Reason calls a schema, a mental construct of the sequence of tasks to be done to reach a goal.
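
To feel the System 2 effort involved, work the multiplication out the way System 2 must, one deliberate step at a time:

\[
27 \times 14 = 27 \times (10 + 4) = 270 + 108 = 378
\]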

HOW SYSTEM 1 THINKING LEADS TO ERROR

Humans strongly prefer to work in System 1 (effortless, unconscious, automatic), and this preference leads to errors. Evaluating an unusual presentation with System 2 requires effort; because humans are averse to effort, the subconscious mental model that quickly comes to mind is chosen instead. Characteristics of the current situation that do not fit the chosen mental model may be discarded or discounted. In this way, System 1 can surreptitiously override System 2. It was the recognition that humans make wrong choices even when the facts are known that initiated Kahneman and Tversky’s work. One famous example is this simple problem:

  • A ball and a bat together cost $1.10.
  • The bat costs $1.00 more than the ball.
  • What does the ball cost?

The answer that instantly and effortlessly comes to mind is that the ball costs 10 cents, even though a very simple calculation shows that the ball must cost 5 cents. Even when System 2 can easily and consciously do the math, System 1 chooses the easiest and “most available” answer. Another example of System 1 overriding System 2 is shown in Figures 1a and 1b. If you cover 1a, it is clear that the two horizontal lines are of equal length, but when you cover 1b, System 1 simply cannot accept that the two are of equal length.
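
The System 2 check takes one line of algebra. Let x be the cost of the ball in dollars, so the bat costs x + 1.00; then

\[
x + (x + 1.00) = 1.10 \quad\Rightarrow\quad 2x = 0.10 \quad\Rightarrow\quad x = 0.05,
\]

so the ball costs 5 cents and the bat $1.05. The intuitive 10-cent answer would make the bat $1.10 and the total $1.20.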

Figure 1A and B: Which horizontal line is longer? An example of System 1 overriding System 2 thinking.

These two concepts occupy only the first two chapters of Thinking, Fast and Slow; there are many other situations in which System 1 surreptitiously subverts our rational System 2. Cognitive biases abound in System 1 and mislead us frequently.6 These two examples, however, provide enough evidence to explain many of our errors.

COGNITIVE ERRORS AND MEDICATION SAFETY

The APSF Newsletter has described in detail the recent series of ampule and vial swaps in cesarean deliveries, in which an ampule of TXA is erroneously drawn up and injected into the cerebrospinal fluid.7 Most of us would believe that we would not make such an error, but a quick glance at the “look-alike” ampules and vials that were swapped should give us pause (Figure 2). The retina, optic nerve, and visual cortex may correctly read the ampule as tranexamic acid, but System 1 is running a mental schema of “spinal anesthesia,” so the ampule MUST be bupivacaine; that is what System 1 reports and acts on. Just as in Figures 1a and 1b, System 1 cannot NOT see what it expects to see based on the mental model being enacted.

Figure 2: An example of look-alike vials, courtesy of the APSF look-alike vial gallery. https://www.apsf.org/look-alike-drugs/.

What can we possibly do to avoid errors, given that System 1 is unconscious? The answer is simple: create a fail-safe process that System 1 cannot subvert. Provide TXA to the anesthesia professional in an infusion bag, never in an ampule.7 We have no mental model whereby we infuse the contents of an infusion bag into the cerebrospinal fluid. A further step would be to have the pharmacy supply bupivacaine only in prefilled NRFit syringes, which can couple only with an NRFit needle. Other fail-safe interventions include barcode medication administration, which employs both visual and audible presentation of the medication; using two senses provides two chances to catch an error. A less expensive but still effective approach is to make the circulating nurse the only person authorized to pull TXA from the dispensing cabinet, and to prohibit supplying the TXA until after the spinal or epidural is completed.
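
For readers curious what such a fail-safe looks like in software, the sketch below illustrates, in Python, the kind of hard-stop logic a barcode medication administration system can apply at the moment of scanning. It is a minimal conceptual sketch, not any vendor’s implementation; the barcode values, formulary entries, and function names are all hypothetical.

    # Minimal sketch of barcode medication administration (BCMA) hard-stop logic.
    # All barcode values, drug records, and names here are hypothetical.
    FORMULARY = {
        "BUPI-075": {"name": "bupivacaine 0.75%", "intrathecal_ok": True},
        "TXA-1000": {"name": "tranexamic acid 1000 mg/10 mL", "intrathecal_ok": False},
    }

    def verify_scan(barcode: str, intended_route: str) -> bool:
        """Return True only if the scanned drug is safe for the intended route."""
        drug = FORMULARY.get(barcode)
        if drug is None:
            print("UNRECOGNIZED BARCODE: do not administer.")
            return False
        # Visual presentation of the medication name on screen...
        print(f"Scanned: {drug['name']}")
        # ...and audible presentation (stubbed as text here): two senses,
        # two chances for System 2 to catch what System 1 glossed over.
        print(f"(spoken aloud) {drug['name']}")
        if intended_route == "intrathecal" and not drug["intrathecal_ok"]:
            print("HARD STOP: this medication must never be given intrathecally.")
            return False
        return True

    # A TXA ampule scanned during a spinal anesthetic triggers the hard stop.
    verify_scan("TXA-1000", intended_route="intrathecal")

The essential design point is that the block does not depend on the clinician’s vigilance; the refusal is built into the process itself.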

Unfortunately, most forcing functions or fail-safe processes cost more and are much harder to implement than an exhortation to “try harder” (Figure 3). In addition, as anesthesia professionals, we often believe that we are each “better than average,” that we do not need prefilled syringes, pharmacy-supplied medications, or barcode medication administration systems in the OR. If we could truly “be careful,” i.e., use System 2 to monitor our actions at every step of the subconscious schema, perhaps we could be error free. But System 2 is effortful. If one is on a hike and is then asked to supply the answer to 27 × 14, one simply stops hiking, because we have a limited reservoir of effort; physical, emotional, and mental efforts all draw from the same reserve. One simply cannot continually expend the mental effort to use System 2 for every task. Fortunately, most fail-safe processes or forcing functions to reduce medication errors, while costing something, are not prohibitively expensive. Human factors engineers and medication safety experts have told us for many years that interventions that rely solely on human effort are ineffective.

We as a profession must accept that we are not infallible: System 1 is the elephant and System 2 is the rider, and mere effort will not keep the elephant on the right path. We need to demand that our hospitals provide us with tools that go well beyond “try harder.”

Joyce Wahr, MD, is professor emeritus at the University of Minnesota Medical School, Minneapolis, MN.


Joyce Wahr, MD, receives royalties from publication of her book, Medication Safety in Anesthesia and the Perioperative Period.


REFERENCES

  1. Orser BA, Chen RJ, Yee DA. Medication errors in anesthetic practice: a survey of 687 practitioners. Can J Anaesth. 2001;48:139–146. PMID: 11220422.
  2. Veisi F, Salimi S, Mohseni G, et al. Accidental intrathecal injection of tranexamic acid in cesarean section: a fatal medication error. Case report. APSF Newsletter. 2010;25:9. https://www.apsf.org/article/accidental-intrathecal-injection-of-tranexamic-acid-in-cesarean-section-a-fatal-medication-error/ Accessed March 28, 2025.
  3. Arimura J, Poole RL, Jeng M, et al. Neonatal heparin overdose—a multidisciplinary team approach to medication error prevention. J Pediatr Pharmacol Ther. 2008;13:96–98. PMID: 23055872.
  4. Reason J. Human Error. Cambridge University Press; 1990. https://doi.org/10.1017/CBO9781139062367.
  5. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011:499.
  6. Stiegler MP, Tung A. Cognitive processes in anesthesiology decision making. Anesthesiology. 2014;120:204–217. PMID: 24212195.
  7. Lefebvre PA, Meyer P, Lindsey A, et al. Unraveling a recurrent wrong drug-wrong route error—tranexamic acid in place of bupivacaine: a multistakeholder approach to addressing this important patient safety issue. APSF Newsletter. 2024;39:37–41. https://www.apsf.org/article/unraveling-a-recurrent-wrong-drug-wrong-route-error-tranexamic-acid-in-place-of-bupivacaine/ Accessed March 23, 2025.

Comments

  1. YP says:

    In an emergency I once gave 500 mL of heparin instead of 500 mL of Hespan. Heparin was stocked in the Pyxis drawer in error, and the barcode scanner was disabled because it was the OR Pyxis. Obviously I felt horrible.

  2. Jayesh says:

    I am surprised it’s only 85%. It should be 100% if you have had a long enough career!
