
Human Factors Consultant Rebuts PCA Pump Danger Charges

Anthony D. Andre, PhD

In My Opinion:

In Response…

I have conducted human factors research across a variety of domains for the past 11 years, and for the past 7 years, as the founding principal of a human factors consulting firm, I have been involved in the research and design of many consumer products, including more than 10 different medical devices. Like my friend and colleague, Dr. Vicente, I have devoted most of my career to the observation, study and design of human-machine interfaces in an attempt to improve performance, safety and ease of use.

After reviewing the recent literature on the PCA pump1,2,3, culminating in Dr. Vicente's letter4, I felt compelled to point out the flawed logic behind the proposed explanation of, and associated solution to, the reported accidents and incidents, as well as the lack of a true human factors systems approach to reducing programming errors. At the request of Abbott Labs, I am making my opinions on this matter public in the form of this letter.

Proposed Fix Flawed

While I disagree with Dr. Vicente’s approach and suggested solution to the documented accidents involving the PCA pump, I do agree with the majority of his premises about the true nature of human-machine interaction. These are that:

1) “there is no simple correspondence between overt behavior and mental processes.”

2) “the fact that someone presses a button does not ensure anything about what they have thought about or looked at.”

3) “well-intentioned, well-educated and attentive individuals inevitably sometimes make mistakes.”

In fact, it is because of these same established principles of human-machine interaction that I am convinced that the proposed design “fix” is an inappropriate explanation of past errors and an ineffective solution to future ones. To understand this argument, recall that Vicente provides an example where a nurse “mistakenly accepts the initially displayed drug concentration of 0.1 mg/ml instead of changing the concentration setting to a correct value” (such as 1.0). In the context of the proposed interface design solution1,2,4, let’s assume that the unit initially presents a high concentration value of 10.0 (or a value of 0.0), ensuring that the initial value never corresponds to a vial concentration that might be considered for this application and therefore must be changed. The user must next press the “down” button several times (or hold it down) until the correct value is reached.

Yet, for the same reasons that Dr. Vicente argues it is understandable for a nurse to mistakenly accept the initially displayed drug concentration instead of changing it to the correct value, it is equally likely that this same nurse might not enter the correct value when required to do so (e.g., selecting 0.5 instead of 5.0). Indeed, the ECRI report1 details two cases of PCA programming errors where a “clinician entered a drug concentration that was a factor of 10 lower than the actual concentration of the administered drug.” In other words, the likely cause of the reported incidents is not a blind acceptance of the initially presented concentration value, but rather that the operator transposed the numbers, a common error in the medical domain.5 Given this scenario, and the aforementioned assumptions about human-machine interaction, one can only conclude that errors will continue to occur, since the Lifecare 4100 (and other systems like it) ultimately requires the user to manually input a concentration value that exactly matches the concentration of the vial being placed in the device.
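To make the arithmetic of such an entry slip concrete, consider a minimal sketch in Python. The function name and values are my own, chosen purely for illustration; it assumes only that a PCA pump converts a prescribed dose into a pumped volume by dividing by the programmed concentration, so the drug actually delivered scales with the ratio of the vial's true concentration to the programmed one:

# Illustrative sketch (hypothetical): how a concentration-entry slip
# scales the delivered drug, regardless of the initially displayed value.
# Assumes pumped volume = prescribed dose / programmed concentration.

def delivered_drug_mg(dose_mg, programmed_conc_mg_per_ml, actual_conc_mg_per_ml):
    """Drug actually delivered when the pump is programmed with one
    concentration but the vial contains another."""
    pumped_volume_ml = dose_mg / programmed_conc_mg_per_ml
    return pumped_volume_ml * actual_conc_mg_per_ml

# A nurse intends a 1 mg dose from a 5.0 mg/ml vial but keys in 0.5 mg/ml,
# the factor-of-10 slip described in the ECRI report.
print(delivered_drug_mg(1.0, 0.5, 5.0))   # 10.0 mg delivered: a tenfold overdose

The point of the sketch is simply that the overdose factor is the ratio of actual to programmed concentration; no choice of starting value on the display can remove it.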

Clearly, then, to ask the manufacturer to design away this complicated error is to request an almost impossible task. Simply stated, the PCA device requires the user to correctly enter the exact concentration of the drug they have chosen to administer to the patient, regardless of the initial value presented by the device (whether 0, 0.1 mg, 5 mg, 10 mg or any other number). And, given the premises of Dr. Vicente, that “the fact that someone presses a button does not ensure anything about what they have thought about or looked at,” it is inevitable that errors (both over- and under-doses) in programming the correct concentration can and will still occur. In fact, a recent study of infusion pump interfaces found that incorrect numerical values were occasionally entered by the test users. The researchers concluded that “this is a serious mistake which can not be solved by the design of the interface.” (6, p. 131).

I argue that Vicente’s approach, and therefore the solution advanced by him and his colleagues, is flawed in two fundamental respects. First, they have taken a specific scenario, changed the interface, and then assumed that the user would carry out the exact same actions with different, safer results. But they neglected to consider that when any element of an interface is changed, it is often difficult, if not impossible, to foresee the results of even quite minor changes, because changes tend to propagate their effects throughout the system and to cause unforeseen interactions7. I have already demonstrated that changing the initially presented concentration value will not necessarily lower error rates or their consequences. But more critical to discuss is the proposed solution in the context of the commonly used default configuration of the device, where it is likely to produce a much larger number of programming errors.

Recall that the default configuration of the device, as shipped from the manufacturer, presents a series of four drug/concentration settings to the user, followed by an option to input any specific concentration value. The suggestion is that the order of drug concentration values initially presented to the user should be reversed, so that, for example, the higher concentration value (5 mg morphine) is presented before the lower concentration value (1 mg morphine). This would be in response to overdose cases where the user erroneously accepted the initially presented value while actually using a higher concentration. If we consider only the (relatively) few cases where this error, or accident, occurred, then this design change appears to be an effective solution.

However, what has not been equally considered is the much larger number of cases where the nurse does NOT misprogram the device. It is a statistical fact that the overwhelming majority of the time (nearly every time), the nurse who is holding a 5 mg concentration of morphine correctly declines the first setting (1 mg) and accepts the second setting (5 mg). Now imagine that, in response to Vicente and his colleagues, the order of presentation is suddenly changed so that the high concentration (5 mg) is presented before the low concentration (1 mg). What will be the result? A condition we call “negative transfer,” in which a well-practiced, automatic, previously appropriate response now produces an inappropriate one.8 Stated directly, instead of eliminating the rare (and sometimes tragic) error that results from users mistakenly accepting the initially presented (low) value in the present PCA design, the proposed design will actually cause a large increase in errors that result from users exercising an over-learned, previously appropriate response: accepting the second (now low) value.

It is interesting to note that Lin9 recently reported that some errors found with both the existing and a newly designed PCA interface were attributed to the prior experience of PCA users. Further, in their earlier study of PCA interfaces, Lin et al.10 argued that subjects “who have been exposed to the old interface acquire behaviors that do not allow them to fully exploit the benefits of the new interface” (p. 740). Thus, while I agree with Vicente that the documented programming errors were both “counterintuitive” and “unforeseen,” I believe that the errors likely to result from this proposed design solution are both “intuitive” and “foreseeable.”
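The negative-transfer scenario can also be illustrated with a small sketch. The option lists and the one-line habit model below are hypothetical simplifications of my own; the sketch merely models a nurse whose over-learned routine is “decline the first option, accept the second,” and shows what that same routine selects once the presentation order is reversed:

# Illustrative sketch (hypothetical): negative transfer under a
# reversed option order. Option lists and habit model are simplifications.

old_order = [1.0, 5.0]   # mg/ml: low concentration presented first (current design)
new_order = [5.0, 1.0]   # mg/ml: high concentration presented first (proposed design)

def habituated_choice(options):
    """An over-learned routine: decline the first option, accept the second."""
    return options[1]

vial_concentration = 5.0  # the nurse is holding 5 mg/ml morphine

print(habituated_choice(old_order) == vial_concentration)  # True: the habit is correct
print(habituated_choice(new_order) == vial_concentration)  # False: the same habit now selects 1.0 mg/ml

Under the reversed order, the very habit that made the nurse reliable now programs a concentration five times lower than the vial’s, which, by the arithmetic sketched earlier, yields a fivefold overdose.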

The second fundamental flaw in Vicente’s approach is that he has focused on the product design as the sole cure for programming errors, which, ironically, violates the well-accepted systems approach to human factors engineering11. As Moray11 points out:

“to concentrate too closely on the details of a particular error often leads to locking the stable door after the horse is stolen. To do so ensures that if there is another similar horse and a similar thief, we will be able to prevent the loss of the second horse. But in fact, in large, complex systems such as health-care delivery, there is an infinite number of horses and an infinite number of thieves. When any horse goes missing, we should consider not merely locking doors, but rebuilding stables, retraining personnel—or even keeping animals other than horses. The systems point of view emphasizes versatility in searching for solutions and shows that focusing on ever-tighter local constraints will simply leave the system increasingly vulnerable to unforeseen events.” (pp. 89-90)

The Need for a Systems Approach

Leape, in his article on error in medicine12, states that “the most fundamental change that will be needed if hospitals are to make meaningful progress in error reduction is a cultural one” (p. 1857). My career in the field of human factors has been centered on designing user interfaces that increase safety, efficiency and ease of use, and I am sure that the Lifecare 4100, like almost all other medical devices, can benefit from an improved user interface. Yet I feel that the product-focused approach taken by Vicente and his colleagues4,9,10 might take the medical community a step backwards, moving it further away from making the system-wide changes needed to minimize PCA programming errors and their consequences. A systems approach has been widely advocated throughout the human factors11 and medical communities, as evidenced by a recent conference emphasizing a multidisciplinary approach to errors in health care14. It is also worthwhile to note that only one of the six recommendations of the ECRI hazard report relates to the design of the user interface.1

We in the human factors community have long lobbied against the practice of assuming that errors are the fault of the human who commits them. Vicente, however, has now positioned himself at the other end of the continuum, assuming that the fault lies solely with the manufacturer of the equipment. It is my hope that researchers, medical professionals, manufacturers and designers will soon work together, in the context of a multi-dimensional, systems design approach11, to effectively minimize all sources of error in the use of PCA and other infusion pumps.

Dr. Andre is the Founding Principal of Interface Analysis Associates, a human factors and ergonomics consulting firm in San Jose, CA. He is also an Adjunct Professor of Human Factors and Ergonomics, San Jose State University, San Jose, CA.

References

1. ECRI. Abbott PCA Plus II patient-controlled analgesia pumps prone to misprogramming resulting in narcotic overinfusions. Health Devices 1997; 26(10): 389-391.

2. Institute for Safe Medication Practices. Evidence builds: lack of focus on human factors allows error-prone devices. ISMP Medication Safety Alert 1999; 4(15).

3. St. John P. Drug pump’s deadly trail. Tallahassee Democrat, Sunday, May 28, 2000.

4. Vicente KJ. Human factors researcher alarmed by deaths during PCA. APSF Newsletter 2000; 15: 36.

5. Senders JW. Medical devices, medical errors, and medical accidents. In M.S. Bogner (ed.), Human error in medicine, Lawrence Erlbaum Associates, 1994: 159-178.

6. Garmer K, Liljegren E, Osvalder AL, Dahlman S. Usability evaluation of a new user interface for an infusion pump with a human factors approach. In Proceedings of the IEA 2000/HFES 2000 Congress 2000; 1: 128-131.

7. Ashby WR. An introduction to cybernetics. London: Chapman and Hall, 1956.

8. Wickens CD. Engineering psychology and human performance. New York: Harper Collins, 1992.

9. Lin L. Human error in patient-controlled analgesia: Incident reports and experimental evaluation. In Proceedings of the HFES 42nd Annual Meeting 1998: 1043-1047.

10. Lin L, Isla R, Doniz K, Harkness H, Vicente K, Doyle DJ. Analysis, redesign and evaluation of a patient-controlled analgesia machine interface. In Proceedings of the HFES 39th Annual Meeting 1995: 738-741.

11. Moray NP. Error reduction as a systems problem. In M.S. Bogner (ed.), Human error in medicine, Lawrence Erlbaum Associates, 1994: 67-91.

12. Leape LL. Error in medicine. JAMA 1994; 272: 1851-1857.

13. No mistake about it: Speaking up saves lives. Nurseweek, January 1997. Available at URL: www.nurse-week.com/features/97-1/error2.html