Author: Dr. Dave Hill
Thinking about a recent M&M case and the roles that bias plays in how patient care is delivered, I don’t think I could say it better than the Drs. at EM Ottawa (emottawablog.com).
“Clinical decision making is an extremely complex process, and healthcare professionals often develop adaptive mechanisms [heuristics or biases] as we are faced with repeated similar experiences in a busy clinical environment.”
They go on to say that,
“one of the best ways we can combat these decision-making errors is to first explicitly be made aware of these biases.”
These biases are exquisitely laid out by Drs. Campbell, Croskerry and Bond in their 2007 publication, Profiles in Patient Safety: A "Perfect Storm" in the Emergency Department (Table 1 below). Reviewing these may help us apply these heuristics more appropriately to improve patient care and throughput, while preventing us from falling prey to the errors in thinking to which they can lead.
Table 1: Classification Scheme for Cognitive Dispositions to Respond (CDRs)
Errors of over-attachment to a particular diagnosis
- Anchoring: the tendency to perceptually lock on to salient features in the patient’s initial presentation too early in the diagnostic process and to fail to adjust this initial impression in light of later information. This CDR may be severely compounded by confirmation bias.
- Confirmation bias: the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter being more persuasive and definitive.
- Premature closure: a powerful CDR accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision-making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim: “when a diagnosis is made, the thinking stops.”
Errors due to failure to consider alternative diagnoses
- Multiple alternative bias: a multiplicity of options on a differential diagnosis might lead to significant conflict and uncertainty. The process might be simplified by reverting to a smaller subset with which the physician is familiar, but this might result in inadequate consideration of other possibilities. One such strategy is the three-diagnosis differential: “it is probably A, but it might be B, or I don’t know (C).” Although this approach has some heuristic value, if the disease falls in the C category and is not pursued adequately, it minimizes the chance that serious diagnoses are made.
- Representativeness bias: drives the diagnostician toward looking for prototypical manifestations of disease: “if it looks like a duck, walks like a duck, quacks like a duck, then it is a duck.” Yet restraining decision making along these pattern-recognition lines leads to atypical variants being missed.
- Search satisficing: reflects the universal tendency to call off a search once something is found. Co-morbidities, second foreign bodies, other fractures, and co-ingestants in poisoning may all be missed.
Errors due to inheriting someone else’s thinking
- Diagnostic momentum: once diagnostic labels are attached to patients, they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians) what might have started as a possibility gathers increasing momentum until it becomes definite, and other possibilities are excluded.
- Framing effect: how diagnosticians see things might be strongly influenced by the way in which the problem is framed, e.g. physicians’ perceptions of risk to the patient may be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient might die or might live. In terms of diagnosis, physicians should be aware of how patients, nurses, and other physicians frame potential outcomes and contingencies of the clinical problem to them.
- Bandwagon effect: the tendency for people to believe and do certain things because many others are doing so. Groupthink is an example, and it can have a disastrous impact on team decision making and patient care.
Errors in prevalence perception or estimation
- Availability bias: the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease might inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available), it might be underdiagnosed.
- Base-Rate neglect: the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate, and distorting Bayesian reasoning. However, in some cases clinicians might (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of “rule out worst-case scenario” to avoid missing a rare but significant diagnosis.
- Hindsight bias: knowing the outcome might profoundly influence perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker’s abilities.
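Base-rate neglect above can be made concrete with a quick Bayesian calculation. The numbers below are purely illustrative (an assumed 1% pre-test probability and a hypothetical test with 90% sensitivity and 95% specificity), but they show how ignoring a low base rate inflates our sense of what a positive test means:

```python
# Illustrative only: hypothetical numbers showing how ignoring a low
# base rate (base-rate neglect) distorts Bayesian reasoning.
prevalence = 0.01   # assumed 1% pre-test probability of disease
sensitivity = 0.90  # P(test positive | disease present)
specificity = 0.95  # P(test negative | disease absent)

# Bayes' theorem: post-test probability given a positive result
true_pos = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)
ppv = true_pos / (true_pos + false_pos)

print(f"Post-test probability after a positive test: {ppv:.0%}")
```

Despite the test “being 90% sensitive,” the post-test probability here is only about 15%, because false positives from the large disease-free population swamp the true positives. The same arithmetic run in reverse underlies the “rule out worst-case scenario” strategy, where the likelihood of disease is deliberately inflated.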
Errors involving patient characteristics or presentation context
- Fundamental attribution error: the tendency to be judgmental and blame patients for their illness (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities, and other marginalized groups tend to suffer from this CDR. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.
- Triage cueing: the triage process occurs throughout the healthcare system, from the self-triage of patients to the selection of a specialist by the referring physician. Many CDRs are initiated at triage, leading to the maxim: “geography is destiny.” Once a patient is referred to a specific discipline, the bias within that discipline to look at the patient only from their own perspective is referred to as “déformation professionnelle.”
- Yin-yang out: when patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the yin-yang. The yin-yang out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient, i.e. the physician is let out of further diagnostic effort. This might ultimately prove to be true, but to adopt the strategy at the outset is fraught with the chance of a variety of errors.
- Campbell, S. G., Croskerry, P. and Bond, W. F. (2007), Profiles in Patient Safety: A “Perfect Storm” in the Emergency Department. Academic Emergency Medicine, 14: 743-749. doi:10.1197/j.aem.2007.04.011
- Featured image from: https://www.vision.org/critical-thinking-checking-the-bias-2982, accessed November 15 2019