[Cartoon: cognitive bias]

(This cartoon and nine more similar ones are here).

Our human reasoning and decision-making processes are inherently flawed. Faced with so many decisions to make every day, we take shortcuts (called heuristics) that help us reach “pretty good” decisions with little effort. These “pretty good” decisions are not always right; they often trade our best decision for one that is merely good enough. These heuristics carry with them assumptions that may not be relevant to the individual decision at hand, and if those assumptions are not accurate for the current problem, a mistake may be made. We call these assumptions “cognitive biases.” Thus,

When a heuristic fails, it is referred to as a cognitive bias. Cognitive biases, or predispositions to think in a way that leads to failures in judgment, can also be caused by affect and motivation. Prolonged learning in a regular and predictable environment increases the successfulness of heuristics, whereas uncertain and unpredictable environments are a chief cause of heuristic failure (Improving Diagnosis in Healthcare).

More than 40 cognitive biases have been described that specifically affect our reasoning processes in medicine. These biases are more likely to occur with quicker decisions than with slower ones. The term Dual Process Theory has been used to describe these two distinct ways we make decisions. Daniel Kahneman refers to these two processes as System 1 and System 2 thinking.

System 1 thinking is intuitive and largely unintentional; it makes heavy use of heuristics. It is quick and reasoning occurs unconsciously. It is effortless and automatic. It is profoundly influenced by our past experiences, emotions, and memories.

System 2 thinking, on the other hand, is slower and more analytic. System 2 reasoning is conscious and operates with effort and control. It is intentional and rational. It is more influenced by facts, logic, and evidence. System 2 thinking takes work and time, and therefore is too slow to make most of the decisions we need to make in any given day.

A System 1 decision about lunch might be to get a double bacon cheeseburger and a peanut butter milkshake (with onion rings, of course). That was literally the first meal that popped into my head as I started typing, and each of those items resonates with emotional centers in my brain that recall pleasant experiences and pleasant memories. But not everything that resonates is reasonable.

As the System 2 part of my brain takes over, I realize several things: I am overweight and diabetic (certainly won’t help either of those issues); I have to work this afternoon (if I eat that I’ll probably need a nap); etc. You get the idea. My System 2 lunch might be kale with a side of Brussels sprouts. Oh well.

These two ways of thinking actually utilize different parts of our brains; they are distinctly different processes. Because System 1 thinking is so intuitive and so affected by our past experiences, we tend to make most cognitive errors with this type of thought. Failures can occur with System 2 thinking to be sure, and not just from cognitive biases but also from logical fallacies or simply bad data; but, overall, System 2 decisions are far more likely to be correct than System 1 decisions.

We certainly don’t need to overthink every decision. We don’t have enough time to make System 2 decisions about everything that comes our way. Yet, the more we make good System 2 decisions initially, the better our System 1 decisions will become. In other words, we need good heuristics or algorithms, deeply rooted in System 2 cognition, to make the best of our System 1 thoughts. Thus the howardism:

The mind is like a parachute: it works best when properly packed.

The packing is done slowly and purposefully; the cord is pulled automatically and without thinking. If we thoroughly think about where to eat lunch using System 2 thinking, it will have a positive effect on all of our subsequent decisions about lunch.

How does this relate to medicine? We all have cognitive dispositions that may lead us to error.

First, we need to be aware of how we make decisions and how our brains may play tricks on us; a thorough understanding of different cognitive biases can help with this. Second, we need to develop processes or tools that help to de-bias ourselves and/or prevent us from falling into some of the traps that our cognitive biases have laid for us.

Imagine that you are working in a busy ER. A patient presents who tells the triage nurse that she is having right lower quadrant pain; she says that the pain is just like the pain she had 6 months ago when she had an ovarian cyst rupture. The triage nurse tells you (the doctor) that she has put the patient in an exam room and that she has pain like her previous ruptured cyst. You laugh, because you have already seen two other women tonight who had ruptured cysts on CT scans. You tell the nurse to go ahead and order a pelvic ultrasound for suspected ovarian cyst before you see her. The ultrasound is performed and reveals a 3.8 cm right ovarian cyst with some evidence of hemorrhage and some free fluid in the pelvis. You quickly examine and talk to the patient, confirm that her suspicions were correct, and send her home with some non-narcotic pain medicine, asking her to follow up with her gynecologist in the office.

Several hours later, the patient returns, now complaining of more severe pain and bloating. Frustrated and feeling that the patient is upset that she didn’t get narcotics earlier, you immediately consult the gynecologist on-call for evaluation and management of her ovarian cyst. The gynecologist performs a consult and doesn’t believe that there is any evidence of torsion because there is blood flow to the ovary on ultrasound exam. He recommends reassurance and discharge home.

The next day she returns in shock and is thought to have an acute abdomen. She is taken to the OR and discovered to have mesenteric ischemia. She dies post-operatively.

While this example may feel extreme, the mistakes are real and they happen every day.

When the patient told the nurse that her ovary hurt, the nurse was influenced by the framing effect. The patient suffered from triage cueing because of the workflow of the ER. The physician became anchored to the idea of an ovarian cyst early on. He suffered from base-rate neglect when he overestimated the prevalence of painful ovarian cysts. When he thought about his previous patients that night, he committed the gambler’s fallacy and exhibited an availability bias. When the ER doctor decided to get an ultrasound, he was playing the odds or fell victim to Sutton’s slip. When the ultrasound was ordered for “suspected ovarian cyst,” there was diagnosis momentum that transferred to the interpreting radiologist.

When the ultrasound showed an ovarian cyst, the ER physician was affected by confirmation bias. The ER doctor’s frequent over-diagnosis of ovarian cysts was reinforced by feedback sanction. When he stopped looking for other causes of pain because he discovered an ovarian cyst, he demonstrated premature closure. When he felt that the patient’s return to the ER was due to her desire for narcotics, the ER doctor made a fundamental attribution error. When he never considered mesenteric ischemia because she did not complain of bloody stools, he exhibited representativeness restraint. When he consulted a gynecologist to treat her cyst rather than explore other possibilities, he fell prey to the sunk costs bias.

Each of these is an example of a cognitive bias that affects our reasoning (see definitions below). But what’s another way this story could have played out?

The patient presents to the ER. The nurse tells the doctor that the patient is in an exam room complaining of right lower quadrant pain (she orders no tests or imaging before the patient is evaluated and she uses language that does not make inappropriate inferences). The doctor makes (in his head) a differential diagnosis for a woman with right lower quadrant pain (he does this before talking to the patient). While talking to the patient and performing an exam, he gathers information that he can use to rule out certain things on his differential (or at least decide that they are low probability) and determines the pretest probability for the various diagnoses on his list (this doesn’t have to be precise – for example, he decides that the chance of acute intermittent porphyria is incredibly low and decides not to pursue the diagnosis, at least at first).

After assessing the patient and refining his differential diagnosis, he decides to order some tests that will help him disprove likely and important diagnoses. He is concerned about her nausea and that her pain seems to be out of proportion to the findings on her abdominal exam. He briefly considers mesenteric ischemia but assigns it a lower probability because she has no risk factors and has had no bloody stools (he doesn’t exclude it, however, because he also realizes that only 16% of patients with mesenteric ischemia present with bloody stools). Her WBC is elevated. He does decide to order a CT scan because he is concerned about appendicitis.

When the CT is not consistent with appendicitis or mesenteric ischemia, he decides to attribute her pain to the ovarian cyst and discharges her home. When the patient returns later with worsened pain, he first reevaluates her carefully and starts out with the assumption that he has likely misdiagnosed her. This time, he notes an absence of bowel sounds, bloating, and increased abdominal pain on exam. He again considers mesenteric ischemia, even though the previous CT scan found no evidence of it, realizing that the negative predictive value of a CT scan for mesenteric ischemia in the absence of a small bowel obstruction is only 95% – meaning that 1 in 20 cases are missed. This time, he consults a general surgeon, who agrees that a more definitive test needs to be performed and a mesenteric angiogram reveals mesenteric ischemia. She is treated with decompression and heparin and makes a full recovery.

These two examples represent extremes, from very poor care to excellent care. Note that even with excellent care, the rare diagnosis was still initially missed. But the latter physician was not nearly as burdened by cognitive biases as the former, and the patient is the one who benefits. The latter physician definitely used a lot of System 1 thinking, at least initially, but when it mattered, he slowed down and used System 2 thinking. He also had a thorough understanding of the statistical performance of the tests he ordered, and he considered the pre-test and post-test probabilities of the diseases on his differential diagnosis. He is comfortable with uncertainty, and he doesn’t think of tests in a binary (positive or negative) sense, but rather as increasing or decreasing the likelihood of the conditions he’s interested in. He used the hypothetico-deductive method of clinical diagnosis, which is rooted in Bayesian inference.
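
To make the pre-test/post-test idea concrete, here is a minimal sketch (in Python) of the odds form of Bayes’ theorem that underlies this kind of reasoning. The pre-test probability and likelihood ratio below are illustrative assumptions, not published figures for CT and mesenteric ischemia; the point is only that a negative test lowers the probability of disease rather than eliminating it.

```python
# A minimal sketch of Bayesian test interpretation (odds form of Bayes' theorem).
# The numbers below are illustrative assumptions, not published test characteristics.

def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Convert a pre-test probability to a post-test probability via odds."""
    pre_test_odds = pre_test_prob / (1 - pre_test_prob)
    post_test_odds = pre_test_odds * likelihood_ratio
    return post_test_odds / (1 + post_test_odds)

# Hypothetical example: a 20% pre-test probability of mesenteric ischemia and an
# assumed likelihood ratio of 0.2 for a negative CT. The negative scan drops the
# probability to roughly 5% -- about 1 in 20 -- which is lower, but not zero.
print(round(post_test_probability(0.20, 0.2), 3))  # 0.048
```

In this framing, a test shifts the likelihood of each condition up or down rather than ruling it in or out absolutely, which is exactly how the second physician treated his negative CT scan.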

Let’s briefly define the most common cognitive biases which affect clinicians.

  • Aggregate bias: the belief that aggregated data, such as data used to make practice guidelines, don’t apply to individual patients.
    • Example: “My patient is special or different than the ones in the study or guideline.”
    • Consequence: Ordering Pap smears or other tests that are not indicated, in violation of the guideline (which may unintentionally lead to patient harm).
  • Anchoring: the tendency to lock onto the salient features of a diagnosis too early and not modify the theory as new data arrives.
    • Example: “Hypoglycemia with liver inflammation is probably acute fatty liver of pregnancy.”
    • Consequence: Ignoring or rationalizing away the subsequent finding of normal fibrinogen levels (which would tend to go against the diagnosis).
  • Ascertainment bias: this occurs when thinking is shaped by prior expectations, such as stereotyping or gender bias.
    • Example: “She has pain because she is drug-seeking again.”
    • Consequence: Not conducting appropriate work-up of pain.
  • Availability: the tendency to believe things are more common or more likely if they come to mind more easily, usually leading to over-diagnosis (it may also lead to under-diagnosis).
    • Example: “Ooh, I saw this once in training and it was a twin molar pregnancy!”
    • Consequence: Not considering statistically more probable diagnoses.
  • Base-rate neglect: the tendency to ignore the true prevalence of diseases, distorting Bayesian reasoning. May be unintentional or deliberate (for example, when physicians always emphasize the “worst case scenario”).
    • Example: “It’s probably GERD but we need to rule out aortic dissection.”
    • Consequence: Ordering unnecessary tests with high false positive rates and poor positive predictive values (see the sketch after this list).
  • Commission bias: the tendency toward action rather than inaction, believing action is necessary to prevent harm. More common in over-confident physicians.
    • Example: “This trick always works in my patients for that problem” or “It’s just a cold, but she made an appointment so she’ll be unhappy if I don’t give her antibiotics” or “I want you to be on strict bedrest since you are having bleeding in the first trimester to prevent a miscarriage.”
    • Consequence: Overuse of potentially risky or unnecessary therapeutics and perhaps guilt-commissioning (if, for example, the patient miscarries when she gets up to tend to her crying baby).
  • Confirmation bias: the tendency to look for supporting evidence to confirm a diagnosis rather than to look for data to disprove a diagnosis.
    • Example: “Aha! That’s what I suspected.”
    • Consequence: Incorrect diagnosis. We should always look for data to disprove our diagnosis (our hypothesis).
  • Diagnosis momentum: the effect of attaching a diagnosis too early and making it stick throughout interactions with the patient, nurses, consultants, etc., thereby biasing others.
    • Example: “This is probably an ectopic pregnancy” and writing suspected ectopic on the ultrasound requisition form.
    • Consequence: Radiologist reads corpus luteal cyst as ectopic pregnancy.
  • Feedback sanction: diagnostic errors may have no consequence because of a lack of immediate feedback or any feedback at all, particularly in acute care settings where there is no patient follow-up, which reinforces errors in diagnosis or knowledge.
    • Example: “I saw this girl with back pain due to a UTI.”
    • Consequence: Positive reinforcement of diagnostic errors (such as belief that UTIs are a common cause of back pain).
  • Framing effect: how outcomes or contingencies are framed (by family, nurses, residents, or even the patient) influences decision making and diagnostic processes.
    • Example: “The patient says that her ovary has been hurting for a week.”
    • Consequence: Focusing on ovarian or gynecological sources of pelvic pain rather than other more likely causes.
  • Fundamental attribution error: the tendency to blame patients for their illnesses (dispositional causes) rather than circumstances (situational factors).
    • Example: “Her glucose is messed up because she is noncompliant.”
    • Consequence: Ignoring other causes of the condition (e.g. infection leading to elevated glucose).
  • Gambler’s fallacy: the belief that prior unrelated events affect the outcome of the current event (such as a series of coin tosses all showing heads affecting the probability that the next coin toss will be heads).
    • Example: “My last three diabetics all had shoulder dystocias!!”
    • Consequence: Leads to inappropriate treatment of the current patient, based on facts that are irrelevant. 
  • Gender (racial) bias: the belief that gender or race affects the probability of a disease when no such link exists pathophysiologically.
    • Example: “We need to think about osteoporosis since she’s white.”
    • Consequence: Under- or over-diagnosing diseases. For example, two-thirds of the racial predilections published in major textbooks are not supported.
  • Hindsight bias: knowledge of the outcome affects perception of past events and may lead to an illusion of failure or an illusion of control.
    • Example: “Last time I had this, she got better because I gave her ___.”
    • Consequence: Perpetuates error and encourages anecdotal medicine. For example, it is merely an assumption that the intervention affected the outcome, either positively or negatively. 
  • Multiple alternative bias: a multiplicity of diagnostic options leads to conflict and uncertainty and then regression to well-known diagnoses.
    • Example: “Well let’s just focus on what it probably is and not worry about all that other stuff for now.”
    • Consequence: Ignoring other important alternative diagnoses. 
  • Omission bias: the tendency toward inaction, the opposite of commission bias, and the more common of the two.
    • Example: “Group B strep infections in neonates are really rare, so I don’t see the point in the antibiotic for this mom.”
    • Consequence: May result in rare but serious harms.
  • Order effects: the tendency to focus on the beginning and the end and fill in the middle part of the story (creating a false narrative or constructing false associations), worsened by tendencies like anchoring. This bias is important to consider in patient hand-offs and presentations.
    • Example: “She had a fever but got better when we treated her for a UTI.”
    • Consequence: Leads to inappropriate causation biases, etc. (The patient got better. This may be due to antibiotics given for a possible UTI or the actual cause of her fever may still be unknown). 
  • Outcome bias: the tendency to favor diagnoses associated with good outcomes over those associated with bad outcomes, a form of value bias.
    • Example: “I’m sure it’s just a panic attack and not a pulmonary embolism.”
    • Consequence: Missing potentially serious diagnoses.
  • Overconfidence bias: the belief that we know more than we do, leading to a tendency to act on incomplete information, hunches, or intuitions.
    • Example: “I see this all the time, just send her home.”
    • Consequence: Grave harm may occur because of missed diagnosis.
  • Playing the odds: the tendency to opt for more benign diagnoses or simpler courses of action when uncertainty in the diagnosis exists.
    • Example: “I’m sure that this ovarian cyst is benign; it’s gotta be.”
    • Consequence: Potentially devastating when coupled with an omission bias (e.g., not following-up with a repeat ultrasound in a few weeks for a questionable cyst). 
  • Posterior probability error: the tendency to believe that what has happened to a patient in the past changes the probability of future events for that patient.
    • Example: “Every time she comes in her bleeding has just been from her vaginal atrophy.”
    • Consequence: Biases current evaluation and work-up (e.g., ignoring post-menopausal bleeding).
  • Premature closure: the tendency to accept a diagnosis before it has actually been confirmed, even when only scant data support the anchored diagnosis, often leading to treatment failures.
    • Example: “We know what it is, she just hasn’t responded to treatment yet.”
    • Consequence: Ignoring alternative theories; evidenced by this famous phrase in medicine: When the diagnosis is made, the thinking stops.
  • Psych-out error: this occurs when serious medical problems (e.g., hypoxia, head injuries, etc.) are misattributed to psychiatric diagnoses.
    • Example: “She just acts that way because she is bipolar.”
    • Consequence: Ignoring potentially catastrophic physical ailments (e.g., vascular disease or brain tumor).
  • Search satisfying: the tendency to stop looking once something satisfying is found, whether on the patient or in the medical literature (a form of premature closure).
    • Example: “This article says I’m right!” or “That’s where she’s bleeding!”
    • Consequence: Ignoring other causes of symptoms or other contradictory evidence or literature.
  • Sutton’s slip: the tendency to go where the money is, that is, to diagnose the most obvious things, ignoring less likely diagnoses.
    • Example: “I rob banks because that’s where the money is,” (Willie Sutton’s response when the judge asked him why he robbed banks).
    • Consequence: Under-utilizing System 2 thinking and ignoring diseases or presentations that are less common.
  • Sunk costs: the more investment that is made in a particular diagnosis or treatment, the less likely one is to let go of it and consider alternatives.
    • Example: “I’ve just got to be right, I don’t know why this treatment isn’t working!”
    • Consequence: Further delay in pursuing the right treatment/diagnosis. This also results in a lot of false case reports (We present a case of such-and-such that was refractory to usual treatments but responded to some other crazy treatment or We present a case of such-and-such that presented in some weird nontypical way – in both cases, the diagnosis was likely wrong to begin with). 
  • Triage cueing: the biasing that results from self-triage or systematic triage of patients or presentations, creating tunnel vision.
    • Example: “I need a Gyn consult because she’s a female with pain.”
    • Consequence: Ignoring other organ systems or causes of pain.
  • Unpacking principle: the failure to elicit all relevant information in establishing a differential, particularly when a prototypical presentation leads to anchoring.
    • Example: “Anorexia and right lower quadrant pain is classic appendicitis.”
    • Consequence: Not considering all causes of each symptom individually (or collectively). As an aside, Occam’s razor and other cognitive processes that favor simplicity over complexity are usually wrong but feel comfortable to the human imagination (it’s much simpler to blame the MMR vaccine for autism than it is to consider a polygenic, multifactorial causation theory).
  • Vertical line failure: this results from routine, repetitive processes that emphasize economy, efficacy, and utility (as opposed to lateral thinking).
    • Example: “I always do a diabetes screen on all pregnant women” or “When I see x I always do y.”
    • Consequence: Deemphasizes lateral thinking (e.g., What else might this be?).
  • Visceral bias: the influence of countertransference and other sources of visceral arousal, leading to poor decision making.
    • Example: “That patient is just a troll” or “She is so sweet.”
    • Consequence: Leads to cognitive distortion and augments biases.
  • Yin-yang out: the tendency to stop looking or to stop trying once efforts have seemingly been exhausted, even though the case hasn’t actually been solved.
    • Example: “We’ve worked her up to the yin-yang.”
    • Consequence: Leads to errors of omission.
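
To make the base-rate neglect entry above concrete (the sketch promised in that item), here is a minimal illustration (in Python) of why ignoring prevalence leads to poor positive predictive values. The sensitivity, specificity, and prevalence are hypothetical numbers chosen only to show the arithmetic.

```python
# A minimal sketch of how a low base rate (prevalence) drags down the positive
# predictive value of even a good test. All numbers here are hypothetical.

def positive_predictive_value(prevalence: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive test), computed from prevalence and test characteristics."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A test that is 90% sensitive and 95% specific, applied to a condition with a
# prevalence of 1 in 1,000, yields a PPV under 2%: most positives are false positives.
print(round(positive_predictive_value(0.001, 0.90, 0.95), 3))  # 0.018
```

This is why “ruling out the worst case scenario” in a low-prevalence population mostly generates false positives and more downstream testing.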

I’m sure you can think of many other examples for these biases, and many other biases have been described beyond those on this list. An emerging scientific literature is examining the effects of bias on diagnostic and therapeutic outcomes and on medical error. The 2015 Institute of Medicine report, Improving Diagnosis in Healthcare, is a good place to start exploring some of the implications of bias in the diagnostic process.

Next we will explore some strategies to mitigate our bias.