Bias is so prevalent, it’s almost scary. The problem is that we don’t see it, and that is exactly how bias works. That is why it is so difficult to prevent (if you don’t try). We are taught to be aware of our biases, and maybe we are shown some of them anecdotally during our graduate education, but rarely (if ever) do we sit down and discuss them. How can we be objective and neutral if we are unaware of the biases that exist?
The scientific method dictates that we should be aware of our biases when conducting a research study, but the biases we admit to tend to be superficial (e.g., “The authors declare that they were financially supported by the IASTM company during the study…blah blah”). Furthermore, those disclosures don’t have much to do with errors in cognitive or logical thought processes. Sure, disclosing them may help prevent bias, but they aren’t…the bias itself. The real culprit is cognitive bias: the actual incorrect thought process that leads to a wrong conclusion.
Cognitive bias runs much deeper and is much more convoluted than a simple admission of financial backing. Take, for instance, one of the better-known cognitive biases: “Confirmation Bias.” Confirmation bias (in the exam setting) is the tendency to look for evidence that supports your diagnosis, rather than disconfirming evidence that could refute it. Simple enough, right? Well, you basically just climbed to the top of the jungle gym, saw the slide you liked most, and rode down it, concluding it was the best before trying out any of the other slides.
How often do we talk about cutting down exam time? Maybe it’s for the time/cost/benefit ratio, or because the skill and prowess gained over years of clinical practice have led you to deduce a patient’s ultimate diagnosis very early on. Right there, you’ve committed a second cognitive error and form of bias called “Anchoring.”(1)
Anchoring is the tendency to lock on to salient features of a patient’s presentation early in the diagnostic process, which can lead you to fail to collect all the relevant subjective/objective information. Worse yet, if you don’t attempt to refute your own hypothesis about the patient (i.e., confirmation bias), your differential is hampered even further. Just these two biases together have a severe compounding effect on your diagnostic ability. In the metaphor: you decided which slide was best to ride before you even climbed up the jungle gym.
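To see why the compounding matters, here is a toy back-of-the-envelope calculation. The error rates below are hypothetical numbers chosen purely for illustration (not measured data); the point is simply that two independent sources of error stack:

```python
# Illustrative only: these error rates are made-up numbers, not clinical data.
p_anchor = 0.15    # hypothetical chance that anchoring derails an exam
p_confirm = 0.20   # hypothetical chance that confirmation bias does

# If either bias alone can push you to a wrong conclusion, the chance
# that at least one of them fires is higher than either rate on its own:
p_either = 1 - (1 - p_anchor) * (1 - p_confirm)
print(f"{p_either:.2f}")  # prints 0.32, worse than either bias alone
```

Swap in whatever rates you like; as long as both are above zero, the combined risk always exceeds each individual one.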
You may think, “So what’s the big deal? I make a wrong diagnosis here and there, most of them are correct, and PT diagnoses are not that serious anyway!”
Well, first off…NO. If that was your response to any of the above, you should be publicly shamed. Secondly, you are hampering the professional advancement of PT. Direct access means we will see patients with serious diagnoses that we may be responsible for identifying. And that sorry “PT diagnosis” of “weak core” as the cause of LBP can get sticky. When you put a sorry, sticky diagnosis on a patient, everyone loses.
Think of a time when you evaluated a patient who arrived with a diagnosis already attached (one they had carried for a while, or one agreed upon by both their PCP and podiatrist), say plantar fasciitis. Did you stop and give the exam your full attention? Did you really try to rule out plantar fasciitis versus posterior tibialis tendonitis or some other variant of foot pain? “Diagnosis momentum” occurs when a label is attached to a patient: whether the diagnosis is correct or not, the more often it is “agreed upon” by practitioners, the more difficult it is to remove from the patient’s mind, as well as from yours during the exam process. In other words, it gets sticky. You lose, the patient loses. You might even lose patients. Just be sure you’ve picked the right slide before you launch off toward the bottom; once you’ve started, it can be hard to stop.
See where I’m going here? These errors in thought process are pervasive. If you think you’re immune, you’re not; in fact, those who believe they are a superman/superwoman PT probably make more of these errors than those who don’t.
For the private practice owners: how often do you turn patients away on initial evaluation? Either because you’ve cleared them of any real cause of impairment, or because you think they would be better suited seeing another specialist (okay, I understand we cannot refer…just go with it). “Triage Cueing” happens when we automatically assume that the person sitting in front of us needs our services, simply because they were sent to us. I think far too many PTs have been guilty of this, perhaps because of pressure to keep revenue up, or perhaps because of laziness. Do the patient justice. We are here to provide health care, not to injudiciously take people’s money! Just because someone has a 4/5 MMT on hip extension doesn’t mean they need PT! Take the time to consider all of the information, present your findings to the patient in an understandable way, and make a joint decision on whether or not to initiate PT treatment.
Little bit of a tangent, but you get the idea. Just because a person was referred to you for LBP, it doesn’t mean they need PT to resolve it (RE: Intro and the girl with LBP).
I could go on and on…but I really can’t. So, here’s a (non-exhaustive) list(1,2) of cognitive biases:
(One of my favorites: Gambler’s fallacy, in which one assumes that because the first 10 coin flips were heads, the next flip will most likely be tails. In reality, an honest, *unbiased* coin has no memory of previous flips; each individual flip has a 50/50 chance of heads or tails, regardless of prior outcomes.)
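If you want to convince yourself, the fallacy takes only a few lines of code to test. This is a quick simulation sketch (the function name, seed, and trial count are my own illustrative choices): it checks what happens on the flip that comes immediately after ten heads in a row.

```python
import random

def heads_rate_after_streak(streak_len=10, trials=1_000_000, seed=42):
    """Estimate P(heads) on the flip that immediately follows a run of
    `streak_len` consecutive heads, using a simulated fair coin."""
    rng = random.Random(seed)  # seeded so the illustration is reproducible
    streak = 0                 # current run of consecutive heads
    follow_ups = []            # outcomes of flips that follow such a run
    for _ in range(trials):
        heads = rng.random() < 0.5
        if streak >= streak_len:
            follow_ups.append(heads)
        streak = streak + 1 if heads else 0
    return sum(follow_ups) / len(follow_ups)

# The gambler's fallacy predicts a rate well below 0.5 here;
# the simulation stays close to 0.5, because each flip is independent.
print(heads_rate_after_streak())
```

Run it with any seed you like: the post-streak heads rate hovers around 0.5 rather than dropping toward tails.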
So, you may be saying to yourself, “I am now thoroughly convinced that I am a biased SOB, but how am I supposed to stop it?!?”
Lucky for you, there are some simple foolproof methods to avoid these cognitive errors.
Okay, they aren’t foolproof, but they are the best ways we have to decrease the odds of being biased…bro. Here are seven methods for decreasing bias and cognitive errors in the clinical examination (again, not an exhaustive list):
- Awareness and insight
- Consider alternatives
- Metacognition (thinking about your thoughts)
- Decrease reliance on memory
- Minimize time pressures
- Specific training
- Simulation (acting out case studies)
If you’ve made it this far down the post, you’ve already taken action on the first method, “Awareness and insight.” Knowledge is power, and by knowing about these biases you have enabled yourself to spot them when they show up. BUT this is only the start; there is plenty more to learn.
Taking the time to look at each of these strategies, you might notice one unifying characteristic: time. In this day and age of a busy, understaffed health care system, the one thing we feel we have the least of may be the one thing we need to make more room for. Take the time to consider alternatives, reflect on your thought processes (whether right or wrong), take some worthwhile continuing education courses, and heck…act it out with your co-workers (we all know that was one of the more hilarious and beneficial parts of our schooling).
Check in next week for the next approach to increasing your scientific reasoning skills in the clinical exam (Part 2).
1) Croskerry P. The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them. Academic Medicine. 2003;78(8):775-780.
2) Muro, S. Scientific Reasoning in the Clinical Examination. Unpublished. 2014.