Cognitive Biases in Healthcare: Why Your Physician May Be Wrong

Dr Khoo Lee Seng
Apr 15, 2022

Cognitive biases remain a major problem for physicians because they profoundly affect one’s ability to collect evidence, interpret it, take action and evaluate the resulting decisions (Stiegler et al., 2012).

Confirmation bias, a form of pattern-recognition bias, entails the physician selectively gathering and interpreting evidence to affirm his or her existing beliefs while ignoring or downplaying evidence that contradicts them. A simple example is a physician who refuses to consider alternative diagnoses once he or she has zeroed in on an initial diagnosis, even when laboratory investigations contradict it.

If the diagnosis was made and affirmed by a senior clinician, it may snowball and be blindly accepted by fellow clinicians, leading to inappropriate treatment based on an erroneous diagnosis. This is termed diagnostic momentum and is an example of social bias (Doherty and Carroll, 2020).

Anchoring bias, a form of stability bias, comes into play when we prioritize information and data that support our initial diagnosis even when those initial impressions are erroneous (Doherty and Carroll, 2020). For example, a general practitioner may diagnose piles (hemorrhoids) without ruling out more serious pathology that can cause blood in the stool, such as colorectal cancer. Many clinicians do not want to be seen as “stupid” once they have made an initial diagnosis and will do everything to avoid admitting that the diagnosis is wrong.

Physicians, like all human beings, can be swayed by emotions rather than logic when making decisions. This is termed the affect heuristic. Although heuristics provide mental shortcuts for solving problems, applying them subconsciously leads to bias in perception, analysis and decision making. The following passage from Henry Marsh is a clear example of the affect heuristic:

“Surgery is a practical craft and you learn it by doing it. Although a lot of work is being done to develop simulators, there is still no substitute for experience. A very important part of a senior surgeon’s work is to train and supervise the next generation of surgeons. You learn most as a trainee when you are operating on your own and your senior is not standing beside you dictating your every move. There is a serious responsibility, therefore, for the senior surgeon to know when and how much of an operation to delegate. There is an ethical responsibility to the patient in front of you but also an ethical responsibility to your trainee’s future patients. These two demands are not easily compatible, and require careful assessment of the trainee’s abilities.

I delegated the beginning of an operation to a senior trainee whom I liked greatly. By the time I joined him (it is, in fact, standard practice to let the juniors ‘open and close’ neurosurgical operations), he had the patient’s head open, so that I could no longer see exactly where he had made the opening. I assumed it was in the right place but it turned out it was not, and when I opened the meninges there was severe haemorrhage from the sagittal sinus, one of the brain’s major veins. The patient died as a result. The ‘halo effect’ — a term coined by Edward Thorndike to describe the tendency for an overall impression to influence the observer’s feelings and thoughts about that person’s character or properties — had distorted my assessment of my trainee’s competence.

I would like to think that I am now better at judging my trainees and knowing when to intervene and when not to — but it has taken me many years and there were other, similar problems (though not quite so disastrous) on the way.” (Marsh, 2016)

We are all far less rational than we like to imagine. A critical conclusion of Daniel Kahneman’s acclaimed bestseller Thinking, Fast and Slow is that other people are better at seeing our mistakes than we are ourselves (Kahneman, 2011).

Outcome bias is the tendency to believe that clinical outcomes — whether good or bad — are entirely attributable to the clinician’s prior decisions (Doherty and Carroll, 2020).

Although decisions and outcomes are related, the outcome of a decision is not the sole measure of its quality. A good outcome can occur despite a poor clinical decision, and a bad outcome can occur despite perfect planning and execution of a treatment plan. We have to accept that medicine is an imperfect art and hence no outcome can be 100% guaranteed.

A very interesting study in 2016 found that trainee doctors generally believed that black people are less sensitive to pain than white people and were hence less likely to treat black patients’ pain appropriately (Hoffman et al., 2016). While in my opinion no doctor is purposely racist or exhibits explicit bias with regard to race, there are inherent implicit biases, shaped by society and social norms, that influence doctors’ minds at least on an unconscious or subconscious level.

A 2012 study found a correlation between pediatricians’ implicit (unconscious) racial bias and their pain-management recommendations for a simulated African-American or white teenager following surgery: “As the strength of provider implicit bias favoring whites increased, the likelihood of prescribing appropriate pain medication decreased only for the black patient.” (Sabin and Greenwald, 2012).

I would like to share my own experience of bias in the early phase of my life, particularly in the realm of martial arts. I began training in traditional martial arts at age 12, starting with an ancient style of Kung Fu, and for a long time believed entirely that the system was the most complete and fool-proof martial art. I rejected any opinions or “proof” that contradicted my belief in the system, which was Gospel to me. I believed and had faith in the system and in my teacher, who told me tales of Kung Fu masters defeating multiple opponents and handling armed attackers with ease (social bias).

A friend of mine who was a Western boxer invited me to spar full-contact one day, and I accepted, having by then trained in traditional Kung Fu for several years. I had the crap beaten out of me, unprepared for the boxer’s feints, footwork and set-ups. Initially I convinced myself that the problem was me and not the system because, on an unconscious level, I realized that switching would mean starting again as a beginner in Western boxing and losing my rank and standing in traditional Kung Fu (stability bias). I returned to Kung Fu and trained religiously again, but even so I found that the strategy and techniques of traditional Kung Fu were no match for Western boxing: all things being equal, the average Western boxer with a few months of training could defeat a traditional Kung Fu practitioner with years of experience.

I contemplated for some time, as accepting this would mean facing reality (action bias). Eventually, I started training in Western boxing, learning the basics and testing myself frequently in sparring sessions grounded in reality rather than fantasy. It was the beginning of my path to cross-train in Brazilian Jiu-Jitsu, kickboxing, wrestling and other arts to round out my arsenal.

Looking back, I think I had a great fear of confronting reality at that time, having invested my time and effort in an art of limited efficacy. The choice was between living in a fantasy and facing a painful reality, and facing reality was necessary to shed my cognitive biases. If I had to evaluate any new technique, product or business system in medicine, I would do it by staying grounded in reality and even seeking out evidence contrary to my belief. I think the first step towards mitigating biases is to acknowledge that we all have them. The subsequent steps involve investigating and pressure-testing our ideas and approaches. Studying how others approach and solve similar problems can show us whether we missed something or were operating in a less than ideal way.

Biases on the operating table may result in suboptimal outcomes for the patient and even deaths, as famous neurosurgeon Henry Marsh explained. Biases in business decisions can be catastrophic to a healthcare organization’s cash flow and reputation.

Remember Pavlov’s experiment with the dogs? (Rehman et al., 2022).

“Classical conditioning was stumbled upon by accident. Pavlov was conducting research on the digestion of dogs when he noticed that the dogs’ physical reactions to food subtly changed over time. At first, the dogs would only salivate when the food was placed in front of them. However, later they salivated slightly before their food arrived. Pavlov realized that they were salivating at the noises that were consistently present before the food arrived; for example, the sound of a food cart approaching.

To test his theory, Pavlov set up an experiment in which he rang a bell shortly before presenting food to the dogs. At first, the dogs elicited no response to the bells. However, eventually, the dogs began to salivate at the sound of the bell alone.”

Pavlov’s Classical Conditioning (McLeod, 2018)

See how easily we can be classically conditioned? How can we break free from this? Answering that question will give us a better understanding of business and allow us to employ strategies that work better in healthcare.

References

Doherty, T.S. and Carroll, A.E. (2020). Believing in Overcoming Cognitive Biases. AMA Journal of Ethics, [online] 22(9), pp.773–778. Available at: https://journalofethics.ama-assn.org/article/believing-overcoming-cognitive-biases/2020-09.

Hoffman, K.M., Trawalter, S., Axt, J.R. and Oliver, M.N. (2016). Racial bias in pain assessment and treatment recommendations, and false beliefs about biological differences between blacks and whites. Proceedings of the National Academy of Sciences, [online] 113(16), pp.4296–4301. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4843483/.

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus And Giroux.

Marsh, H. (2016). Do no harm: stories of life, death, and brain surgery. New York: Picador.

McLeod, S. (2018). Pavlov’s Dogs. [online] Simplypsychology.org. Available at: https://www.simplypsychology.org/pavlov.html.

Rehman, I., Mahabadi, N., Sanvictores, T. and Rehman, C.I. (2022). Classical Conditioning. [online] PubMed. Available at: https://pubmed.ncbi.nlm.nih.gov/29262194/#:~:text=Pavlov [Accessed 2 Apr. 2022].

Sabin, J.A. and Greenwald, A.G. (2012). The influence of implicit bias on treatment recommendations for 4 common pediatric conditions: pain, urinary tract infection, attention deficit hyperactivity disorder, and asthma. American journal of public health, [online] 102(5), pp.988–95. Available at: https://www.ncbi.nlm.nih.gov/pubmed/22420817.

Stiegler, M.P., Neelankavil, J.P., Canales, C. and Dhillon, A. (2012). Cognitive errors detected in anaesthesiology: a literature review and pilot study. British Journal of Anaesthesia, 108(2), pp.229–235.


Dr Khoo Lee Seng

Plastic Surgeon with an interest in the business of healthcare management, medical law, philosophy and humanities.