National Rounds | Diagnostic Reasoning: Should we trust our gut?


On May 24th, 2016, Dr. Jonathan Sherbino (@sherbino) of McMaster University was invited to speak at Grand Rounds at the University of Saskatchewan on the topic of diagnostic reasoning. His presentation explained how physicians arrive at a diagnosis and how we can teach learners cognitive strategies to improve their diagnostic reasoning. This post (hopefully) captures that wisdom as the first edition of CanadiEM National Rounds.

Misdiagnosis… The Boogieman of the ED

No one wants a misdiagnosis. It’s harmful to the patient, it costs the system money, and it can take a mental toll on the physician responsible. Reflecting on diagnostic reasoning has become a focus for many medical educators who want to improve their own ability, teach their learners effectively, and avoid misdiagnoses.

The leading psychological theory on how physicians think is the dual processing model made up of two systems. System 1 is our instinctive thought process, and System 2 is our slow and methodical thought process.

System 1:
  • Fast
  • Intuitive
  • Inductive
  • Acquired through experience
  • Unconscious

System 2:
  • Slow
  • Rational
  • Deductive
  • Analytical
  • Conscious

Working in parallel, System 1 and System 2 solve problems.

When examining where diagnostic error comes from, it has been assumed that the likely culprit is our fast “System 1” thinking. If it’s fast, it’s more reckless and prone to errors… right? Is our gut instinct, based on years of experience, actually the source of our diagnostic errors?

Can we trust our gut, or are we too biased?

According to the creators of the model, Daniel Kahneman and Amos Tversky, System 1 thinking is the source of most errors.

“…errors of intuitive judgment involve failures of both systems: System 1, which generated the error, and System 2, which failed to detect and correct it.” 1

Thus began a proverbial witch hunt in medical education to mitigate the role of “error-producing” System 1 thinking and to encourage “safer” System 2 thinking. But is this a mistake? The conclusion that System 1 is the source of error is based on general problem solving (e.g. do more words start with the letter K or L?), which is not the same as the technical, knowledge-based problem solving required in medicine. What happens to the dual process model when System 1 is applied in a medical context?

Diagnostic Reasoning in Action

Let’s examine a problem:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations. 1

Which is the most likely statement?

  A. Linda is a bank teller
  B. Linda is a bank teller and active in the feminist movement

While Linda’s interests might suggest that she would be active in the feminist movement, there are a lot more bank tellers in the world than bank tellers who are also active in the feminist movement. As a result, the correct answer is A: the probability of two statements both being true can never exceed the probability of either statement alone (P(A and B) ≤ P(A)). While this is a clever demonstration, it isn’t a medical question and doesn’t require the use of technical knowledge.
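
To make the arithmetic concrete, here is a minimal sketch in Python with made-up counts (the numbers are purely illustrative, not from the lecture or the original study):

```python
# Conjunction rule with hypothetical counts: a conjunction can never be more
# probable than either of its parts.
population = 100_000_000          # assumed reference population
bank_tellers = 1_000_000          # assumed number of bank tellers
feminist_bank_tellers = 50_000    # assumed subset who are also feminist activists

p_a = bank_tellers / population                  # P(bank teller)
p_a_and_b = feminist_bank_tellers / population   # P(bank teller AND feminist activist)

assert p_a_and_b <= p_a
print(f"P(A) = {p_a:.4f}, P(A and B) = {p_a_and_b:.4f}")
```

No matter what counts you plug in, the subset can never outnumber the whole group, which is why option A always wins on probability alone.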

Let’s try that again…

Rahim is a 55-year-old male who presents to the emergency department with multiple injuries following a car accident. On examination he has diminished breath sounds on the left side and a tender abdomen. His blood pressure is 90/55 and his pulse is 135 beats per minute.

Which is the most likely statement?

  A. Rahim has a pneumothorax
  B. Rahim has a pneumothorax and a ruptured spleen

The mathematically most likely statement is A. But then again, given the abdominal tenderness and low blood pressure, it also sounds, looks, and smells like this patient has a high likelihood of a splenic injury. I know I want the physician taking care of me to rule out a ruptured spleen! So in a medical context, B is the answer a physician should choose.
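
One way to make that intuition explicit is a toy expected-cost comparison; the framing and the numbers below are entirely illustrative and are not from the lecture:

```python
# Toy expected-cost comparison: the asymmetry in harm, not the raw probability,
# drives the clinical decision. All values are hypothetical.
p_spleen_injury = 0.5       # assumed probability that Rahim also has a ruptured spleen
harm_if_missed = 100.0      # assumed relative harm of discharging a missed splenic injury
cost_of_ruling_out = 1.0    # assumed relative cost of imaging to look for it

expected_harm_of_not_looking = p_spleen_injury * harm_if_missed
print(expected_harm_of_not_looking > cost_of_ruling_out)  # True, even at much lower probabilities
```

Even if the conjunction is mathematically less likely, the cost of missing the second injury dwarfs the cost of looking for it, so the clinician behaves as if B were true until proven otherwise.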

Our gut instinct is vital to our diagnostic reasoning skills

The version of the dual processing model that makes System 1 thinking the villain does not hold in a medical context. Maybe System 1 isn’t the bad guy and it’s actually giving us rapid access to past experience. So where do our misdiagnoses come from if not from our gut? 

5 Myths of Misdiagnosis Debunked


1. Bias

The suggestion that diagnostic error comes from our cognitive biases may itself be influenced by hindsight bias. It has been reported that two thirds of all medical mistakes in the emergency department are attributed to premature closure bias.2 But when physicians were given a case to diagnose and were told they had made an incorrect diagnosis, they blamed their cognitive biases for the mistake.3 Cognitive biases may only appear to be the culprit because hindsight is 20/20.

2. Speed & Accuracy

If we go quickly, we make more errors… or do we?


In fact, faster response times have been correlated with maintained or even improved diagnostic accuracy.4,5 When you know an answer, you often know it immediately. When you don’t, you can take longer to think and still not come up with the correct answer.

Going slow just makes you slower, not better.


3. Interruptions

It’s hard to concentrate on your STEMI patient when there are overhead pages, nurses needing your attention, and all the other distracting sights and sounds of a busy ED. It makes sense that the more we are interrupted in our thinking, the more mistakes we will make, right? A study compared diagnostic accuracy with and without interruptions and found:6

Residents’ diagnostic accuracy:

  • Interruptions = 43%
  • No interruptions = 44%

Emergency physicians’ diagnostic accuracy:

  • Interruptions = 71%
  • No Interruptions = 71%

Despite the pages and beeping monitors, residents and physicians were not hindered by the interruptions. 

4. Experience

Not surprisingly, the more experience you have, the fewer mistakes you make. Experienced attendings have more accurate diagnostic reasoning than residents, and residents more than medical students. A focus for medical educators should be to ensure their learners get hands-on experience.

So what’s a teacher to do?

Thomas Eakins, The Agnew Clinic

5. Cognitive Forcing Strategies

The current belief in medical education is that emphasizing cognitive forcing strategies will improve diagnostic accuracy by promoting System 2 thinking and suppressing “error-producing” System 1 thinking. However, cognitive forcing strategies are unlikely to enhance diagnostic reasoning because they don’t add what really matters: experience. Teaching medical students cognitive forcing strategies to reduce bias had no effect on their diagnostic accuracy.7

When medical residents were asked a series of diagnostic questions and then given the chance to take a second look using non-structured reflection, no change in diagnostic accuracy was found. Their first impression was just as accurate as the answer they reached after taking time to reflect on the diagnosis.8 First impressions last. Either you know it, or you don’t. Simple as that.

But when the residents used Structured Reflection such as this:

  1. Write down most likely diagnosis
  2. Write down alternative diagnoses
  3. List findings
  4. Rank diagnoses in order of likelihood

There was a roughly 10% improvement in diagnostic accuracy for complex cases, but only among experienced clinicians.9 Using structured reflection does not help the junior learner; it just makes them slower!
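
For teachers who want to try this with learners, here is a minimal sketch of the four-step worksheet as a simple data structure; the field names and the example case are illustrative and not taken from the studies:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class StructuredReflection:
    """One worksheet per case, mirroring the four steps above."""
    most_likely_diagnosis: str            # step 1
    alternative_diagnoses: list[str]      # step 2
    findings: list[str]                   # step 3
    ranked_diagnoses: list[str] = field(default_factory=list)  # step 4

    def rank(self, order: list[str]) -> None:
        # Step 4: rank all candidate diagnoses in order of likelihood.
        self.ranked_diagnoses = order

# Hypothetical use with the trauma case from earlier in this post:
worksheet = StructuredReflection(
    most_likely_diagnosis="pneumothorax",
    alternative_diagnoses=["splenic injury", "rib fractures"],
    findings=["diminished breath sounds on the left", "tender abdomen",
              "BP 90/55", "HR 135"],
)
worksheet.rank(["pneumothorax", "splenic injury", "rib fractures"])
print(worksheet.ranked_diagnoses)
```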

So now what?

If there is no evidence that cognitive forcing strategies help our learners… what should we teach them? Perhaps, rather than forcing ineffective cognitive debiasing strategies on our junior learners, medical educators should focus on the two known markers of diagnostic accuracy: knowledge and experience. After all, practice makes perfect.

If experience is the “Golden Ticket” to better clinicians, what can teachers do? Real cases, real patient encounters, and, for those rare thoracotomies, realistic simulations.

Conclusion

When it comes to a diagnosis, your gut might be right. Teach your learners and give them more experience in the ED.

What to take away:

  • Hindsight bias can mislead us into overestimating the rate of cognitive bias.
  • Going slow just makes you slower, not better.
  • Structured reflection strategies do not help the junior learner but may help the experienced clinician.
  • There is no evidence that cognitive forcing strategies help in the ED.
  • Ensure your learners gain experience in the ED!

This post was written by Jesse Leontowicz (@jleontow) based on a lecture by Dr. Jonathan Sherbino (@sherbino). Dr. Sherbino also reviewed the post prior to publication.

References

1. Kahneman D. Thinking, Fast and Slow. Farrar, Straus and Giroux; 2011.
2. Graber M. The incidence of diagnostic error in medicine. BMJ Qual Saf. 2013;22 Suppl 2:ii21-ii27.
3. Zwaan L, Monteiro S, Sherbino J, Ilgen J, Howey B, Norman G. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf. January 2016.
4. Sherbino J, Dore K, Wood T, et al. The relationship between response time and diagnostic accuracy. Acad Med. 2012;87(6):785-791.
5. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89(2):277-284.
6. Monteiro S, Sherbino J, Ilgen J, et al. Disrupting diagnostic reasoning: do interruptions, instructions, and experience affect the diagnostic accuracy and response time of residents and emergency physicians? Acad Med. 2015;90(4):511-517.
7. Sherbino J, Yip S, Dore K, Siu E, Norman G. The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study. Teach Learn Med. 2011;23(1):78-84.
8. Monteiro S, Sherbino J, Patel A, Mazzetti I, Norman G, Howey E. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med. 2015;30(9):1270-1274.
9. Mamede S, van Gog T, van den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA. 2010;304(11):1198-1203.
Jonathan Sherbino
Jonathan is an emergency physician, trauma team leader and associate professor in the Division of Emergency Medicine at McMaster University and a Clinician Educator with the Royal College of Physicians & Surgeons of Canada. He operates the ICENet blog and KeyLIME Podcast.

Jesse Leontowicz

Jesse Leontowicz is a medical student and ski medic who attends the University of Saskatchewan where he focuses on improving student mental health and exploring his passion for teaching. When he's not occupied by lectures or helping patients, you can find him in the Rocky Mountains in search of fresh powder to ski.