Anecdotal evidence: what’s the harm?

In Editorial, Opinion by Shahbaz Syed

“The last case I had like this, the patient ended up having an aortic dissection.”

We hear statements like this on a daily basis in medicine, and whether or not we are cognizant of it, they may significantly influence our practice. Anecdotal evidence is data garnered from stories or experiences; in a medical context, it is often based on one (or a handful of) patient interactions [1]. After seeing a rare disease, or missing a potentially dangerous diagnosis, we are naturally inclined to over-investigate that entity, regardless of what the evidence would suggest we do. We regularly hear anecdotes from our patients as well: “my friend had pain like this, they could never figure out what it was – and then she was diagnosed with cancer.”

As a social species, we embrace such anecdotes because storytelling is an innate ability of ours: anecdotes are easy to comprehend, while scientific evidence is difficult to grasp. From an evolutionary perspective, it is advantageous to learn from experience; by touching a fire and finding it hot and painful, we learn not to do so again. In the same way, it is in our nature to be more cautious with particular diagnoses or presentations after we get ‘burned’ by missing a diagnosis or making a mistake (or even hearing about one).

Heuristically thinking

There have been many psychological theories of cognitive processing, of “thinking about how we think.” One of the most widely used models is the concept of heuristics [2]. A heuristic can be thought of as a cognitive shortcut to problem solving, or “thinking fast.” However, heuristics can produce incorrect answers or assumptions because of cognitive biases, and anecdotal evidence is fraught with these biases.

One of the major heuristic biases implicit in anecdotal evidence is the availability bias: when an infrequent event is easily or vividly recalled, individuals tend to overestimate its power and frequency. For example, natural disasters and shark attacks are incredibly rare compared with relatively mundane entities such as strokes, diabetes, and hypertension, yet because they are vivid, they spring to mind quickly. Other biases that affect anecdotes, including reporting bias, confirmation bias, regression to the mean, and confounding, are discussed below [3-6].

Anecdotal Evidence

Does anecdotal evidence have a role in medicine?

Evidence-based medicine is the current cornerstone of medical research, yet the term was not coined in the literature until 1992 [7]. Decades ago, scientists did not conduct multicentre randomized controlled trials; their data were based on case series and anecdotal evidence. Similarly, early physicians treated patients on the basis of anecdote (which is likely why heroin was used as a cough suppressant for children in the early 1900s [8]).

Given the potential biases and heuristic flaws inherent in anecdotal evidence, it is difficult to say that it has a role to play in modern-day medicine. However, anecdotal data may play a role in highlighting rare cases, demonstrating that outliers are within the realm of possibility and worth considering. Every physician has had a case in which the patient did not fit the “textbook” description of a particular disease. Experienced clinicians, who have a wealth of anecdotes at their disposal, may come to the correct diagnosis more readily because such outliers are within their realm of experience.

It is important to differentiate the influence of clinical gestalt from that of anecdotal evidence. Clinical gestalt, a physician’s overall impression of a patient’s presentation, is a highly useful tool. However, it is imperative that physicians recognize and separate out the influence of anecdotal evidence on their gestalt, as there is potential for harm when a physician discounts both the evidence and their gestalt in order to operate on anecdotes.

Patient perceptions

Even if physicians are aware of the influence of anecdotal evidence on their practice, patients often base their expectations on their own anecdotes. Patients generally do not read medical trials, and even those who do may struggle to interpret the results without context. Unfortunately, we often do a poor job of explaining the evidence to them.

Patients are much more likely to engage in shared decision making (one way of mitigating the impact of anecdotal evidence) if they understand both the risks and the benefits of potential investigations and treatments [9]. For example, aortic dissection is a rare but nebulous entity, and significant practice variation exists among the physicians trying to diagnose it. A patient might wish to forgo a CT scan of the chest if they were aware of their (generally low) pretest probability weighed against the radiation risk of the scan.
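To make that conversation concrete, pretest probability can be walked through explicitly. The sketch below is purely illustrative and not drawn from this article or any validated decision rule; the pretest probability and likelihood ratio are hypothetical placeholders, used only to show how a low pretest probability keeps the post-test probability low.

```python
# Minimal sketch: converting a pretest probability through a likelihood ratio.
# All numbers are hypothetical placeholders, not validated clinical figures.

def post_test_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: post-test odds = pretest odds x likelihood ratio."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    post_test_odds = pretest_odds * likelihood_ratio
    return post_test_odds / (1 + post_test_odds)

if __name__ == "__main__":
    pretest = 0.005            # hypothetical 0.5% pretest probability of dissection
    negative_test_lr = 0.1     # hypothetical likelihood ratio for a negative screening test
    post = post_test_probability(pretest, negative_test_lr)
    print(f"Post-test probability after a negative test: {post:.3%}")
    # With a low pretest probability, the post-test probability becomes vanishingly
    # small, which is the kind of number a patient can weigh against CT radiation risk.
```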

From a communication standpoint, our explanations may be aided by [9]:

  • Avoiding purely descriptive terms
  • Using common, everyday vocabulary
  • Offering both positive and negative outcomes
  • Using absolute numbers rather than relative risks or odds ratios (a worked example follows this list)
  • Showing visual aids to illustrate probabilities
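As a worked example of the absolute-versus-relative point, the short sketch below uses purely hypothetical event rates (2% of patients with the outcome without treatment, 1% with treatment) to show how different the same benefit sounds when framed as a relative risk reduction versus an absolute risk reduction or a number needed to treat.

```python
# Hypothetical event rates, chosen only to illustrate risk framing.
control_rate = 0.02     # 2 in 100 patients have the outcome without treatment
treatment_rate = 0.01   # 1 in 100 patients have the outcome with treatment

relative_risk_reduction = (control_rate - treatment_rate) / control_rate
absolute_risk_reduction = control_rate - treatment_rate
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")  # sounds dramatic: "50% lower risk"
print(f"Absolute risk reduction: {absolute_risk_reduction:.0%}")  # plainer: 1 fewer event per 100 patients
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")   # 100 patients treated to prevent 1 event
```

Framed in absolute terms, the same result supports a very different conversation than “cuts your risk in half.”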

These strategies may help patients come to terms with the risks associated with a particular investigation or treatment (or the lack of one), and may help them set aside anecdotal evidence and buy in to an evidence-based approach.

Closing thoughts

The influence of anecdotes on medical practice is hard to tease out. It is important for us to be cognizant of our experiences and of how they influence our practice. While taking our experiences into account, we must not let anecdotal evidence override evidence-based medicine as the cornerstone of our practice.

Please share your thoughts with us below.


Biases

Reporting Bias

Anecdotal stories are subject to a multitude of biases. Reporting bias represents the “selective revealing or suppression of information” [3] by subjects, whether intentional or not. One way to conceptualize this is that the dead do not talk: a patient who succumbs to their illness is unable to discuss the efficacy (or lack thereof) of their particular treatment. Similarly, when patients do not derive a benefit from a therapy, it is unlikely to make headlines. Dr. Oz is an excellent example of the power of reporting bias, as he has often claimed that particular nutritional supplements were “miracle cures”; he justified these statements on the premise that the word “miracle” was simply a descriptive term for anecdotal evidence.

Confirmation Bias

Confirmation bias represents the natural tendency to select or interpret information that affirms one’s underlying hypothesis or beliefs, while minimizing the contributions of other possibilities [4]. This is demonstrated by the anti-vaccination movement, in which anecdotal evidence is used to confirm the belief that vaccines cause autism, while scientific data demonstrating the contrary are discredited.

Regression to the mean

Regression to the mean is a statistical phenomenon whereby an extreme value on a first measurement tends to lie closer to the average on a second measurement [5]. In a medical context, diseases often have a fluctuating course, and our perception of a successful treatment may simply reflect the disease in flux. For example, a patient with acne may credit a particular soap or ointment with resolution of the acne, when it would have improved on its own regardless.
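A small simulation can make this concrete. The sketch below uses arbitrary, made-up numbers: patients are “selected” because their first measurement looks extreme, and their second measurement drifts back toward the average even though nothing was done, which is easy to misread as a treatment effect.

```python
import random

random.seed(0)

TRUE_MEAN, TRUE_SD = 50.0, 10.0   # spread of patients' underlying severity (arbitrary units)
NOISE_SD = 10.0                   # day-to-day fluctuation / measurement noise

def measure(true_value: float) -> float:
    """One noisy observation of a patient's underlying severity."""
    return true_value + random.gauss(0.0, NOISE_SD)

# Simulate a population and a first measurement for everyone.
patients = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(20_000)]
first = [(true, measure(true)) for true in patients]

# Select only those whose first measurement was extreme (e.g., a bad flare-up).
selected = [(true, m1) for true, m1 in first if m1 > 70]

# Measure the same patients again with no intervention whatsoever.
second = [measure(true) for true, _ in selected]

mean_first = sum(m1 for _, m1 in selected) / len(selected)
mean_second = sum(second) / len(second)
print(f"Mean first measurement (selected):      {mean_first:.1f}")
print(f"Mean second measurement (no treatment): {mean_second:.1f}")
# The second average falls back toward 50 without any therapy: the pattern that an
# anecdote can easily attribute to whatever was tried in between.
```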

Confounding

Oftentimes a disease is self-limiting and will improve on its own without any intervention (or rather, any intervention will have no effect). Without comparison to a control, it is difficult to judge whether a particular therapy is truly efficacious, because the illness would have resolved on its own [6]. An excellent example of this is the common cold: pharmacies are overstocked with cold medications, many people have their own home remedies, and yet there is no scientific evidence to support the use of any of them. The perception that these treatments are effective arises only because the disease is self-limiting and was going to improve regardless of the therapy used.



References

  1. Nunn R. Mere anecdote: evidence and stories in medicine. Journal of Evaluation in Clinical Practice. 2011;17:920-926.
  2. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
  3. Porta M. A Dictionary of Epidemiology. Oxford University Press; 2008. p. 275. ISBN 978-0-19-157844-1. Retrieved 23 January 2016.
  4. Plous S. The Psychology of Judgment and Decision Making. 1993. p. 233.
  5. Bland JM, Altman DG. Regression towards the mean. BMJ. 1994;308:1499.
  6. Pannucci CJ, Wilkins EG. Identifying and avoiding bias in research. Plastic and Reconstructive Surgery. 2010;126(2):619-625.
  7. Guyatt GH, Sackett DL. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA. 1992;268(17):2420-2426.
  8. Agnew J. Medicine in the Old West: A History, 1850-1900. Jefferson, NC: McFarland and Company; 2010.
  9. Paling J. Strategies to help patients understand risks. BMJ. 2003;327(7417):745-748.

Shahbaz Syed

FRCPC Emergency Medicine Physician at the University of Ottawa, with a fellowship in Digital Scholarship and a special interest in rational resource utilization. He also serves as an editor for CanadiEM and is the junior social media editor for CJEM.

Frontdoor 2 Healthcare

Frontdoor2Healthcare, founded by Dr. Edmund Kwok in 2012, provides editorials and commentary on issues affecting Canadian healthcare from the emergency department’s “front door” perspective. Frontdoor posts allow for open sharing of the diverse opinions and perspectives of emergency physicians from across the country.
