
ImageSim: Building Competency For Visually Diagnosed Tests in Emergency Medicine

In Knowledge Translation, Medical Concepts by Kathy Boutis

Visually diagnosed medical tests (e.g. radiographs, electrocardiograms) are the most commonly ordered tests in front-line medicine. Front-line health care professionals are therefore expected to interpret these images at an expert performance level by the time their opinions guide patient management decisions. However, discordant interpretations between front-line physicians and their expert counterparts (radiologists, cardiologists) are a common cause of medical error1–9. In pediatrics, this problem is even greater: physiology changes with age, increasing the risk of interpretation errors.

Currently, most approaches to learning the interpretation of medical images consist of case-by-case exposure in clinical settings and tutorials that are either didactic or present cases passively on-line. However, clinical studies examining the accuracy of front-line physicians' image interpretations have not shown these strategies to be optimally effective. Furthermore, many continuing medical education activities emphasize clinical knowledge, provide no opportunities for feedback, and require little more than documentation of attendance, which limits the potential for improvement in the practicing physician10.

To bridge this knowledge-practice gap, we developed a medical education research program that answered a number of important research questions on how to make on-line learning effective for the emergency medicine physician11–16.

After 11 years of research, we have translated this evidence into a validated medical image interpretation learning system called ImageSim. This is a non-profit, 24/7 on-line learning program offered by the Hospital for Sick Children, University of Toronto, as a residency training and Continuing Professional Development course. ImageSim teaches health care professionals the interpretation of visually diagnosed medical tests using the concepts of deliberate practice and cognitive simulation. That is, our learning model involves sustained, active practice across hundreds of cases: the learner makes the diagnosis for every case and then receives immediate, case-specific feedback on their interpretation, learning instantly from each case (Figure 1; a brief code sketch of this loop follows the figure).


Figure 1: After the learner interprets each case, the learning system provides instant text and visual feedback.
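
For readers who think in code, the case-feedback loop described above can be sketched as follows. This is a minimal illustration only: the Case fields and the get_learner_answer and show_feedback callbacks are hypothetical names chosen for this sketch, not ImageSim's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Case:
    image_path: str        # the radiograph (or other image) to interpret
    truth_abnormal: bool   # gold-standard (expert) interpretation
    teaching_point: str    # case-specific feedback shown after answering

def run_session(cases: Iterable[Case],
                get_learner_answer: Callable[[str], bool],
                show_feedback: Callable[[str, bool], None]) -> None:
    """Deliberate-practice loop: the learner commits to a diagnosis on
    every case, then immediately receives case-specific feedback."""
    for case in cases:
        answer = get_learner_answer(case.image_path)  # forced choice: abnormal?
        correct = (answer == case.truth_abnormal)
        show_feedback(case.teaching_point, correct)   # instant feedback per case
```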

This way, we embed assessment for learning rather than the usual approach, which is assessment of learning. Importantly, we present these images as we encounter them in practice, with a normal-to-abnormal radiograph ratio (and a spectrum of pathology) reflective of day-to-day practice. After about 55 cases, the learner receives a running tabulation of performance measured as accuracy, sensitivity, and specificity. Participant performance is compared to benchmarks for graduating residents (Bronze), emergency physicians (Silver), and specialized experts (Gold). See a demonstration here.
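
As a concrete illustration of that performance tabulation, here is how accuracy, sensitivity, and specificity are conventionally computed from a learner's answers. The formulas are the standard definitions; the Tally class and its field names are assumptions made for this sketch, not ImageSim's code.

```python
from dataclasses import dataclass

@dataclass
class Tally:
    tp: int = 0  # abnormal case, called abnormal
    tn: int = 0  # normal case, called normal
    fp: int = 0  # normal case, called abnormal
    fn: int = 0  # abnormal case, called normal

    def record(self, truth_abnormal: bool, called_abnormal: bool) -> None:
        if truth_abnormal and called_abnormal:
            self.tp += 1
        elif truth_abnormal:
            self.fn += 1
        elif called_abnormal:
            self.fp += 1
        else:
            self.tn += 1

    @property
    def accuracy(self) -> float:    # proportion of all cases called correctly
        return (self.tp + self.tn) / (self.tp + self.tn + self.fp + self.fn)

    @property
    def sensitivity(self) -> float: # proportion of abnormal cases detected
        return self.tp / (self.tp + self.fn)

    @property
    def specificity(self) -> float: # proportion of normal cases correctly cleared
        return self.tn / (self.tn + self.fp)
```

Comparing a learner's running tally against a benchmark is then a one-line check, e.g. tally.accuracy >= 0.80 for a hypothetical Bronze threshold (the actual benchmark values are ImageSim's and are not reproduced here).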

Our research to date shows that this learning method works: physicians at varying levels of expertise significantly increased their accuracy in image interpretation along a "learning curve", as graphed below:

[Graph: image interpretation accuracy improving along a learning curve]
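
The "learning curve" language comes from this group's learning-curve research (references 13 and 14), where accuracy is commonly modelled as rising toward an asymptote with practice. As a hedged sketch only, the code below fits a negative-exponential learning curve to synthetic running-accuracy data; the numbers are invented for illustration and are not ImageSim results.

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(n, asymptote, start, rate):
    """Negative-exponential model: accuracy rises from `start`
    toward `asymptote` as the number of practiced cases n grows."""
    return asymptote - (asymptote - start) * np.exp(-rate * n)

# Synthetic running-accuracy data (proportion correct, sampled every 50 cases).
cases = np.array([50, 100, 150, 200, 250, 300, 350, 400])
accuracy = np.array([0.62, 0.70, 0.75, 0.79, 0.82, 0.84, 0.85, 0.86])

params, _ = curve_fit(learning_curve, cases, accuracy, p0=[0.90, 0.55, 0.01])
asymptote, start, rate = params
print(f"asymptote={asymptote:.2f}, start={start:.2f}, rate={rate:.4f}")
```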

Active courses are available here and currently include pediatric musculoskeletal injuries (7 modules, 200-400 cases per module, 2,100 cases total) and pediatric chest radiographs (450 cases). To be released early in 2018 are pre-pubertal female genital examination (150 cases) and pediatric point of care ultrasound (400 cases).

ImageSim has been accredited for Level 3 CME credits by the Royal College of Physicians and Surgeons of Canada and Level 2 CME credits by the College of Family Physicians of Canada. Registration information can be found here.

This post was copyedited and uploaded by Kevin Durr.


References

1. Espinosa J, Nolan T. Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ. 2000;320(7237):737-740. [PubMed]
2. Fleisher G, Ludwig S, McSorley M. Interpretation of pediatric x-ray films by emergency department pediatricians. Ann Emerg Med. 1983;12(3):153-158. [PubMed]
3. Hallas P, Ellingsen T. Errors in fracture diagnoses in the emergency department–characteristics of patients and diurnal variation. BMC Emerg Med. 2006;6:4. [PubMed]
4. Klein E, Koenig M, Diekema D, Winters W. Discordant radiograph interpretation between emergency physicians and radiologists in a pediatric emergency department. Pediatr Emerg Care. 1999;15(4):245-248. [PubMed]
5. Margolis S, Nilsson K, Reed R. Performance in reading radiographs: does level of education predict skill? J Contin Educ Health Prof. 2003;23(1):48-53. [PubMed]
6. Nesterova G, Leftridge C, Natarajan A, Appel H, Bautista M, Hauser G. Discordance in interpretation of chest radiographs between pediatric intensivists and a radiologist: impact on patient management. J Crit Care. 2010;25(2):179-183. [PubMed]
7. Schuh S, Lalani A, Allen U, et al. Evaluation of the utility of radiography in acute bronchiolitis. J Pediatr. 2007;150(4):429-433. [PubMed]
8. Walsh-Kelly C, Melzer-Lange M, Hennes H, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med. 1995;13(3):262-264. [PubMed]
9. Wei C, Tsai W, Tiu C, Wu H, Chiou H, Chang C. Systematic analysis of missed extremity fractures in emergency radiology. Acta Radiol. 2006;47(7):710-717. [PubMed]
10. Moore D, Green J, Gallis H. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1-15. [PubMed]
11. Boutis K, Pecaric M, Seeto B, Pusic M. Using signal detection theory to model changes in serial learning of radiological image interpretation. Adv Health Sci Educ Theory Pract. 2010;15(5):647-658. [PubMed]
12. Boutis K, Pecaric M, Shiau M, et al. A hinting strategy for online learning of radiograph interpretation by medical students. Med Educ. 2013;47(9):877-887. [PubMed]
13. Pusic M, Boutis K, Hatala R, Cook D. Learning curves in health professions education. Acad Med. 2015;90(8):1034-1042. [PubMed]
14. Pusic M, Pecaric M, Boutis K. How much practice is enough? Using learning curves to assess the deliberate practice of radiograph interpretation. Acad Med. 2011;86(6):731-736. [PubMed]
15. Pusic M, Andrews J, Kessler D, et al. Prevalence of abnormal cases in an image bank affects the learning of radiograph interpretation. Med Educ. 2012;46(3):289-298. [PubMed]
16. Pusic M, Chiaramonte R, Gladding S, Andrews J, Pecaric M, Boutis K. Accuracy of self-monitoring during learning of radiograph interpretation. Med Educ. 2015;49(8):838-846. [PubMed]
Kathy Boutis

Dr. Kathy Boutis is a pediatric emergency physician at The Hospital for Sick Children (SickKids), a Senior Associate Scientist in the Child Health Evaluative Sciences Program at SickKids Research Institute, and an Associate Professor with the University of Toronto.
Martin Pusic

Dr. Martin Pusic is a pediatric emergency physician, an Associate Professor at the Ronald O. Perelman Department of Emergency Medicine, and the Director of the Division of Learning Analytics Education and Training at NYU Langone Health.