CAEP GEMeS | POCUS Learning Curve

In Great Evidence in Medical Education Summaries (GEMeS) by Kristen Weersink

Geoff is a first-year Emergency Medicine resident on his ultrasound rotation. Over the course of four weeks, he takes big steps toward competency in point-of-care ultrasound (POCUS). He finds most of the scans quite easy. However, Geoff finds the obstetrical scans particularly tricky, especially given the high stakes of a misinterpreted scan when ruling out ectopic pregnancy. It strikes him as somewhat counterintuitive that the obstetrical application requires the same number of scans as the other core areas.

As competency-based education becomes the norm, mandatory skill certification is likely to become more prevalent. In POCUS, the Canadian Emergency Ultrasound Society (CEUS) requires 50 scans in each core area to attain CEUS Independent Practitioner status. This “Great Evidence in Medical Education Summary” (GEMeS – pronounced “gems”) was originally posted by the CAEP EWG GEMeS Team on April 11, 2016 and answers the question: “What is the bedside ultrasound (US) learning curve of Emergency Medicine (EM) trainees, and when do they reach a performance plateau?” A PDF version of the GEMeS summary is available here.

Education Question or Problem:

What is the bedside ultrasound (US) learning curve of Emergency Medicine (EM) trainees, and when do they reach a performance plateau?

Bottom Line:

The current recommendation of 50 scans as a marker of competency may underestimate the experience needed to achieve good sensitivity and specificity in the majority of EM ultrasound (US) examinations, using expert ED ultrasonographers as the criterion standard.
DETAILS
Reference
Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med 2015; 22:574-82. doi: 10.1111/acem.12653
Study Design
This was a retrospective review of an educational database from a single EM residency training program over 5 years. Each examination was scored for agreement between initial interpretation and expert final review. Novice ultrasonographers’ learning curves were plotted to establish the performance plateau for each type of US examination studied.
Funding Sources
None reported.
Setting
Four different emergency departments (EDs) under one EM residency training program in the United States, ranging from a small community ED to a level I trauma center.
Level of Learning
All levels of residents and attending physicians in EM without prior US training or experience.

Synopsis of Study

A retrospective review of an educational database of 52,408 US examinations was undertaken to characterize the learning curves of novice ultrasonographers and to compare their skills with those of expert ED ultrasonographers. Digital video recordings of every US examination were reviewed by unblinded physician experts for image interpretation, image acquisition skill, and resultant image quality using a center-specific standardized rating scale. Trainees received immediate feedback via email throughout the study period. US examinations included aorta, cardiac, chest wall, endovaginal uterine, focused assessment with sonography in trauma (FAST), lower extremity duplex, renal, right upper quadrant, and soft tissue. Learning curves for each application were analyzed to determine the plateau point at which experiential benefit diminished and were compared to an expert reference curve.
Overall image quality and learning curves for image acquisition and interpretation differed by US application, with most displaying a slow, steady improvement until a plateau point. In summary:
  • Performance plateaus for image interpretation occurred later in FAST (57), chest wall (60), renal (78), and aorta (66) examinations compared to both cardiac (30) and soft tissue (27) examinations.
  • Endovaginal uterus and lower extremity duplex did not have definable plateau points in the present study.
  • All US protocols had excellent specificity.
  • Excluding the FAST exam, a threshold of 50 ultrasound scans for each protocol was found to yield a sensitivity of 84% and a specificity of 90%.
  • For the FAST exam, 50 scans yielded a sensitivity of 80% and a specificity of 96%.

Why is it relevant to Emergency Medicine Education?

POCUS has a multitude of clinical applications, has become part of the standard practice of EM, and is an integral part of most EM residency training programs. Current guidelines use the number of scans performed as a measure of competency, based on expert opinion and consensus alone, with little supporting evidence.1,2 In the present study, Blehar et al. explore the learning curves associated with image acquisition and interpretation in POCUS to better understand the threshold at which competency can be achieved and how best to assess this skill.
The performance plateaus found in this study offer a guide to understanding skill acquisition over time in novice ultrasonographers and a potential threshold beyond which little improvement occurs with additional hands-on experience. The current recommendation of 25-50 scans1,2 is sufficient for some examination types (e.g., soft tissue, cardiac) but too low for others (e.g., renal, aorta). The authors conclude that 50-75 scans is a reasonable benchmark for achieving excellent image acquisition and interpretation in the majority of examinations. This study highlights some of the limitations of using a fixed number of scans as the measure of competence in POCUS and offers more information on how a given level of experience translates into a predicted level of performance.

References

1. Emergency ultrasound guidelines. Ann Emerg Med 2009; 53:550-70.
2. Lewiss RE, Pearl M, Nomura JT, et al. CORD-AEUS: consensus document for the emergency ultrasound milestone project. Acad Emerg Med 2013; 20:740-5.
3. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med 2015; 22:574-82.

Many EM skills do not currently require certification for independent practice. Do you anticipate increased skill certification with competency-based medical education? Would this be a good or bad thing?


More About the CAEP GEMeS

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Great Evidence in Medical Education Summaries (GEMeS) project sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive GEMeS each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case/contextualizing concept to highlight some recent medical education literature that is relevant to our nation’s teachers.


Kristen Weersink

Kristen is a PGY2 in Emergency Medicine at Queen's University. She has an interest in education and is helping with the transition of emergency medicine to CBME at Queen's and across the country.

Andrew Hall

Andrew Hall is Assistant Professor in the Department of Emergency Medicine at Queen’s University. He is a simulation-based resuscitation rounds instructor and runs the simulation-based OSCE assessment program for EM residents. Additionally, he is the CBME Lead for the FRCPC-EM training program at Queen’s.

Daniel Ting

Daniel Ting is an Emergency Physician and Clinical Assistant Professor at the University of British Columbia, based in Vancouver. He is the Editor-in-Chief of CanadiEM and a Decision Editor at the Canadian Journal of Emergency Medicine. He completed the CanadiEM Digital Scholarship Fellowship in 2017-18. No conflicts of interest (COI).