
CAEP FEI | Competency-based Assessment of Resuscitation Skills

In Featured Education Innovations (FEI) by Andrew Hall

Dougie, a junior Emergency Medicine resident, has just passed his LMCC Part II examination, in which he performed a series of standardized histories and physical exams with actors. Although he felt that not every scenario was particularly relevant to his future practice, he did see some benefit in the standardized assessment of competence. One station reminded him of a simulation lab case, which made him wonder: “Why doesn’t my sim lab also use standardized assessments?”

Last week, we posted a medical student’s perspective on the changes that competency-based medical education (CBME) will bring. In this post, we take a different perspective: that of the faculty leading these changes, and how they hope to make the transition a successful one. This Feature Educational Innovation (FEI), titled “Competency-based Assessment of Resuscitation Skills by Simulation-based OSCE using the Queen’s Simulation Assessment Tool (QSAT),” was originally posted by the CAEP EWG FEI Team on April 17, 2015 and answers the question: “Is there a simple way to standardize assessment in simulation scenarios?” A PDF version is available here. A CAEP cast is available here.

Description of the Innovation

Background and Goals 

Assessment of clinical expertise in postgraduate medical education is moving away from knowledge-based examinations towards competency-based assessments.1 The use of high-fidelity simulation is emerging as an effective approach to competency-based assessment.2 Over a four-year period, we developed and validated a modifiable, anchored global assessment scoring tool for simulation-based Objective Structured Clinical Examinations (OSCEs) of resuscitation competence in postgraduate emergency medicine (EM) trainees: the Queen’s Simulation Assessment Tool (QSAT).

Methods

The Department of EM at Queen’s University implemented a longitudinal simulation-based resuscitation curriculum that has employed twice-yearly simulation-based OSCEs to assess resident performance in resuscitation since 2009. In these exams, 20 to 25 EM residents are individually presented with 2 or 3 resuscitation scenarios and debriefed by a faculty member immediately following each performance.

The QSAT was developed for use in these exams using a modified Delphi technique with a panel of EM physicians. It is a hybrid scoring tool composed of four anchored domain scores and an overall global assessment score. The QSAT is unique in its simple, compact, generic structure, which can be easily modified for use in any resuscitation scenario. The figure below demonstrates the generic QSAT and a scenario-specific QSAT with modified anchors.
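For readers who like to see structure as code, here is a minimal sketch of a QSAT-style score sheet in Python. It is an illustration only: the domain names and the 1–5 anchored scale are assumptions inferred from the description above, not a reproduction of the published instrument.

```python
# A minimal sketch of a QSAT-style score sheet. The domain names and the
# 1-5 anchored scale are ASSUMPTIONS inferred from the description in this
# post, not a reproduction of the published instrument.
from dataclasses import dataclass

ANCHORED_SCALE = range(1, 6)  # 1 = lowest anchor ... 5 = highest anchor

@dataclass
class QSATScore:
    """Four anchored domain scores plus an overall global assessment."""
    primary_assessment: int    # hypothetical domain name
    diagnostic_actions: int    # hypothetical domain name
    therapeutic_actions: int   # hypothetical domain name
    communication: int         # hypothetical domain name
    global_assessment: int     # overall anchored global rating

    def __post_init__(self) -> None:
        # Enforce the anchored 1-5 scale on every score
        for name, value in vars(self).items():
            if value not in ANCHORED_SCALE:
                raise ValueError(f"{name} must be 1-5, got {value}")

# Scenario-specific use: only the behavioural anchor descriptions change
# from scenario to scenario; the compact generic structure stays the same.
score = QSATScore(primary_assessment=4, diagnostic_actions=3,
                  therapeutic_actions=4, communication=5,
                  global_assessment=4)
```

The point of the sketch is the design choice the authors describe: the generic structure is fixed, so only the anchor wording needs rewriting for each new resuscitation scenario.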

Following an initial blueprinting exercise, 10 standardized resuscitation OSCE scenarios were developed and administered to EM trainees. Video-recorded resident performances were scored by multiple blinded EM attending physicians trained in the use of the scenario-specific QSATs. Using the “unified model” of validity argument originally proposed by Messick,3 we designed the QSAT, OSCE stations, and review process according to principles of content and response-process validity, and collected data relating specifically to internal structure validity, relations with other variables (such as level of training), and perceived benefit to learning.

Results

Discriminatory validity (senior vs. junior residents) was excellent, and inter-rater reliability showed acceptable levels of agreement for each scenario. Generalizability studies yielded G-coefficients ranging from 0.67 to 0.84. D-studies suggested that increasing the number of scenarios per OSCE (>6) with a single examiner per station would produce G-coefficients close to 0.90, which would be acceptable for high-stakes examinations. Resident trainees reported being comfortable with assessment in the simulation environment and found the simulation-based examination very valuable to their learning. Detailed descriptions of the QSAT development and validation and of our program of simulation-based OSCE assessment are available.4,5
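To make the D-study logic concrete, here is a brief Python sketch of how a projected G-coefficient rises as scenarios are added, using the standard relative G-coefficient formula for a person-by-scenario design. The variance components below are illustrative values chosen for demonstration, not the study’s actual estimates.

```python
# A sketch of a D-study projection for a person x scenario design with a
# single examiner per station. The variance components below are
# ILLUSTRATIVE values, not the study's actual estimates.

def projected_g(var_person: float, var_error: float, n_scenarios: int) -> float:
    """Relative G-coefficient when averaging over n_scenarios stations."""
    return var_person / (var_person + var_error / n_scenarios)

var_person = 1.0  # between-resident (true score) variance
var_error = 1.0   # person-by-scenario plus residual error variance

for n in (2, 3, 6, 8, 10):
    print(f"{n:>2} scenarios -> projected G = "
          f"{projected_g(var_person, var_error, n):.2f}")
# With these illustrative components, G climbs from ~0.67 at 2 scenarios
# to ~0.90 at 10, mirroring the pattern the D-studies describe: adding
# scenarios averages away scenario-specific error faster than adding raters.
```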

Reflective Critique

In summary, these OSCEs have become an important part of the assessment system within our EM program at Queen’s University and will support our transition to a fully integrated competency-based medical education (CBME) curriculum by July 2017. This assessment methodology has subsequently been evaluated in a multi-center study including 4 other Canadian sites.6 Looking forward, our next step is the administration of a single-center, 8-station, simulation-based resuscitation OSCE examining EM residents from across the country. It is our hope that this form of resuscitation skills assessment will be included in the Royal College’s Competence by Design (CBD)7 project for emergency medicine across Canada.

How have your institutions prepared for CBME? Have you had experience in developing your own assessment tools? Share your tips and tricks!


More About the CAEP FEI

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Feature Educational Innovations project sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive FEI each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case or contextualizing concept, to highlight recent medical education literature that is relevant to our nation’s teachers.

References

  1. Hamstra SJ. Keynote address: the focus on competencies and individual learner assessment as emerging themes in medical education research. Acad Emerg Med 2012; 19: 1336-43.
  2. Sherbino J, Bandiera G, Frank J. Assessing competence in emergency medicine trainees: an overview of effective methodologies. Can J Emerg Med 2008; 10: 365.
  3. Messick S. Validity. In: Linn RL, ed. Educational Measurement, 3rd edition. New York: Macmillan, 1989: 13-103.
  4. Hall AK, Dagnone JD, Lacroix LL, Pickett W, Klinger DA. Queen’s Simulation Assessment Tool (QSAT): Development and Validation of an Assessment Tool for Resuscitation OSCE Stations in Emergency Medicine. Simul Healthc 2015; 10: 67-132.
  5. Hagel CM, Hall AK, Dagnone JD. Queen’s Emergency Medicine Simulation OSCE: an advance in competency-based assessment. Can J Emerg Med (in press).
  6. Dagnone JD, Hall AK, Woolfrey K, Davison C, Moore SM, McNeil G, Ross JR, Pickett W. QSAT – Validation of a Competency-based Resuscitation Assessment Tool – A Canadian Multi-Centered Study [abstract]. Can J Emerg Med 2014; 15: S1.
  7. Harris KA, Frank JR, eds. Competence by Design: Reshaping Canadian Medical Education. The Royal College of Physicians and Surgeons of Canada; 2014. Retrieved Mar 25, 2015, from: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/educational_initiatives/rc_competency-by-design_ebook_e.pdf
Andrew Hall

Andrew Hall is an Assistant Professor in the Department of Emergency Medicine at Queen’s University. He is a simulation-based resuscitation rounds instructor and runs the simulation-based OSCE assessment program for EM residents. Additionally, he is the CBME Lead for the FRCPC-EM training program at Queen’s.

Damon Dagnone

Dr. J Damon Dagnone is an Associate Professor of Emergency Medicine at Queen’s University and is involved in numerous medical education initiatives. Currently he is the Medical Education Fellowship Program Director, Chair of the residency program Competence Committee, and Co-Lead of the Humanity in Healthcare inter-professional series. His main research areas include competency-based medical education, simulation-based assessment, and the medical humanities. In 2018, Damon published his first book, “Finding Our Way Home: A Family’s Story of Life, Love, and Loss,” about losing his son to cancer during his last year of EM residency training. When not in the ER, on Zoom meetings, or exercising outdoors, Damon can be found enjoying his family at home and at his cottage.

Daniel Ting

Daniel Ting is an Emergency Physician and Clinical Assistant Professor at the University of British Columbia, based in Vancouver. He is the Editor-in-Chief of CanadiEM and a Decision Editor at the Canadian Journal of Emergency Medicine. He completed the CanadiEM Digital Scholarship Fellowship in 2017-18. No conflicts of interest (COI).