CAEP GEMeS | Workplace-based assessments in competency-based education

In Great Evidence in Medical Education Summaries (GEMeS) by Andrew Dixon

Aurora, a junior resident, is in a meeting with her residency training committee when the topic of competency-based education arises. Her institution will implement competency-based training starting next year, and her program director has major reservations. Her program director says, “I can’t see how we can have time to evaluate all of these tasks for learners while we are supposed to simultaneously manage a complex and busy department!” Aurora wonders what these evaluations might look like and whether they would be pragmatic and worthwhile.

Competency-based education will revolutionize much of residency training across Canada. Although some tasks in Emergency Medicine are hard to evaluate, many other learning competencies can be defined and evaluated in a short period of time. For these tasks, what is the value of specific assessments through which faculty give observed feedback to learners? Might these assessments identify weaknesses, speed up the learning curve, and help shorten residency training? This “Great Evidence in Medical Education Summary” (GEMeS – pronounced “gems”) was originally posted by the CAEP EWG GEMeS Team on May 15, 2015 and answers the question: “In the context of a competency-based model, does the development of workplace-based assessments improve the quality of resident feedback?” A PDF version of the GEMeS summary is available here.

Education Question or Problem

Generalist programs such as emergency medicine (EM) require a substantial number of workplace-based assessments to evaluate residents in a competency-based model. Does the development of such assessments improve the quality of resident feedback?

Bottom Line 

Workplace-based assessments completed on a daily, per-shift basis clearly improve the quality of end-of-rotation reports and appear to increase the frequency of daily formative feedback.
DETAILS
Reference
Chan T, Sherbino J; McMAP Collaborators. The McMaster Modular Assessment Program (McMAP): A Theoretically Grounded Work-Based Assessment System for an Emergency Medicine Residency Program. Acad Med. 2015 Jul;90(7):900-5. PMID: 25881648. DOI: 10.1097/ACM.0000000000000707.
Study Design
Single-centre pre- and post-intervention evaluation of rotation report quality.
Funding Source
McMaster University Division of Emergency Medicine.
Setting
McMaster University Emergency Medicine Residency Training Program.
Level of Learning
Postgraduate Years 1 and 2.

Synopsis of Study

The group developed a series of 52 EM-specific Work-Based Assessment (WBA) instruments, structured as partial mini-clinical evaluation exercises, which were reviewed and adapted by a panel of American and Canadian staff physicians and residents. The panel matched each instrument to specific CanMEDS or ACGME competencies, checked for relevance, and dropped 10 of the WBA tools in the process.

On every shift, residents were observed by the attending physician, who rated their performance of a specific defined task as well as their global performance. Observation and documentation took approximately 5-10 minutes per shift. Narrative comments were required to augment the numerical scores and to stimulate formative end-of-shift feedback.

The assessments were entered into a central online portal and used to create a qualitative end-of-rotation report discussing 1) specific task performance, 2) global performance, and 3) tailored continuous-improvement advice.

To determine efficacy, two independent reviewers compared 25 pre-intervention end-of-rotation reports with 25 post-intervention reports using the Completed Clinical Evaluation Report Rating (CCERR) tool, which has been validated across a wide range of specialties. Quality scores doubled from 13.8/45 to 27.5/45 (p < 0.001), all nine item sub-scores increased significantly, and resident focus groups reported a greater incidence of formative feedback.

Together, these findings indicate that rotation reports generated from multiple directed daily evaluations are superior to those produced by the traditional system of single-faculty recall.

Why is it relevant to Emergency Medicine Education?

This study shows that defining clinically identifiable EM tasks, and using them as the basis for providing feedback, can improve the quality of that feedback. All programs struggle to provide effective feedback to learners. This educational development offers a fresh way of providing feedback now, and indicates a reasonable path toward true competency-based emergency medicine education.

What barriers do you perceive in the implementation of competency-based education in Emergency Medicine training? What are the biggest advantages of competency-based training?

More About the CAEP GEMeS

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Great Evidence in Medical Education Summaries (GEMeS) project sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive GEMeS each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case/contextualizing concept to highlight some recent medical education literature that is relevant to our nation’s teachers.

Andrew Dixon

Andrew Dixon is a pediatrician at the University of Alberta. He has many interests in education, including simulation feedback, fracture management and patient education.
Daniel Ting

Daniel Ting is an Emergency Physician and Clinical Assistant Professor at the University of British Columbia, based in Vancouver. He is the Editor-in-Chief of CanadiEM and a Decision Editor at the Canadian Journal of Emergency Medicine. He completed the CanadiEM Digital Scholarship Fellowship in 2017-18. No conflicts of interest (COI).