CAEP GeMES | Workplace-Based Assessment Design Optimization

In Great Evidence in Medical Education Summary (GEMeS) by Jeffrey Landreville

It’s the end of another long Emergency Department shift, and Dr. Wu’s resident approaches him to complete her shift evaluation form. She had a solid shift. Dr. Wu, who self-admittedly finds every post-shift debrief awkward, gives his usual bland assessment (“You are right on track”) and cursory feedback (“Continue reading around cases”). Later that evening, Dr. Wu thinks back on the interaction and feels a bit guilty. He wonders how he might give his residents more valuable assessments.

As Competency-Based Medical Education becomes the norm, valid and reliable assessment will be paramount to ensuring that residents meet their competencies. This “Great Evidence in Medical Education Summary” (GEMeS – pronounced “gems”), titled “Workplace-based assessment design optimization,” was originally posted by the CAEP EWG GEMeS Team in July 2016 and answers the question: “Workplace-based assessment (WBA) often relies on judgements made on rating scales. Given that judgements are subjective, are there WBA design characteristics that can be modified to optimize validity and reliability?” (1) A PDF version of the GEMeS summary is available here.

Article Reference

Crossley, J., & Jolly, B. (2012). Making sense of work-based assessment: Ask the right questions, in the right way, about the right things, of the right people. Medical Education, 46(1), 28–37. PMID: 22150194 http://www.ncbi.nlm.nih.gov/pubmed/22150194

Why is it relevant to Emergency Medicine education?

There exists an intimate relationship between learning and assessment (2). WBAs have become an increasingly important means of assessing trainees in day-to-day clinical practice. Further, WBA plays an important role in reporting expert judgments of trainee competence. Developing an understanding of how to optimize the validity and reliability of WBAs is a critical step to improving the utility of these forms of assessment.

Level of Evidence

Not applicable

Funding Sources

None

Study Design and Setting

The authors present evidence from selected studies, in combination with expert opinion, to produce this think piece. The work was conducted at the Academic Unit of Medical Education, University of Sheffield, Sheffield, United Kingdom. The learner levels addressed included undergraduate (UGME), postgraduate (PGME), and continuing medical education (CME).

Synopsis

In this selective narrative review, the authors begin by arguing that most WBAs demonstrate poor psychometric properties, are vulnerable to assessor differences, and do not discriminate between trainees. From this starting point, they identify features of WBA design that may contribute to improved validity and reliability.

Four general principles emerged from their review:

  1. The response scale should be aligned to the reality map of the judges (i.e., anchors that resonate with assessors’ experiences might be a more profitable avenue of exploration than abstract descriptors such as “satisfactory”)
  2. Judgements rather than objective observations should be sought (i.e., subjective judgments about outcome-level performance may result in more assessor agreement than objective responses about what actually took place)
  3. The assessment should focus on competencies that are central to the activity observed (i.e., assessment tools should target those domains of performance that are clearly demonstrated in the activity being observed)
  4. The assessors who are best placed to judge performance should be asked to participate (i.e., individuals who have the competence to judge an aspect of performance, and who have had the opportunity to observe it, should be asked to assess the trainee)

BOTTOM LINE

In this article, the authors suggest four general principles that might enhance the reliability and validity of WBA: (1) the response scale should be aligned to the reality map of the judges (i.e., scales should be clinically anchored); (2) judgements rather than objective observations should be sought; (3) the assessment should focus on competencies that are central to the activity observed; and (4) the individuals who are best placed to judge performance should be asked to participate.

References

1. Crossley J, Jolly B. Making sense of work-based assessment: Ask the right questions, in the right way, about the right things, of the right people. Med Educ. 2012;46(1):28-37. doi:10.1111/j.1365-2923.2011.04166.x.

2. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9):855-871. doi:10.1080/01421590701775453.

One of the more powerful WBAs is direct observation of learners by preceptors on shift. Have your departments incorporated direct observation or other WBA methods? Share successes and challenges!

More About the CAEP GEMeS

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Great Evidence in Medical Education Summaries (GEMeS) project sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive GEMeS each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case/contextualizing concept to highlight some recent medical education literature that is relevant to our nation’s teachers.

Jeffrey Landreville

Jeff is currently a senior resident in the FRCPC Emergency Medicine Program at the University of Ottawa. He is interested in medical education and plans to pursue a career as a clinician educator. Current research interests include Emergency Medicine Oral Case Presentations.

Warren Cheung

Warren holds a Junior Clinical Research Chair in Medical Education and is an Assistant Professor in the Department of Emergency Medicine at the University of Ottawa. His interests and training include education research and medical education.

Daniel Ting

Daniel Ting is a PGY-5 Emergency Medicine resident physician at the University of British Columbia (Interior Site). He was the 2017-18 CanadiEM Digital Scholars Fellow and the inaugural Editorial Intern at the Canadian Journal of Emergency Medicine. Daniel leads the “Great Evidence in Medical Education Summary” and “Feature Educational Innovation” series on CanadiEM.