CAEP GEMeS | The effectiveness of feedback by faculty to learners is often challenged by faculty factors

In Great Evidence in Medical education Summary (GEMeS) by Peter Rogers

Dr. Hoag thoroughly enjoys working with Penelope, one of the senior residents at his hospital. He admires her knowledge, her diligence on shifts, and her involvement in academic pursuits. Outside of the hospital, Dr. Hoag has become a mentor to Penelope and often offers career advice. One day, Dr. Hoag observes Penelope treating a difficult psychiatric patient; uncharacteristically, she loses her cool and lashes out verbally at the patient. After the shift, Dr. Hoag wants to give Penelope feedback but finds it challenging because of their relationship. How can educators help Dr. Hoag?

Unlike many other specialties, Emergency Medicine offers preceptors a daily, shift-based opportunity to give one-on-one feedback to their learners. It is therefore important that emergency physicians develop skills in addressing awkward topics and giving useful feedback. This “Great Evidence in Medical education Summary” (GEMeS, pronounced “gems”) was originally posted by the CAEP EWG GEMeS Team on March 20, 2015 and answers the question: “Does a one-hour education session enable faculty to provide quality feedback to learners in a simulated setting?” A PDF version of the GEMeS summary is available here.

Education Question or Problem

The effectiveness of direct and timely feedback by faculty to learners is often challenged by faculty cognitive biases, time constraints and concerns about harming their relationship with the learner.

Bottom Line

In this study, a short educational intervention was effective in improving faculty feedback and in helping faculty address uncomfortable topics around learner performance in a simulated setting.
Details

Reference
Minehart RD, Rudolph J, Pian-Smith MCM, Raemer DB.
Improving faculty feedback to resident trainees during a simulated case: A randomized, controlled trial of an educational intervention. Anesthesiology. 2014 Jan;120(1):160-71.
Study Design
Randomized, controlled trial of an educational intervention
Funding sources
Foundation for Anesthesia Education and Research (FAER) Research in Education Grant (REG).
Setting
The Center for Medical Simulation. The study was conducted during a recurring, mandatory, simulation-based crisis management course for practicing anesthesiologists from five academic hospitals in greater Boston, Massachusetts.
Level of Learning
Practicing physicians: feedback was given to a simulated resident.

Synopsis of Study

A balanced-randomization (1:1), rater-blinded, parallel-group controlled experiment was conducted during regularly scheduled faculty teamwork and communication simulation sessions. Seventy-one anesthesiologists were randomly assigned to intervention and control groups. The intervention consisted of a one-hour video and role-play workshop on how to resolve the perceived task-versus-relationship dilemma, how to diagnose trainees’ learning needs, and how to address different kinds of errors, including professionalism lapses.

The experimental case scenario consisted of two parts. The first part allowed the participant to observe a simulated resident commit four errors while managing a simulated patient. In the second part, the participant engaged in a feedback conversation with the resident about his/her performance.

Debriefing sessions were rated by four experienced, blinded raters using both a behaviourally anchored rating scale (BARS) and an objective 12-point feedback assessment instrument to assess the style and pattern of feedback given. Average ratings for the intervention group were higher (4.2 ± 1.28) than for the control group (3.8 ± 1.22; p < 0.0001), indicating a better ability to maintain a psychologically safe environment while providing feedback, to structure the feedback session in an organized manner, and to identify and explore performance gaps.

Specifically, participants in the intervention group were more likely to open the feedback session with a preview statement and more commonly used the advocacy/inquiry model of communication (see Figure 1), in which an observation is paired with a genuine question about the trainee’s thinking. They less commonly used “guess what I am thinking” questioning. They were also more likely to address professionalism errors, while the control group tended to focus on clinical errors.

Training therefore improved faculty members’ ability not only to maintain a psychologically safe environment during feedback, but also to explore the resident’s cognitive frame and to address professionalism alongside technical issues.
Figure 1. Visual summary of the advocacy/inquiry approach to debriefing. Center for Medical Simulation, Boston, MA, 2014.

Why is it relevant to Emergency Medicine Education?

Emergency Medicine physicians are tasked with providing feedback to a variety of learners on a regular basis. Providing direct and timely feedback to trainees can be a challenge in any academic setting. Since learners come from varied backgrounds and levels of training, it is important for faculty to explore and understand each learner’s context and the factors that have contributed to their performance.

Addressing poor clinical performance, or lapses in other CanMEDS roles such as professionalism, can be particularly stressful for both faculty and learners. This paper demonstrated that a brief (one-hour) intervention was effective in enabling faculty to provide quality feedback to learners in a simulated setting. Future research should investigate whether these skills transfer to real clinical settings. Focused instruction on similar methods of feedback could easily be adapted for an Emergency Medicine faculty development session.

Does your institution train its faculty in giving feedback? One common CAEP teaching course is “ED STAT!” Have you found it (or other courses) to elevate the level of feedback you give?


More About the CAEP GEMeS

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Great Evidence in Medical Education Summaries (GEMeS) project sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive GEMeS each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case/contextualizing concept to highlight some recent medical education literature that is relevant to our nation’s teachers.


Peter Rogers

Peter Rogers is the Program Director for the CCFP(EM) Program at Memorial University. Medical education, simulation and ultrasound are among his specialties.

Michael Parsons

Michael Parsons is the Assistant Program Director at Memorial University for the CCFP(EM) Program. His research interests include PoCUS, simulation and procedural skills training.

Tia Renouf

Tia Renouf is Chair of Emergency Medicine at Memorial University. Her research areas include global health and remote health.

Sabrina Alani

Sabrina Alani is a research assistant at Memorial University in both Emergency Medicine and Oncology.

Adam Dubrowski

Adam Dubrowski is Associate Professor in Emergency Medicine at Memorial University. His primary areas of expertise are simulation and virtual learning environments.

Daniel Ting

Daniel Ting is an Emergency Physician and Clinical Assistant Professor at the University of British Columbia, based in Vancouver. He is the Editor-in-Chief of CanadiEM and a Decision Editor at the Canadian Journal of Emergency Medicine. He completed the CanadiEM Digital Scholarship Fellowship in 2017-18. No conflicts of interest (COI).