CAEP GeMES | Selection of an Appropriate Tool for Direct Observation of Trainees by Supervisors

In Great Evidence in Medical education Summary (GEMeS) by Alexandra Stefan

Catherine is a third-year medical student finishing the last shift of her core emergency medicine rotation. She is interested in matching to neurology. During the shift, she asks her staff physician, Dr. Stewart, if he would be willing to observe her while she performs a neurological exam and provide feedback on her performance. Dr. Stewart agrees, but is unsure which aspects of the assessment he should focus on during the observation.

Direct observation of a trainee by a staff physician gives the staff physician information about the trainee’s progress and gives the trainee valuable feedback on how to improve. This “Great Evidence in Medical education Summary” (GEMeS – pronounced “gems”), titled “Selection of an Appropriate Tool for Direct Observation of Trainees by Supervisors”, was originally posted by the CAEP EWG GEMeS Team in July 2017 and answers the question: “How do we select an appropriate tool for direct observation of trainees by supervisors that maximizes educational benefits for trainees and is feasible for faculty?” A PDF version of the GEMeS summary is available here.

Article Reference

Hauer KE, Holmboe ES, Kogan JR. (2011). Twelve tips for implementing tools for direct observation of medical trainees’ clinical skills during patient encounters. Medical Teacher, 33(1):27-33.

Why is this paper relevant to Emergency Medicine education?

In the context of competency-based medical education, direct observation of trainees is critical for assessing competence and deciding whether a learner is ready to progress through a training program. The use of validated tools, coupled with faculty development around providing meaningful feedback and developing learner action plans, will enhance the educational experience and contribute to better patient care.

Level of Evidence / Level of Learning / Funding Sources

Not applicable / UGME, PGME / None

Study Design

The authors present 12 tips for selecting and using a tool for direct observation of clinical encounters based on their previous systematic review of the literature (Kogan JR et al., JAMA, 2009).


Direct observation of undergraduate and postgraduate medical trainees with actual patients is an essential and underused component of clinical education and evaluation. Lack of direct observation in clinical training may result in missed opportunities to hone clinical skills and may lead to patient safety concerns.

Clinical curricula should incorporate direct observation of trainees interacting with patients, accompanied by feedback on strengths and weaknesses that is appropriate to the learner’s stage of training. Direct-observation assessment tools provide faculty with formative and summative data on a trainee’s competence across multiple domains, including patient care and communication skills.

Identification of an appropriate tool, coupled with faculty and learner training on how to use it, is essential to successful implementation. The authors offer the following practical tips for implementing direct observation tools:

  1. Define competencies and objectives for the program to guide use of a tool for direct observation
  2. Determine whether the purpose of the direct observation program is formative or summative assessment
  3. Identify an existing tool for direct observation rather than creating a new one
  4. Create a culture that values direct observation
  5. Conduct faculty development on direct observation
  6. Build meaningful feedback into the direct observation process and train faculty to provide effective feedback
  7. Require action planning after each direct observation
  8. Orient learners to direct observation and feedback
  9. Apply the tool multiple times per trainee
  10. Develop systems that accommodate direct observation of clinical skills
  11. Measure outcomes of the direct observation of clinical skills program
  12. If a new tool is developed for use, try to assess its validity


Direct observation by a faculty member of trainees interacting with patients can provide useful information for both the faculty member and the trainee. Successful implementation of this form of evaluation depends on identifying an appropriate tool and on educating faculty members and trainees in its use.

What have your experiences been using direct observation as a method of evaluation (for teachers or learners)? What steps can you take to help implement this teaching tool in your emergency department?


More About the CAEP GEMeS

This post was originally authored for the Canadian Association of Emergency Physicians (CAEP) Great Evidence in Medical Education Summaries (GEMeS) project, sponsored by the CAEP Academic Section’s Education Working Group and edited by Drs. Teresa Chan and Julien Poitras. CAEP members receive GEMeS each month in the CAEP Communiqué. CanadiEM will be reposting some of these summaries, along with a case or contextualizing concept, to highlight recent medical education literature that is relevant to our nation’s teachers.


Alexandra Stefan

Alexandra is a clinician-teacher and an assistant professor in the Division of Emergency Medicine at the University of Toronto. Her academic projects include the development of procedure videos for the New England Journal of Medicine “Videos in Clinical Medicine” series.

Justin Hall

Dr. Justin Hall is an Emergency Physician at Sunnybrook Health Sciences Centre and faculty member at the University of Toronto. He is a member of Ontario Health’s Virtual Urgent Care Provincial Evaluation Steering Committee. His scholarly interests include virtual care, technological innovation, resource stewardship, leadership training, and quality improvement.

Aaron Leung

Aaron is a fourth year medical student from Western University with an interest in emergency medicine, medical education, and POCUS. Outside of school, he enjoys playing the piano and trying out new restaurants.