57. The power of POWER: Comparative evaluations of medical residency training across teaching sites and programs at the University of Toronto

C. Abrahams, S. Verma, L. Muharuma, K. Imrie, R. Vestemean, P. Poldre, J. McIlroy, N. Woods


To meet accountability and accreditation requirements, teaching partners and the faculty postgraduate office required more robust and integrated feedback on teaching and assessment. The web-based evaluation system known as POstgraduate Web Evaluation and Registration (POWER) was implemented in 2004/05 by most residency training programs, using their existing forms and scoring scales. At start-up, over 250 different evaluation forms and 85 varying scoring scales were in operation across programs for the In-Training Evaluation Reports (ITERs) and the resident-completed evaluations for Rotation Evaluation Scores (RES) and Teaching Effectiveness Scores (TES).
The POWER Evaluation Working Group was formed to develop a methodology to gather and consolidate evaluations: to report on medical residents, their teachers, and rotations in a clear, consistent, user-friendly format; to map general questions against CanMEDS roles and Family Medicine principles; and to convert all scoring scales to a consistent 5-point Likert scale. A standardized naming protocol was developed to map rotation services to individual teaching sites.
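Harmonizing 85 heterogeneous scoring scales onto one 5-point Likert scale amounts to rescaling each form's score range onto [1, 5]. The sketch below illustrates one simple approach, a linear rescaling; the function name and the assumption of linearity are ours, not the Working Group's documented conversion rule.

```python
def to_likert5(score: float, scale_min: float, scale_max: float) -> float:
    """Linearly map a score from its native range [scale_min, scale_max]
    onto the common 1-5 Likert range used for consolidated reporting."""
    if scale_max <= scale_min:
        raise ValueError("scale_max must exceed scale_min")
    return 1 + 4 * (score - scale_min) / (scale_max - scale_min)

# Example: a 7-point scale score of 7 maps to 5.0; a 0-100 score of 50 maps to 3.0.
```

A lookup table keyed by form identifier could supply each form's native range, so every incoming evaluation is converted once on entry and all downstream reporting works in a single scale.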
The 2004/05 analysis of these evaluations (2004/05 Annual POWER Report: Lessons Learned) provided baseline data for monitoring trends in resident and faculty performance, assessing the quality of programs, and identifying areas for improvement against CanMEDS standards and CFPC principles. Mean scores, standard deviations, and numbers of evaluations were presented by teaching site and program.
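The reported summary statistics (mean, standard deviation, and number of evaluations by teaching site and program) can be sketched as a grouped aggregation. The record layout and function name below are illustrative assumptions, not POWER's actual implementation.

```python
from collections import defaultdict
from statistics import mean, stdev

def summarize(evaluations):
    """Group (site, program, score) records and report mean, SD, and N
    per (teaching site, program) pair, as in the annual POWER report."""
    groups = defaultdict(list)
    for site, program, score in evaluations:
        groups[(site, program)].append(score)
    return {
        key: (round(mean(scores), 2),
              round(stdev(scores), 2) if len(scores) > 1 else 0.0,
              len(scores))
        for key, scores in groups.items()
    }

# Example: three scores for one site/program yield (mean, SD, N) = (4.0, 1.0, 3).
records = [("Site A", "Internal Medicine", 3),
           ("Site A", "Internal Medicine", 4),
           ("Site A", "Internal Medicine", 5)]
```

Presenting N alongside the mean and SD, as the report does, lets readers judge how stable each site's average is before comparing across sites.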
Consolidation of evaluations by program and teaching site provides valuable feedback to hospitals and programs wishing to standardize and improve their assessment systems, and to postgraduate medical offices that must maintain evaluation standards and illustrate trends for accreditation purposes. Future activities include standardizing evaluation forms starting July 2007, improving scoring consistency and accuracy, improving participation rates and timeliness of responses, developing a procedure/case-log tracking system, and conducting trend analysis.

DOI: http://dx.doi.org/10.25011/cim.v30i4.2818



© 2007-2017 Canadian Society for Clinical Investigation.