Validity and reliability of an in-training evaluation report to measure the CanMEDS roles in emergency medicine residents
Published online by Cambridge University Press: 04 March 2015
Abstract
It remains unclear whether a single assessment tool can assess the key competencies of residents mandated by the Royal College of Physicians and Surgeons of Canada CanMEDS roles framework.
The objective of the present study was to investigate the reliability and validity of an emergency medicine (EM) in-training evaluation report (ITER).
ITER data from 2009 to 2011 were combined for residents across the 5 years of the EM residency training program; in total, 172 ITERs were completed on residents in their first to fifth year of training. An exploratory factor analysis with varimax rotation was used to examine the construct validity of the ITER.
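For readers unfamiliar with the procedure, the following is a minimal sketch of how an exploratory factor analysis with varimax rotation might be run on item-level ITER ratings. It assumes the ratings sit in a pandas DataFrame (one row per completed ITER, one column per item) and uses the factor_analyzer package; the data layout, file name, and software are illustrative assumptions, not what the authors report.

```python
# Illustrative sketch only: exploratory factor analysis (EFA) with varimax rotation
# on item-level ratings. Assumes a hypothetical CSV with one row per completed ITER
# and one column per item.
import pandas as pd
from factor_analyzer import FactorAnalyzer

iter_items = pd.read_csv("iter_item_ratings.csv")  # hypothetical file name

# Extract five factors and apply an orthogonal (varimax) rotation.
efa = FactorAnalyzer(n_factors=5, rotation="varimax")
efa.fit(iter_items)

# Item-by-factor loadings and cumulative proportion of variance explained.
loadings = pd.DataFrame(efa.loadings_, index=iter_items.columns)
_, _, cumulative_variance = efa.get_factor_variance()
print(loadings.round(2))
print(f"Variance explained by 5 factors: {cumulative_variance[-1]:.0%}")
```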
A combined, 24-item ITER yielded a five-factor solution measuring the CanMEDS Medical Expert/Scholar, Communicator/Collaborator, Professional, Health Advocate, and Manager subscales. The factor solution accounted for 79% of the variance, and reliability coefficients (Cronbach alpha) ranged from α = 0.90 to 0.95 for the subscales and reached α = 0.97 overall. The combined, 24-item ITER used to assess residents' competencies in the EM residency program showed strong reliability and evidence of construct validity for assessment of the CanMEDS roles.
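As a point of reference, Cronbach's alpha for a subscale can be computed directly from the item variances. The short function below is a generic sketch, not the authors' code; the subscale column names in the usage comment are hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of total score).

    `items` holds one row per completed evaluation and one column per item
    in the subscale being scored.
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_score_variance)

# Hypothetical usage: alpha for a "Professional" subscale built from three item columns.
# professional_alpha = cronbach_alpha(iter_items[["prof_item1", "prof_item2", "prof_item3"]])
```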
Further research is needed to develop and test ITER items that will differentiate each CanMEDS role exclusively.
- Type: Education • Enseignement
- Copyright: © Canadian Association of Emergency Physicians 2014