Validity and reliability of an in-training evaluation report to measure the CanMEDS roles in emergency medicine residents

2014 
BACKGROUND: It is unclear whether a single assessment tool can assess the key competencies of residents as mandated by the Royal College of Physicians and Surgeons of Canada CanMEDS roles framework.
OBJECTIVE: To investigate the reliability and validity of an emergency medicine (EM) in-training evaluation report (ITER).
METHOD: ITER data from 2009 to 2011 were combined for residents across the 5 years of the EM residency training program; a total of 172 ITERs were completed on residents in their first to fifth year of training. An exploratory factor analysis with varimax rotation was used to examine the construct validity of the ITER.
RESULTS: The combined, 24-item ITER yielded a five-factor solution measuring the CanMEDS role subscales Medical Expert/Scholar, Communicator/Collaborator, Professional, Health Advocate, and Manager. The factor solution accounted for 79% of the variance, and reliability coefficients (Cronbach's alpha) ranged from α = 0.90 to 0.95 for the individual subscales, with α = 0.97 overall. The combined, 24-item ITER used to assess residents' competencies in the EM residency program showed strong reliability and evidence of construct validity for assessment of the CanMEDS roles.
CONCLUSION: Further research is needed to develop and test ITER items that differentiate each CanMEDS role exclusively.
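
The abstract describes an exploratory factor analysis with varimax rotation and Cronbach's alpha reliability coefficients. The sketch below shows how such an analysis could be reproduced in principle; the file name iter_items.csv, the column naming, and the use of the factor_analyzer library are assumptions, since the paper does not state which software or data layout was used.

```python
"""Minimal sketch: EFA with varimax rotation on 24 ITER items, plus Cronbach's alpha."""
import pandas as pd
from factor_analyzer import FactorAnalyzer

# One row per completed ITER, one column per item (hypothetical file).
ratings = pd.read_csv("iter_items.csv")

# Exploratory factor analysis extracting five factors with varimax rotation.
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(ratings)

loadings = pd.DataFrame(fa.loadings_, index=ratings.columns)
cumulative_variance = fa.get_factor_variance()[2][-1]  # proportion of variance explained

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances) / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Alpha for one subscale (column prefix "ME_" is hypothetical) and overall.
medical_expert_items = [c for c in ratings.columns if c.startswith("ME_")]
print(cronbach_alpha(ratings[medical_expert_items]))
print(cronbach_alpha(ratings))
```

In practice the subscales would be defined by grouping items according to their highest factor loadings before computing each alpha.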