Association Between USMLE Step 1 Scores and In-Training Examination Performance: A Meta-Analysis.

2021 
PURPOSE
On February 12, 2020, the sponsors of the United States Medical Licensing Examination announced that Step 1 would transition to pass/fail scoring in 2022. Step 1 performance has historically carried substantial weight in the evaluation of residency applicants and as a predictor of subsequent subject-specific medical knowledge. Using a systematic review and meta-analysis, the authors sought to determine the association between Step 1 scores and in-training examination (ITE) performance, which is often used to assess knowledge acquisition during residency.

METHOD
The authors systematically searched Medline, EMBASE, and Web of Science for observational studies published from 1992 through May 10, 2020. Observational studies reporting associations between Step 1 and ITE scores, regardless of medical or surgical specialty, were eligible for inclusion. Pairs of researchers screened all studies, assessed study quality using a modified Newcastle-Ottawa Scale, and extracted data in a standardized fashion. The primary endpoint was the correlation of Step 1 and ITE scores.

RESULTS
Of 1,432 observational studies identified, 49 were systematically reviewed and 37 were included in the meta-analysis. Overall study quality was low to moderate. The pooled estimate of the correlation coefficient was 0.42 (95% confidence interval [CI] 0.36, 0.48; P < .001), suggesting a weak-to-moderate positive correlation between Step 1 and ITE scores. Random-effects meta-regression found that the association between Step 1 and ITE scores was weaker for surgical (versus medical) specialties (beta -0.25 [95% CI -0.41, -0.09]; P = .003) and for fellowship (versus residency) training programs (beta -0.25 [95% CI -0.47, -0.03]; P = .030).

CONCLUSIONS
The authors identified a weak-to-moderate positive correlation between Step 1 and ITE scores based on a meta-analysis of low-to-moderate quality observational data. With Step 1 scoring transitioning to pass/fail, the undergraduate and graduate medical education communities should continue to develop better tools for evaluating medical students.
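A pooled correlation of the kind reported above is conventionally obtained by Fisher z-transforming each study's coefficient, combining the transformed values under a random-effects model (e.g., DerSimonian-Laird), and back-transforming the pooled estimate. The sketch below illustrates that general procedure only; the (r, n) pairs are made up and are not the 37 studies from this meta-analysis.

```python
import math

# Illustrative (correlation, sample size) pairs -- NOT the study's data.
studies = [(0.45, 120), (0.38, 80), (0.50, 200), (0.30, 60)]

# Fisher z-transform each correlation; the variance of z is 1/(n - 3).
zs = [math.atanh(r) for r, n in studies]
vs = [1.0 / (n - 3) for r, n in studies]
ws = [1.0 / v for v in vs]  # fixed-effect (inverse-variance) weights

# Fixed-effect pooled z and Cochran's Q statistic for heterogeneity.
z_fe = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
q = sum(w * (z - z_fe) ** 2 for w, z in zip(ws, zs))

# DerSimonian-Laird estimate of the between-study variance tau^2.
df = len(studies) - 1
c = sum(ws) - sum(w * w for w in ws) / sum(ws)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooled estimate and 95% CI, back-transformed to r.
ws_re = [1.0 / (v + tau2) for v in vs]
z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
se = math.sqrt(1.0 / sum(ws_re))
lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
r_pooled, r_lo, r_hi = math.tanh(z_re), math.tanh(lo), math.tanh(hi)
print(f"pooled r = {r_pooled:.2f} (95% CI {r_lo:.2f}, {r_hi:.2f})")
```

The inverse Fisher transform is tanh, which guarantees the pooled estimate and its confidence limits fall back inside (-1, 1).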