Post-exam feedback with question rationales improves re-test performance of medical students on a multiple-choice exam

2018 
This study compared the effects of two types of delayed feedback (correct response only, or correct response plus rationale) provided to students by a computer-based testing system following an exam. The preclinical medical curriculum at the University of Kansas Medical Center uses a two-exam system for summative assessments in which students test, revisit the material, and then re-test (same content, different questions), with the higher score used to determine the student's grade. Using a quasi-experimental design and data collected during the normal course of instruction, test and re-test scores from midterm multiple-choice examinations were compared between academic year (AY) 2015–2016, in which delayed feedback consisted of the correct answer only, and AY 2016–2017, in which it consisted of the correct answer plus a rationale. The average increase in score on the re-test was 2.29 ± 6.83% (n = 192) with the correct answer only and 3.92 ± 7.12% (n = 197) with rationales (p < 0.05). The effect of the rationales did not differ across students of differing academic ability, as measured by entering composite MCAT scores or Year 1 GPA. Thus, delayed feedback with exam question rationales resulted in a greater increase in exam score between the test and re-test than feedback with the correct response only. This finding suggests that delayed elaborative feedback on a summative exam produced a small but significant improvement in learning in medical students.
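The abstract reports only summary statistics (mean ± SD and n for each cohort) and a significance threshold; it does not name the statistical test used. As a minimal sketch, assuming a Welch two-sample t-test on those reported summary statistics, the between-cohort comparison can be reproduced as follows (the variable names and the choice of test are illustrative assumptions, not the authors' stated method):

```python
# Sketch: compare the test-to-re-test score increase between cohorts
# using only the summary statistics quoted in the abstract.
# Assumption: Welch's two-sample t-test; the paper does not specify the test here.
from scipy.stats import ttest_ind_from_stats

answer_only = dict(mean=2.29, std=6.83, nobs=192)     # AY 2015-2016, correct answer only
with_rationale = dict(mean=3.92, std=7.12, nobs=197)  # AY 2016-2017, answer plus rationale

t_stat, p_value = ttest_ind_from_stats(
    mean1=answer_only["mean"], std1=answer_only["std"], nobs1=answer_only["nobs"],
    mean2=with_rationale["mean"], std2=with_rationale["std"], nobs2=with_rationale["nobs"],
    equal_var=False,  # Welch's correction for unequal variances
)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Roughly t = -2.30, p = 0.022, consistent with the reported p < 0.05.
```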