Benchmarking the attainment of clinical competence

2014 
Background: The sharing of co-developed Objective Structured Clinical Examinations (OSCEs) by medical schools for the assessment of clinical competence could serve as a responsive and flexible means of assuring high-quality assessment processes. It has the added benefit of benchmarking and evaluating comparable competence standards.

Summary of Work: This study explored the feasibility of using shared OSCEs to benchmark the clinical performance of students at four geographically dispersed Australian medical schools. Four shared OSCE stations were co-developed by the participating medical schools and embedded in the end-of-year examinations assessing clinical performance in the early clinical phase of the course. Returned checklist, global and total scores from 1670 student results were then analysed using the SAS analytical package to compare mean scores and clinical competence levels.

Summary of Results: Data analysis revealed similar patterns of clinical competence in the performance of the medical students, indicating comparable standards. The degree of difficulty of the shared OSCE stations was largely similar across participating schools, although mean total student scores varied between schools.

Conclusions: Collaborating medical schools gain the benefits of benchmarking through the identification of common curriculum areas requiring specific focus and the sharing of assessment approaches. Similarly, relative underperformance by a school on a particular assessment item may indicate a deficiency that can be remediated in order to achieve comparability with its peers.

Take-home Messages: Sharing assessment materials can provide common defensible, reliable, valid, robust and standardised assessments which, in turn, enhance transparency and accountability.
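The abstract does not reproduce the authors' SAS analysis; as a rough illustration of the kind of between-school comparison it describes (mean total station scores compared across four schools), the following is a minimal sketch in Python. The file name, column names (school, station, total_score) and the use of a one-way ANOVA are assumptions for illustration, not the study's actual procedure.

    # Minimal sketch of a between-school comparison of shared OSCE station scores.
    # Assumptions (not from the study): a CSV with columns school, station,
    # total_score; one row per student per station.
    import pandas as pd
    from scipy import stats

    results = pd.read_csv("shared_osce_results.csv")  # hypothetical file name

    # Mean total score per station and school, mirroring the reported
    # comparison of mean scores across the four participating schools.
    mean_scores = (
        results.groupby(["station", "school"])["total_score"]
        .agg(["mean", "std", "count"])
        .round(2)
    )
    print(mean_scores)

    # One-way ANOVA per station to test whether mean total scores differ
    # between schools (an assumed test, not necessarily the SAS analysis used).
    for station, grp in results.groupby("station"):
        samples = [g["total_score"].values for _, g in grp.groupby("school")]
        f_stat, p_value = stats.f_oneway(*samples)
        print(f"Station {station}: F = {f_stat:.2f}, p = {p_value:.4f}")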