Vessel Ligation Fundamentals: A Comparison of Technical Evaluations by Crowdsourced Nonclinical Personnel and Surgical Faculty

2017 
Background: Evaluation of fundamental surgical skills is invaluable to the training of medical students and junior residents. This study assessed the effectiveness of crowdsourcing nonmedical personnel to evaluate technical proficiency at simulated vessel ligation.

Study design: Fifteen videos were captured of participants performing vessel ligation using a low-fidelity model (5 attending surgeons and 5 medical students, the latter recorded before and after training). These videos were evaluated by nonmedical personnel recruited through Amazon Mechanical Turk, as well as by 3 experienced surgical faculty. Evaluation criteria were based on the Objective Structured Assessment of Technical Skills (scale: 5-25). Results were compared using the Wilcoxon signed rank-sum test and Cronbach's alpha (α).

Results: Thirty-two crowd workers evaluated all 15 videos. Crowd workers scored attending surgeon videos significantly higher than pretraining medical student videos (20.5 vs 14.9). Crowd assessment agreed more strongly with the pooled expert evaluations (α = 0.95) than any 2 individual expert evaluators agreed with each other (α = 0.72-0.88). Combined reimbursement for all workers was $80.00.

Conclusion: After adjustments for score inflation, crowdsourced workers can evaluate surgical fundamentals with excellent validity. This resource is considerably less costly and potentially more reliable than individual expert evaluation.
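The abstract uses Cronbach's alpha to quantify agreement between raters on the 5-25 OSATS scale. As a minimal sketch of how such a reliability coefficient is computed (the rater scores below are hypothetical and are not the study's data), alpha is k/(k-1) · (1 − Σ per-rater variance / variance of per-video totals):

```python
import statistics

def cronbach_alpha(ratings):
    """Cronbach's alpha across raters.

    ratings: one inner list per rater, each aligned by video,
    e.g. ratings[i][j] is rater i's score for video j.
    """
    k = len(ratings)
    # variance of each individual rater's scores
    item_vars = [statistics.pvariance(r) for r in ratings]
    # variance of the summed score each video received across raters
    totals = [sum(video_scores) for video_scores in zip(*ratings)]
    total_var = statistics.pvariance(totals)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# hypothetical OSATS-style scores (scale 5-25): 3 raters, 5 videos
raters = [
    [20, 14, 22, 15, 18],
    [21, 13, 23, 16, 17],
    [19, 15, 21, 14, 19],
]
print(round(cronbach_alpha(raters), 2))  # → 0.96
```

High alpha here means the raters rank the videos consistently, which is the sense in which the crowd's scores (α = 0.95 against pooled experts) outperformed pairwise agreement among individual experts (α = 0.72-0.88).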