Crowdsourcing of Histological Image Labeling and Object Delineation by Medical Students

2019 
Crowdsourcing in pathology has so far been applied to tasks assumed to be manageable by nonexperts. Demand remains high, however, for annotations of more complex elements in digital microscopic images, such as anatomical structures. This work therefore investigates the conditions under which crowdsourced annotation of high-level image objects, a complex task usually considered to require expert knowledge, becomes feasible. Seventy-six medical students without specific domain knowledge voluntarily participated in three experiments and solved two relevant annotation tasks on histopathological images: (1) labeling of images showing tissue regions, and (2) delineation of morphologically defined image objects. We focus on methods to ensure sufficient annotation quality, including several tests on the required number of participants and on the correlation of participants' performance between tasks. In a setup simulating annotation of images with limited ground truth, we validated the feasibility of a confidence score against full ground truth. For this, we computed a majority vote using weighting factors derived from the individual assessment of contributors against a scattered gold standard annotated by pathologists. In conclusion, we provide guidance on task design and quality control to enable a crowdsourced approach for obtaining the accurate annotations required in the era of digital pathology.
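The weighted aggregation described above can be illustrated with a minimal sketch. This is not the paper's code: the function names, the simple accuracy-based weighting, and the toy data are illustrative assumptions; the paper's actual weighting scheme may differ.

```python
# Illustrative sketch (not the authors' implementation): weighted majority
# vote for crowdsourced image labels. Each contributor's weight is assumed
# to be their accuracy on a small "scattered" gold-standard subset that
# experts (pathologists) have annotated.
from collections import defaultdict

def contributor_weights(gold_labels, contributor_labels):
    """Weight = fraction of gold-standard items a contributor labeled correctly."""
    weights = {}
    for contributor, labels in contributor_labels.items():
        gold_items = [i for i in labels if i in gold_labels]
        if not gold_items:
            weights[contributor] = 0.0
            continue
        correct = sum(labels[i] == gold_labels[i] for i in gold_items)
        weights[contributor] = correct / len(gold_items)
    return weights

def weighted_majority_vote(item, contributor_labels, weights):
    """Aggregate labels for one item; each vote counts by its contributor's weight."""
    scores = defaultdict(float)
    for contributor, labels in contributor_labels.items():
        if item in labels:
            scores[labels[item]] += weights[contributor]
    return max(scores, key=scores.get) if scores else None

# Toy example: two gold-standard images, three contributors, one image
# without ground truth ("img3") to be decided by the weighted vote.
gold = {"img1": "tumor", "img2": "stroma"}
votes = {
    "alice": {"img1": "tumor",  "img2": "stroma", "img3": "tumor"},
    "bob":   {"img1": "tumor",  "img2": "tumor",  "img3": "stroma"},
    "carol": {"img1": "stroma", "img2": "stroma", "img3": "tumor"},
}
w = contributor_weights(gold, votes)              # alice=1.0, bob=0.5, carol=0.5
label = weighted_majority_vote("img3", votes, w)  # "tumor" (1.5 vs 0.5)
```

Contributors who perform well on the gold-standard items dominate the vote on items lacking ground truth, which is the basic idea behind the confidence score evaluated in the paper.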
References: 37 · Citations: 12