Attention Guided Slit Lamp Image Quality Assessment.

2021 
Incorporating human visual attention into a deep convolutional network improves classification performance. In this paper, we propose a novel attention-guided architecture for image quality assessment (IQA) of slit-lamp images. Its characteristics are threefold. First, we build a two-branch classification network, where one branch takes masked images as input to learn a regional prior. Second, we use Forward Grad-CAM (FG-CAM) to represent the attention of each branch and to generate saliency maps. Third, we design an Attention Decision Module (ADM) that decides which parts of the gradient flow through the two branches' saliency maps are updated. Experiments on 23,197 slit-lamp images show that the proposed method brings the network's attention closer to human visual attention than other state-of-the-art methods. Our method achieves 97.41% AUC, 84.79% F1-score, and 92.71% accuracy. The code is publicly available at https://github.com/nhoddJ/CSRA-module.
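The abstract does not spell out how FG-CAM computes its saliency maps; for orientation, here is a minimal NumPy sketch of the standard Grad-CAM combination step (channel-weighted sum of feature maps followed by ReLU), which FG-CAM builds on. The function name and the toy shapes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM-style saliency map (illustrative sketch, not the paper's FG-CAM).

    activations: (C, H, W) feature maps from a convolutional layer
    gradients:   (C, H, W) gradients of the class score w.r.t. those maps
    """
    # Channel weights: global-average-pool the gradients over spatial dims
    weights = gradients.mean(axis=(1, 2))                                  # (C,)
    # Weighted sum of feature maps across channels, then ReLU
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0.0)
    # Normalize to [0, 1] for visualization
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with random feature maps and gradients
A = np.random.rand(8, 7, 7)
G = np.random.rand(8, 7, 7)
saliency = grad_cam(A, G)
print(saliency.shape)
```

Each branch's map produced this way would then be passed to the ADM, which gates which gradient contributions are propagated during training.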