One-stage object detection with graph convolutional networks

2021 
The task of object detection is to find all objects of interest in an image and determine their categories and locations. Dominant deep learning-based object detection methods usually treat objects as isolated individuals and ignore the relationships between them, which limits the accuracy of the detection model. Some prior work attaches inter-category relationships to candidate proposal regions and shows that such relationships improve detection accuracy, but these methods all operate on feature maps. In this paper, we propose a correlation complement (CC) module that combines class representation vectors with the relationships between categories in the dataset. Experimental results on multiple object detection datasets demonstrate the effectiveness of our module. In addition, the module is extensible and can be added to other one-stage object detection methods.
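The abstract does not spell out the module's internals, but given the title, a natural reading is that the class representation vectors are refined by a graph convolutional network whose graph encodes inter-category relationships mined from the dataset (for example, label co-occurrence statistics). Below is a minimal sketch of that idea, assuming the standard GCN propagation rule; the class name `CorrelationComplement`, the co-occurrence adjacency, and the single-layer design are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class CorrelationComplement(nn.Module):
    """Hypothetical sketch: refine per-class embeddings with one GCN layer.

    `adjacency` is assumed to be a (C, C) matrix of inter-category
    relationship strengths, e.g. normalized label co-occurrence counts
    computed from the training set.
    """

    def __init__(self, num_classes: int, embed_dim: int, adjacency: torch.Tensor):
        super().__init__()
        # Standard GCN normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}
        a_hat = adjacency + torch.eye(num_classes)
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        norm_adj = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        self.register_buffer("norm_adj", norm_adj)
        self.weight = nn.Linear(embed_dim, embed_dim, bias=False)
        self.act = nn.ReLU()

    def forward(self, class_embeddings: torch.Tensor) -> torch.Tensor:
        # One propagation step: H' = sigma(A_hat H W), shape (C, d) -> (C, d)
        return self.act(self.weight(self.norm_adj @ class_embeddings))


# Illustrative usage: refine 80 class vectors of dimension 256.
if __name__ == "__main__":
    C, d = 80, 256
    adjacency = torch.rand(C, C)            # stand-in for co-occurrence stats
    adjacency = (adjacency + adjacency.T) / 2  # make it symmetric
    cc = CorrelationComplement(C, d, adjacency)
    refined = cc(torch.randn(C, d))
    print(refined.shape)  # torch.Size([80, 256])
```

The refined class vectors could then serve, for instance, as classifier weights matched against per-location features of a one-stage detector, which is a common way such relation modules are wired in; whether the paper does exactly this is not stated in the abstract.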