Optimized Computation Combining Classification and Detection Networks with Distillation

2021 
Convolutional neural networks (CNNs) have emerged as a widely used approach to computer vision tasks, including object classification and detection. The demand for models that run efficiently on devices with limited information and communication technology (ICT) resources, e.g., mobile terminals, can be met by model distillation. However, most existing distillation methods suffer from a significant accuracy reduction, require a large number of pre-trained models, or do not make good use of additional network information, e.g., from the intermediate layers, during distillation. In this paper, we study how knowledge about traffic sign recognition can be transferred to smaller models by distillation while pruning channels. We present an optimized object detection network that uses a Region Proposal Network (RPN) weighted loss and a hard-soft, distribution-wise distillation loss to handle structural differences between teacher and student networks. We validate the network on multiple real-world datasets; the experiments demonstrate that classification accuracy can be improved by 9% with about a 16-fold parameter reduction, while detection performance can be increased by 10.6% using the optimized object detection network.
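
To illustrate the hard-soft distillation idea mentioned in the abstract, below is a minimal sketch of a combined hard-label/soft-label loss in the style of standard knowledge distillation. The function name, the temperature T, and the weighting factor alpha are illustrative assumptions, not details taken from the paper, and the sketch omits the paper's RPN-weighted and distribution-wise components.

    import torch
    import torch.nn.functional as F

    def hard_soft_distillation_loss(student_logits, teacher_logits, labels,
                                    T=4.0, alpha=0.5):
        """Illustrative combined distillation loss (assumed form, not the paper's exact loss).

        student_logits: [batch, num_classes] raw outputs of the student network
        teacher_logits: [batch, num_classes] raw outputs of the (frozen) teacher network
        labels:         [batch] ground-truth class indices ("hard" targets)
        T:              softmax temperature for the "soft" teacher distribution
        alpha:          weight balancing the hard-label and soft-label terms
        """
        # Hard term: standard cross-entropy against the ground-truth labels.
        hard_loss = F.cross_entropy(student_logits, labels)

        # Soft term: KL divergence between temperature-softened student and
        # teacher distributions, scaled by T^2 to keep gradient magnitudes
        # comparable across temperatures (as in standard knowledge distillation).
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)

        return alpha * hard_loss + (1.0 - alpha) * soft_loss

In a typical setup, the teacher is run in evaluation mode with gradients disabled, and only the smaller (channel-pruned) student is updated with this loss.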