Convolutional neural network simplification with progressive retraining

2021 
Kernel pruning methods have been proposed to speed up (simplify) convolutional neural network (CNN) models. However, the effectiveness of a simplified model often falls below that of the original one. This letter presents new methods based on objective and subjective relevance criteria for kernel elimination in a layer-by-layer fashion. During the process, a CNN model is retrained only when the current layer has been entirely simplified, by adjusting the weights from the next layer back to the first one and preserving the weights of subsequent layers not involved in the process. We call this strategy progressive retraining, in contrast to kernel pruning methods that usually retrain the entire model after eliminating one or a few kernels. Our subjective relevance criterion exploits humans' ability to recognize visual patterns and improves the designer's understanding of the simplification process. We show that our methods can increase effectiveness with considerable model simplification, outperforming two popular approaches and another one from the state of the art on four challenging image datasets. An indirect comparison with 14 recent methods on a famous image dataset also places our approach using the objective criterion among the most competitive ones.
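The layer-by-layer strategy described above can be sketched in a few lines. The code below is a minimal, hypothetical illustration only: it assumes the mean absolute kernel weight as a stand-in objective relevance criterion (the paper's actual criteria are not specified in the abstract), and represents each layer as a plain list of kernel weight matrices. The key point it shows is the retraining schedule: a retrain step is invoked once per fully simplified layer, not once per eliminated kernel.

```python
def kernel_relevance(kernel):
    """Objective relevance score: mean absolute weight.
    (An assumed, illustrative criterion, not the paper's.)"""
    flat = [abs(w) for row in kernel for w in row]
    return sum(flat) / len(flat)

def simplify_layer(kernels, keep_ratio):
    """Keep only the most relevant fraction of kernels in one layer."""
    scored = sorted(kernels, key=kernel_relevance, reverse=True)
    n_keep = max(1, int(len(kernels) * keep_ratio))
    return scored[:n_keep]

def simplify_model(model, keep_ratio, retrain):
    """Progressive retraining schedule: simplify each layer entirely,
    then retrain (adjusting weights from the next layer back to the
    first, leaving later layers untouched) before moving on."""
    for i in range(len(model)):
        model[i] = simplify_layer(model[i], keep_ratio)
        retrain(model, simplified_layer=i)  # one retrain per layer
    return model
```

In an actual CNN framework, `retrain` would run a few fine-tuning epochs with the parameters of layers after `simplified_layer` frozen; here it is left as a caller-supplied callback.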