ASKs: Convolution with Any-Shape Kernels for Efficient Neural Networks

2021 
Abstract

Despite their outstanding performance, deep convolutional neural networks (CNNs) are computationally expensive and contain a large number of redundant parameters, hindering their deployment on resource-constrained platforms. Many model compression methods have been proposed to address this issue, but they mainly focus on pruning redundant parameters or designing efficient architectures; the redundancy within convolution kernels themselves has rarely been investigated. In this paper, we find that parameters at different locations in the traditional 3 × 3 kernel do not contribute equally, and that this distribution varies considerably across layers. Motivated by this, we propose to use irregular kernels and present a novel approach to implementing convolution with any-shape kernels (ASKs) efficiently. The proposed ASKs are plug-and-play and can be readily embedded into existing CNNs, providing efficient modules for building compact CNNs. Experiments on benchmarks demonstrate the effectiveness of the proposed method. We improve the accuracy of VGG-16 on the CIFAR-10 dataset from 93.45% to 94.04% simply by replacing the regular 3 × 3 kernel with a cross-shaped kernel, which takes up only about 5/9 of the original storage and computing resources. Compared to state-of-the-art model compression methods, our ASKs achieve a better trade-off between accuracy and compression ratio.
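To make the 5/9 figure concrete, here is a minimal sketch (not the paper's implementation) of a cross-shaped 3 × 3 kernel realized by masking out the four corner weights, so only 5 of the 9 positions are stored and used. The mask layout and the naive convolution loop are illustrative assumptions.

```python
import numpy as np

# Cross-shaped mask: keeps 5 of the 9 positions of a 3x3 kernel,
# hence roughly 5/9 of the kernel's storage and compute.
CROSS_MASK = np.array([[0, 1, 0],
                       [1, 1, 1],
                       [0, 1, 0]], dtype=np.float64)

def cross_conv2d(image, weights):
    """Valid 2D convolution with a 3x3 kernel masked to a cross shape.

    `image` is a 2D array; `weights` is a full 3x3 array whose four
    corner entries are zeroed by the mask before use.
    """
    kernel = weights * CROSS_MASK  # discard the four corner weights
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
    return out
```

In a real network, the same effect can be obtained by multiplying a convolution layer's weight tensor by such a binary mask, which is what makes irregular kernels easy to drop into existing architectures.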