Gate Trimming: One-Shot Channel Pruning for Efficient Convolutional Neural Networks
2021
Channel pruning is a promising technique for model compression and acceleration because it reduces the space and time complexity of convolutional neural networks (CNNs) while maintaining their performance. In existing methods, channel pruning is performed by iterative optimization or by training with sparsity-inducing regularization, both of which limit practical utility because of their inefficiency. In this work, we propose a one-shot global pruning approach called Gate Trimming (GT), which compresses CNNs more efficiently. GT performs the pruning operation only once, avoiding expensive retraining or repeated re-evaluation of channel redundancy. In addition, GT globally estimates the effect of channels across all layers using information gain (IG). Based on the IG of each channel, GT accurately prunes redundant channels with little negative effect on the CNN. Experimental results show that the proposed GT outperforms state-of-the-art methods.
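To illustrate the overall one-shot global pruning procedure described above, the following is a minimal PyTorch sketch: channels from all convolutional layers are scored, ranked globally, and the lowest-scoring fraction is masked in a single pass. The abstract does not give the paper's information-gain formula, so a simple L1 filter norm stands in for the IG score here; the model, function names, and pruning ratio are illustrative assumptions, not the authors' implementation.

```python
# One-shot global channel pruning sketch (NOT the paper's exact GT method).
import torch
import torch.nn as nn

def channel_scores(model):
    """Collect a proxy importance score for every output channel of every conv layer."""
    scores = []  # list of (layer_name, channel_index, score)
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            # Placeholder criterion: L1 norm of each filter.
            # GT would instead use the information gain (IG) of each channel.
            per_channel = module.weight.detach().abs().sum(dim=(1, 2, 3))
            for idx, s in enumerate(per_channel.tolist()):
                scores.append((name, idx, s))
    return scores

def one_shot_global_prune(model, prune_ratio=0.3):
    """Rank channels across all layers and mask the lowest-scoring ones in one pass."""
    scores = channel_scores(model)
    scores.sort(key=lambda t: t[2])              # global ranking across layers
    n_prune = int(len(scores) * prune_ratio)
    to_prune = scores[:n_prune]
    modules = dict(model.named_modules())
    with torch.no_grad():
        for name, idx, _ in to_prune:
            conv = modules[name]
            conv.weight[idx].zero_()             # zero out the pruned channel's filter
            if conv.bias is not None:
                conv.bias[idx].zero_()
    return to_prune

if __name__ == "__main__":
    net = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    )
    pruned = one_shot_global_prune(net, prune_ratio=0.3)
    print(f"pruned {len(pruned)} channels in one shot")
```

Because the ranking is global rather than per layer, layers whose channels carry little information can lose more channels than others, which matches the abstract's description of estimating channel effects across all layers before a single pruning step.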