Deep convolutional neural network based on densely connected squeeze-and-excitation blocks

2019 
Convolutional neural networks have achieved great success in many visual tasks and perform well in a wide range of applications. However, the practical problem of how to improve recognition accuracy by increasing network depth remains open. The method proposed in this paper builds on the idea of the residual network, which shows that adding shortcut connections between layers allows deeper networks to be trained more efficiently and to reach higher accuracy. The proposed Densely Squeeze-and-Excitation Network (DSENet) combines densely connected networks with the structure of the Squeeze-and-Excitation Network (SENet). The architecture is built from blocks of squeeze-and-excitation layers, and DSENet enables each of these layers to connect directly to all previous layers with adaptive weights. DSENet offers several advantages, including efficient use of parameters and the ability to alleviate the vanishing-gradient problem. The performance of the proposed network is evaluated on the CIFAR-10 and CIFAR-100 datasets, and experiments show that DSENet achieves better results on those datasets than the most widely used convolutional networks.
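To make the described combination of dense connectivity and squeeze-and-excitation concrete, the following is a minimal PyTorch sketch, not the authors' implementation: the class names (SEBlock, DenseSELayer, DenseSEBlock) and the hyperparameters (growth rate, reduction ratio) are illustrative assumptions. It shows each layer's output being reweighted channel-wise by an SE module and then concatenated with the outputs of all previous layers.

import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation: global average pooling followed by a two-layer
    MLP that produces per-channel weights used to rescale the feature maps."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        hidden = max(channels // reduction, 1)  # keep the bottleneck non-empty
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))      # squeeze: B x C channel descriptor
        return x * w.view(b, c, 1, 1)        # excite: adaptive channel weights


class DenseSELayer(nn.Module):
    """One layer of a dense block: BN-ReLU-Conv producing `growth_rate` new
    feature maps, reweighted by SE and concatenated with the layer's input."""
    def __init__(self, in_channels, growth_rate=12):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, kernel_size=3, padding=1, bias=False),
            SEBlock(growth_rate),
        )

    def forward(self, x):
        # Dense connectivity: the new features are appended to all earlier ones.
        return torch.cat([x, self.body(x)], dim=1)


class DenseSEBlock(nn.Module):
    """Stack of DenseSELayers; layer i receives the concatenation of the block
    input and the SE-reweighted outputs of all previous layers."""
    def __init__(self, in_channels, num_layers=4, growth_rate=12):
        super().__init__()
        layers, channels = [], in_channels
        for _ in range(num_layers):
            layers.append(DenseSELayer(channels, growth_rate))
            channels += growth_rate
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)


if __name__ == "__main__":
    x = torch.randn(2, 24, 32, 32)                       # CIFAR-sized feature maps
    y = DenseSEBlock(24, num_layers=4, growth_rate=12)(x)
    print(y.shape)                                       # torch.Size([2, 72, 32, 32])

In this sketch the "adaptive weights" on the connections to previous layers come from the SE modules: each layer's contribution to the concatenated feature map is scaled per channel before later layers consume it.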