Effects of Different Normalization Techniques on the Convolutional Neural Network

2021 
Normalization is an essential part of deep neural networks: it speeds up training, but a poorly chosen normalization technique costs accuracy. We therefore examine the effects of batch normalization, cross-channel normalization, and dropout on a Convolutional Neural Network (CNN). We applied batch normalization under three data-shuffling options: never, once, and every-epoch. Batch normalization with the never-shuffle option reached 31% accuracy; with the once-shuffle option, 100% accuracy; and with the every-epoch shuffle option, 94.67% accuracy. Cross-channel normalization reached 100% accuracy. Dropout with probability 0.9 yielded 51% accuracy, probability 0.7 yielded 97%, and probabilities below 0.7 yielded 100%. All results were obtained on the CIFAR-10 dataset with the VGG-16 network.
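The paper itself includes no code, so the following is a minimal PyTorch sketch (the framework is an assumption; the never/once/every-epoch shuffle terminology is framework-specific, but the comparison itself is not) of how each of the three techniques compared above attaches to a VGG-style convolutional block. The layer sizes and dropout probability are illustrative, not the authors' exact VGG-16 configuration.

```python
import torch
import torch.nn as nn

def conv_block(variant: str) -> nn.Sequential:
    """One VGG-style conv block with the chosen technique attached."""
    layers = [nn.Conv2d(3, 64, kernel_size=3, padding=1)]
    if variant == "batch":
        # Batch normalization: normalize each channel over the mini-batch.
        layers.append(nn.BatchNorm2d(64))
    elif variant == "cross_channel":
        # Cross-channel (local response) normalization across neighboring channels.
        layers.append(nn.LocalResponseNorm(size=5))
    layers.append(nn.ReLU(inplace=True))
    if variant == "dropout":
        # Dropout; the paper sweeps the probability from 0.9 downward.
        layers.append(nn.Dropout(p=0.5))
    return nn.Sequential(*layers)

x = torch.randn(8, 3, 32, 32)  # one CIFAR-10-sized mini-batch
for variant in ("batch", "cross_channel", "dropout"):
    print(variant, conv_block(variant)(x).shape)  # [8, 64, 32, 32]
```

The shuffle options compared in the abstract control whether the training set is reordered between epochs; in PyTorch terms, never corresponds to DataLoader(..., shuffle=False), every-epoch to shuffle=True, and once to permuting the sample indices a single time before training begins.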