Symmetric Rectified Linear Units for Fully Connected Deep Models.

2018 
The Rectified Linear Unit (ReLU) is one of the key factors behind the success of deep learning models. It has been shown that deep networks can be trained efficiently with ReLU without pre-training. In this paper, we compare and analyze several ReLU variants in fully-connected deep neural networks. We evaluate ReLU, LReLU, ELU, SELU, mReLU, and vReLU on two popular datasets: MNIST and Fashion-MNIST. We find that vReLU, a symmetric ReLU variant, shows promising results in most experiments. Fully-connected networks (FCNs) with vReLU activation achieve higher accuracy, with a relative improvement in test error rate of 39.9% over ReLU on MNIST and 6.3% over ReLU on Fashion-MNIST.
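To make the comparison concrete, below is a minimal NumPy sketch of the activation functions named in the abstract. The vReLU definition (assumed here to be the symmetric, V-shaped absolute-value variant) and the LReLU slope and ELU alpha defaults are assumptions, not taken from the paper; mReLU is omitted because its definition is not given in the abstract.

```python
# Minimal sketch of the compared activation functions (assumed definitions).
import numpy as np

def relu(x):
    # standard rectifier: zero for negative inputs, identity for positive
    return np.maximum(0.0, x)

def lrelu(x, slope=0.01):
    # leaky ReLU; the slope of 0.01 is an assumed default
    return np.where(x > 0, x, slope * x)

def elu(x, alpha=1.0):
    # exponential linear unit; alpha = 1.0 is an assumed default
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, scale=1.0507009873554805, alpha=1.6732632423543772):
    # self-normalizing ELU with the standard fixed constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def vrelu(x):
    # symmetric "V-shaped" ReLU: passes the magnitude of both positive and
    # negative inputs (assumed definition: f(x) = |x|)
    return np.abs(x)

if __name__ == "__main__":
    x = np.linspace(-3, 3, 7)
    for name, fn in [("ReLU", relu), ("LReLU", lrelu), ("ELU", elu),
                     ("SELU", selu), ("vReLU", vrelu)]:
        print(f"{name:>6}: {np.round(fn(x), 3)}")
```

In a fully-connected network, any of these functions can be swapped in element-wise after each affine layer; the abstract's claim is that the symmetric variant, which lets negative pre-activations contribute their magnitude rather than being zeroed, yields the lowest test error on both datasets.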