A new initialization method for artificial neural networks: Laplacian

2018 
The popularity of artificial neural networks in machine learning has grown steadily since 2006, when the foundations of deep learning were laid. One of the factors that greatly affects the success of deep neural networks is their initialization. In this article, new initialization methods based on the Laplacian distribution are proposed. These methods aim to assign appropriate initial values to the network parameters so that the network trains better. Results of our methods on the University of California, Irvine (UCI) Human Activity Recognition and CIFAR-10 datasets are compared with networks initialized with well-known methods, such as Gaussian and uniform initialization, while the network architecture and layer structure are kept unchanged. Based on this comparison, the advantages of Laplacian-based initialization over existing methods are discussed in terms of test accuracy.
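
As a rough illustration of the idea, the sketch below draws layer weights from a zero-mean Laplace distribution. The scale rule used here is an assumption for illustration only: it matches the Glorot/Xavier weight variance of 2/(fan_in + fan_out), and since Var(Laplace(0, b)) = 2b², this gives b = sqrt(1/(fan_in + fan_out)). The paper's actual scaling may differ.

```python
import numpy as np

def laplace_init(fan_in, fan_out, rng=None):
    """Draw a weight matrix from a zero-mean Laplace distribution.

    Illustrative sketch only: the scale b is chosen so the weight variance
    matches Glorot/Xavier initialization, Var = 2 / (fan_in + fan_out).
    Since Var(Laplace(0, b)) = 2 * b**2, this gives
    b = sqrt(1 / (fan_in + fan_out)). The paper's scale rule may differ.
    """
    rng = np.random.default_rng() if rng is None else rng
    b = np.sqrt(1.0 / (fan_in + fan_out))
    return rng.laplace(loc=0.0, scale=b, size=(fan_in, fan_out))

# Example: initialize the weights of a 784 -> 256 fully connected layer.
W = laplace_init(784, 256)
print(W.shape, W.var())  # variance should be close to 2 / (784 + 256)
```

Compared with a Gaussian of equal variance, the Laplace distribution concentrates more mass near zero while producing heavier tails, so most initial weights are small with a few comparatively large ones.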