Suitable Activity Function of Neural Networks for Data Enlargement

2018 
In this paper, we present a study of activity functions for multi-layered neural networks (MLNNs) and propose a suitable activity function for data enlargement (DE). We carefully studied the training performance of the Sigmoid, ReLU, Leaky-ReLU, and L&exp. activity functions on three-input to multiple-output training patterns. Our MLNN model has L hidden layers and maps two inputs to four or six outputs, trained by backpropagation (BP). We focused on multiple teacher training signals in order to investigate and evaluate training performance in MLNNs and to select the activity function best suited to data enlargement, which could then be applied to image and signal processing (synaptic divergence). Among the four activity functions studied, we found that the L&exp. activity function suits data-enlargement neural network training (DENN), since it achieved the highest training-success percentage compared with Sigmoid, ReLU, and Leaky-ReLU during simulation and training of data in the network. Finally, we recommend the L&exp. function for MLNNs; because of its training performance characteristics on multiple training logic patterns, it may also be applicable to signal processing for data and information enlargement, and hence to convolutional neural networks (CNNs).
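The setting described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it defines the three standard activity functions compared in the study (the L&exp. function is not specified in the abstract, so it is omitted) and trains a small two-input, four-output network by backpropagation, a toy stand-in for the paper's data-enlargement configuration. The hidden-layer size, learning rate, and target patterns are assumptions chosen only for demonstration.

```python
import numpy as np

# The three standard activity (activation) functions compared in the study.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

# Toy data-enlargement pattern: 2 inputs expanded to 4 teacher outputs.
# These logic patterns are hypothetical, not taken from the paper.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0, 0, 0, 1],
              [0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)   # one hidden layer of 8 units (assumed)
W2 = rng.normal(0.0, 0.5, (8, 4)); b2 = np.zeros(4)
lr = 0.5

loss0 = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - T) ** 2)

# Full-batch backpropagation with MSE loss and sigmoid layers.
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    d2 = (y - T) * y * (1.0 - y)          # output-layer delta
    d1 = (d2 @ W2.T) * h * (1.0 - h)      # hidden-layer delta
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

loss = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - T) ** 2)
```

Swapping `sigmoid` for `relu` or `leaky_relu` in the hidden layer (with the matching derivative in `d1`) is how the four functions' training abilities would be compared in practice.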