Fast Converging Magnified Weighted Sum Backpropagation

2021 
In backpropagation (BP), a neuron's output falls into either a weakly committed or a strongly committed zone. If the neuron's weighted sum, referred to as the net, is close to zero, the neuron is weakly committed; otherwise it is strongly committed. Pushing weakly committed neurons into the strongly committed zone requires additional iterations, which leads to a poor convergence rate. In this manuscript, the weighted-sum term of backpropagation is magnified. This variant of backpropagation is referred to as the magnified weighted sum backpropagation (MNBP) algorithm. The net-enlarging step of MNBP ensures that the neuron produces its output in the strongly committed zone. Because the net is magnified, its gradient is magnified as well. As a result, MNBP needs fewer epochs to converge than standard backpropagation, but it may give rise to the flat spot problem. Hence, the flat spot problem is also studied, and appropriate measures are taken in the proposed algorithm to address it. Experiments are carried out on parity problems (two-bit, three-bit, and five-bit), an encoder problem, and standard benchmark problems. The outcomes are compared with standard BP and two of its variants, the Fahlman approach and MGFPROP. Based on the experiments carried out here, it is concluded that MNBP needs fewer epochs for convergence than standard BP and its two variants.
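To make the idea concrete, the following is a minimal sketch, not the paper's exact formulation: a small sigmoid network trained on the two-bit parity (XOR) task in which the net of each layer is scaled by an assumed magnification factor k > 1 before the activation, and the same factor appears in the derivative used for the weight update. The value of k, the network size, and the learning rate are all illustrative assumptions.

```python
# Hypothetical sketch of "magnified net" backpropagation (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)
k = 2.0    # assumed magnification factor; the paper does not fix a value here
lr = 0.5   # learning rate (illustrative)

# Two-bit parity (XOR) task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 3))   # input -> hidden weights
b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass: each net is magnified by k, pushing outputs toward 0 or 1
    # (the strongly committed zone).
    h = sigmoid(k * (X @ W1 + b1))
    y = sigmoid(k * (h @ W2 + b2))

    err = y - T
    if np.mean(err ** 2) < 1e-3:
        print(f"converged at epoch {epoch}")
        break

    # Backward pass: the factor k also magnifies each gradient term.
    delta2 = err * k * y * (1 - y)
    delta1 = (delta2 @ W2.T) * k * h * (1 - h)

    W2 -= lr * (h.T @ delta2)
    b2 -= lr * delta2.sum(axis=0)
    W1 -= lr * (X.T @ delta1)
    b1 -= lr * delta1.sum(axis=0)

print("outputs:", y.ravel().round(3))
```

Note that when k is large the sigmoid derivative y(1 - y) can become very small for saturated units, which is the flat spot behaviour the abstract refers to; the paper's actual countermeasure is not reproduced in this sketch.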