Training neural network with chaotic learning rate

2011 
Convergence to local minima is an inherent problem in neural network (NN) training. To alleviate it, a modification of the standard backpropagation (BP) algorithm, called BPCL, is proposed for training NNs. When training becomes trapped in a local minimum, the weights of the NN stop changing. If a chaotic variation of the learning rate (LR) is introduced during training, weight updates may be accelerated out of the local-minimum zone. In addition, biological NNs exhibit chaos. For these reasons, BPCL generates a chaotic time series with the logistic map, and a rescaled version of this series is used as the LR during BP training. BPCL is tested on six real-world benchmark classification problems: breast cancer, diabetes, heart disease, Australian credit card, horse, and glass. BPCL outperforms BP in both generalization ability and convergence rate.
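The abstract's core mechanism can be sketched as follows: iterate the logistic map to produce a chaotic sequence, then linearly rescale it into a learning-rate range for each training epoch. This is a minimal illustration; the map parameter `r`, the seed `x0`, and the rescaling bounds are assumptions, since the paper's exact constants are not given here.

```python
def chaotic_learning_rates(n_epochs, x0=0.37, r=4.0, lr_min=0.01, lr_max=1.0):
    """Generate a chaotic series with the logistic map x_{n+1} = r*x_n*(1-x_n)
    and rescale it to [lr_min, lr_max] for use as a per-epoch learning rate.

    r = 4.0 places the logistic map in its fully chaotic regime; x0 is an
    arbitrary seed in (0, 1) (both are illustrative assumptions)."""
    rates = []
    x = x0
    for _ in range(n_epochs):
        x = r * x * (1.0 - x)                          # logistic map iteration
        rates.append(lr_min + (lr_max - lr_min) * x)   # rescale into LR range
    return rates
```

In a BP training loop, the k-th epoch would simply use `rates[k]` in place of a fixed learning rate, so the step size fluctuates chaotically rather than staying constant.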