Genetic algorithm based on new evaluation function and mutation model for training of BPNN

2012 
A local minimum is frequently encountered in the training of back propagation neural networks (BPNN), which sharply slows the training process. In this paper, an analysis of the formation of local minima is presented, and an improved genetic algorithm (GA) is introduced to overcome them. The Sigmoid function is generally used as the activation function of BPNN nodes, and it is the flat characteristic of the Sigmoid function that leads to the formation of local minima. In the improved GA, pertinent modifications are made to the evaluation function and the mutation model. The evaluation of a solution is associated with both the training error and the gradient, and the sensitivity of the error function to the network parameters is used to form a self-adapting mutation model. An industrial application example demonstrates the advantage of the improved GA in overcoming local minima.
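Since the abstract describes the improved GA only at a high level, the following Python sketch is one illustrative reading rather than the authors' implementation: the network architecture, the particular fitness formula combining training error with the gradient norm, and the inverse-sensitivity scaling of the mutation step are all assumed details.

```python
# Sketch of a GA for BPNN weights whose evaluation uses both error and gradient,
# and whose mutation step adapts to per-parameter sensitivity (assumed formulas).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(w, X, n_in, n_hid):
    # Unpack a flat weight vector into a 1-hidden-layer sigmoid network (biases omitted).
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, 1)
    return sigmoid(sigmoid(X @ W1) @ W2)

def error_and_gradient(w, X, y, n_in, n_hid, eps=1e-4):
    # MSE plus a finite-difference gradient, used both in the fitness
    # and as a per-parameter sensitivity measure for mutation.
    def mse(wv):
        return np.mean((forward(wv, X, n_in, n_hid).ravel() - y) ** 2)
    e = mse(w)
    grad = np.empty_like(w)
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        grad[i] = (mse(wp) - e) / eps
    return e, grad

def fitness(error, grad, alpha=0.05):
    # Illustrative evaluation: primarily rewards low error, with a small bonus for a
    # non-vanishing gradient, so a high-error individual sitting on a flat
    # (local-minimum-like) region is not ranked above one that can still improve.
    return 1.0 / (error + 1e-12) + alpha * np.linalg.norm(grad)

def adaptive_mutate(w, grad, base_sigma=0.1):
    # Assumed self-adapting mutation: parameters to which the error is less sensitive
    # (flatter surface) receive larger perturbations to help escape flat regions.
    sensitivity = np.abs(grad)
    scale = base_sigma / (1.0 + sensitivity / (sensitivity.mean() + 1e-12))
    return w + rng.normal(0.0, scale)

# Toy usage on XOR: evolve a small population of flat weight vectors.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
n_in, n_hid = 2, 4
dim = n_in * n_hid + n_hid
pop = rng.normal(0.0, 1.0, (20, dim))
for gen in range(200):
    evals = [error_and_gradient(ind, X, y, n_in, n_hid) for ind in pop]
    fits = np.array([fitness(e, g) for e, g in evals])
    order = np.argsort(-fits)                      # best individuals first
    parents = pop[order[:10]]
    children = [adaptive_mutate(p, evals[order[i]][1]) for i, p in enumerate(parents)]
    pop = np.vstack([parents, children])
best_err = min(error_and_gradient(ind, X, y, n_in, n_hid)[0] for ind in pop)
print(f"best training MSE after GA: {best_err:.4f}")
```

The gradient is computed here by finite differences purely for illustration; in a BPNN setting it would come from back propagation, and only its use in the fitness and in scaling the mutation step reflects the idea stated in the abstract.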