A novel global training algorithm and its convergence theorem for fuzzy neural networks

1995 
In this paper, a new global optimization algorithm that combines a modified quasi-Newton method with an improved genetic algorithm is proposed to find the global minimum of the total error function of a fuzzy neural network. A global linear search algorithm based on fuzzy logic and combinatorial interpolation techniques is developed within the modified quasi-Newton method. It is shown that the algorithm converges to a global minimum with probability 1 in a compact region of the weight vector space. Computer simulations further show that the algorithm has better convergence properties and that the number of global searches is markedly reduced.
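To make the hybrid scheme concrete, the following is a minimal sketch (not the authors' exact method) of a trainer that alternates genetic-algorithm exploration of the weight space with quasi-Newton (BFGS) local refinement of each candidate. The error function `total_error`, the population size, the mutation scale, and the bound on the search region are illustrative assumptions standing in for the fuzzy neural network's total error and the compact weight region discussed above.

```python
import numpy as np
from scipy.optimize import minimize

def total_error(w):
    # Placeholder for the fuzzy neural network's total error over the training set.
    return np.sum((w - 0.3) ** 2) + 0.5 * np.sum(np.sin(5 * w) ** 2)

def hybrid_train(dim=8, pop_size=20, generations=30, bound=2.0, seed=0):
    rng = np.random.default_rng(seed)
    # Initial population sampled from a compact region of the weight space.
    pop = rng.uniform(-bound, bound, size=(pop_size, dim))
    for _ in range(generations):
        # Local refinement: quasi-Newton (BFGS) polish of every candidate.
        pop = np.array([minimize(total_error, w, method="BFGS").x for w in pop])
        fitness = np.array([total_error(w) for w in pop])
        # Selection: keep the better half of the population.
        elite = pop[np.argsort(fitness)[: pop_size // 2]]
        # Crossover and mutation regenerate the other half (global exploration).
        parents = elite[rng.integers(0, len(elite), size=(pop_size - len(elite), 2))]
        children = parents.mean(axis=1) + rng.normal(0.0, 0.2, size=(pop_size - len(elite), dim))
        pop = np.vstack([elite, children])
    best = min(pop, key=total_error)
    return best, total_error(best)

if __name__ == "__main__":
    w_star, err = hybrid_train()
    print("best error found:", err)
```

The design choice mirrored here is that the genetic operators supply global coverage of the weight region while the quasi-Newton step supplies fast local convergence, so fewer global search passes are needed than with either method alone.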