A New Algorithm for Initialization and Training of Beta Multi-Library Wavelets Neural Network

2008 
The resolution of neural network training problems by gradient descent is characterized by a well-known inability to escape local optima [Mich93], [Fabr94] and, to a lesser extent, by slowness [Wess92], [Zhan92]. Evolutionary algorithms have brought a large number of solutions in some domains: training of networks with variable architecture [With90], and automatic generation of Boolean neural networks for the resolution of a class of optimization problems [Grua93]. However, research effort has been devoted mainly to the generation and training of discrete networks. In this chapter, we propose a new gradient-based training algorithm for wavelet networks that requires:
• A set of training examples: wavelet networks are parameterized functions, used to build statistical models from examples (in the case of classification) or from measurements (in the case of modeling); their parameters are computed from these {input, output} pairs.
• The definition of a cost function that measures the gap between the output of the wavelet network and the desired output (in the case of classification) or the measured values (in the case of modeling) over the training set.
• A minimization algorithm for the cost function (both ingredients are illustrated in the first sketch after this list).
• A selection algorithm that chooses basis functions to initialize the network parameters (see the second sketch below).
We then try to show the importance of the initialization of the network parameters. Since the output is nonlinear with respect to these parameters, the cost function can present local minima, and training algorithms give no guarantee of finding the global minimum. We note that with a good initialization the local-minimum problem can be avoided: it suffices to select the best regressors (the best with respect to the training data) from a finite set of regressors. If the number of regressors is insufficient, not only do local minima appear, but the global minimum of the cost function does not necessarily correspond to the values of the sought parameters; in that case it is useless to deploy an expensive algorithm to search for the global minimum. With a good initialization of the network parameters, the efficiency of training increases. A very important factor must be underlined: whatever the chosen algorithm, the quality of wavelet network training improves with the quality of the initialization.
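The following is a minimal sketch of the first three ingredients: a parameterized wavelet network, a quadratic cost over the training set, and a gradient-based minimization loop. It is not the chapter's algorithm: the Mexican-hat mother wavelet stands in for the Beta multi-library wavelets (whose closed form is not reproduced here), the finite-difference gradient replaces whatever analytic gradient the authors use, and all names (mexican_hat, network_output, cost, train) and hyperparameters are illustrative assumptions.

```python
import numpy as np

def mexican_hat(t):
    """Stand-in mother wavelet; the chapter's networks draw on a
    multi-library of Beta wavelets, not reproduced here."""
    return (1.0 - t**2) * np.exp(-0.5 * t**2)

def network_output(x, w, a, b):
    """Wavelet network output: a weighted sum of dilated and translated
    wavelets. x: (N,) inputs; w, a, b: (M,) weights, dilations, translations."""
    a = np.abs(a) + 1e-3                       # keep dilations away from zero
    t = (x[:, None] - b[None, :]) / a[None, :]
    return mexican_hat(t) @ w

def cost(params, x, y, m):
    """Quadratic cost: the gap between the network output and the desired
    output, averaged over the training set of {input, output} pairs."""
    w, a, b = params[:m], params[m:2 * m], params[2 * m:]
    r = network_output(x, w, a, b) - y
    return 0.5 * np.mean(r**2)

def train(params, x, y, m, lr=0.05, steps=2000, eps=1e-6):
    """Plain gradient descent on the cost; gradients are taken by central
    finite differences only to keep the sketch short."""
    for _ in range(steps):
        g = np.zeros_like(params)
        for i in range(params.size):
            d = np.zeros_like(params)
            d[i] = eps
            g[i] = (cost(params + d, x, y, m)
                    - cost(params - d, x, y, m)) / (2 * eps)
        params = params - lr * g
    return params

# Illustrative use: fit noisy samples of a 1-D signal with M = 8 wavelets.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = np.sin(2.0 * x) + 0.05 * rng.normal(size=x.size)
m = 8
params0 = np.concatenate([rng.normal(size=m) * 0.1,    # weights
                          np.ones(m),                  # dilations
                          np.linspace(-3.0, 3.0, m)])  # translations
params = train(params0, x, y, m)
print("final cost:", cost(params, x, y, m))
```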
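The second sketch illustrates the fourth ingredient: initializing the network by selecting the best regressors from a finite library, rather than starting from random dilations and translations. It reuses mexican_hat, train, and the data (x, y, m) from the sketch above. The dyadic-style grid, the greedy correlation criterion, and the helper names (build_library, select_regressors) are assumptions for illustration; the chapter's exact Beta-library construction and selection rule may differ.

```python
def build_library(x, dilations=(0.5, 1.0, 2.0),
                  centers=np.linspace(-3.0, 3.0, 9)):
    """Finite library of candidate wavelets (regressors) on a coarse grid
    of dilations and translations; the grid is an illustrative choice."""
    cands = [(a, b) for a in dilations for b in centers]
    Phi = np.column_stack([mexican_hat((x - b) / a) for a, b in cands])
    return Phi, cands

def select_regressors(x, y, m):
    """Greedily pick the m library wavelets that best explain the training
    data (a stepwise least-squares fit of the residual); their dilations,
    translations, and fitted weights initialize the network parameters."""
    Phi, cands = build_library(x)
    chosen, residual = [], y.astype(float).copy()
    for _ in range(m):
        # Candidate most correlated with what the current model misses.
        scores = np.abs(Phi.T @ residual) / (np.linalg.norm(Phi, axis=0) + 1e-12)
        scores[chosen] = -np.inf                  # never pick one twice
        chosen.append(int(np.argmax(scores)))
        # Refit the weights of all chosen regressors, update the residual.
        w, *_ = np.linalg.lstsq(Phi[:, chosen], y, rcond=None)
        residual = y - Phi[:, chosen] @ w
    a0 = np.array([cands[k][0] for k in chosen])
    b0 = np.array([cands[k][1] for k in chosen])
    return np.concatenate([w, a0, b0])            # layout expected by train()

# Initialize from the library, then refine by gradient training: the
# gradient descent now starts from regressors already fitted to the data.
params0 = select_regressors(x, y, m)
params = train(params0, x, y, m)
```

In this scheme the costly nonlinear search only fine-tunes parameters that already lie near a good solution, which is precisely the argument made above: with a good initialization drawn from a finite set of regressors, the local-minimum problem largely disappears.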