Global convergence of Negative Correlation Extreme Learning Machine

2021 
Ensemble approaches in the Extreme Learning Machine (ELM) literature mainly rely on data sampling procedures, under the assumption that the training data are heterogeneous enough to produce diverse base learners. To overcome this assumption, an ELM ensemble method based on the Negative Correlation Learning (NCL) framework, called Negative Correlation Extreme Learning Machine (NCELM), was proposed. This model works in three stages: (i) different ELMs are generated as base learners with random weights in the hidden layer; (ii) an NCL penalty term carrying the information of the ensemble prediction is introduced into each ELM minimization problem, updating the base learners; and (iii) the second stage is iterated until the ensemble converges. Although this NCL ensemble method was validated in an experimental study on multiple benchmark datasets, no conditions guaranteeing this convergence were given. This paper mathematically establishes sufficient conditions for the global convergence of NCELM. The update of the ensemble in each iteration is shown to be a contraction mapping, and global convergence of the ensemble is proved via the Banach fixed-point theorem.
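To make the three-stage scheme concrete, below is a minimal, hypothetical sketch of an NCELM-style loop. It assumes ridge-regularized ELM base learners and the simplified NCL penalty −λ‖Hβ − f̄‖² (which follows from the classical NCL diversity term when the ensemble output f̄ is held fixed during each learner's update); all names and parameters (n_hidden, C, lam, tol, etc.) are illustrative, not taken from the paper.

```python
import numpy as np

def ncelm_sketch(X, Y, S=5, n_hidden=50, C=1.0, lam=0.1,
                 max_iter=100, tol=1e-6, seed=None):
    """Illustrative NCELM-style iteration (hypothetical naming).

    Stage (i): S base ELMs with random, frozen hidden layers.
    Stages (ii)-(iii): re-solve each learner's NCL-penalized problem
    against the current ensemble prediction and repeat until the
    ensemble output stops changing (the fixed point whose existence
    the paper establishes via the Banach fixed-point theorem).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]

    # Stage (i): random hidden-layer weights, never retrained.
    Hs = []
    for _ in range(S):
        W = rng.standard_normal((d, n_hidden))
        b = rng.standard_normal(n_hidden)
        Hs.append(np.tanh(X @ W + b))  # hidden activations per learner

    # Initial ridge solutions, no NCL coupling yet.
    betas = [np.linalg.solve(H.T @ H + C * np.eye(n_hidden), H.T @ Y)
             for H in Hs]
    f_bar = np.mean([H @ B for H, B in zip(Hs, betas)], axis=0)

    # Stages (ii)-(iii): coupled updates until convergence.
    for _ in range(max_iter):
        for s, H in enumerate(Hs):
            # Assumed simplified problem per learner:
            #   min ||H b - Y||^2 + C ||b||^2 - lam ||H b - f_bar||^2,
            # whose normal equations are
            #   ((1 - lam) H^T H + C I) b = H^T (Y - lam * f_bar).
            A = (1.0 - lam) * H.T @ H + C * np.eye(n_hidden)
            betas[s] = np.linalg.solve(A, H.T @ (Y - lam * f_bar))
        f_new = np.mean([H @ B for H, B in zip(Hs, betas)], axis=0)
        if np.linalg.norm(f_new - f_bar) < tol:  # fixed point reached
            f_bar = f_new
            break
        f_bar = f_new
    return betas, f_bar
```

Note that with this penalty the map from one ensemble prediction f̄ to the next is affine in f̄, so for small enough lam (and C > 0) its Lipschitz constant drops below one; this mirrors the kind of sufficient condition on the penalty strength that the paper derives.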
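For reference, the convergence argument invoked in the abstract rests on the following generic statement of the Banach fixed-point theorem; in the paper's setting, T would be the map that sends one ensemble prediction to the next after all base learners are updated (notation here is generic, not the paper's):

```latex
% Banach fixed-point theorem: let (X, d) be a complete metric space
% and T : X -> X a contraction, i.e.
\[
  d\bigl(T(x), T(y)\bigr) \le q\, d(x, y),
  \qquad 0 \le q < 1, \quad \forall x, y \in X.
\]
% Then T has a unique fixed point x^*, and for any x_0 the iterates
% x_{k+1} = T(x_k) converge to it, with the a priori bound
\[
  d(x_k, x^*) \le \frac{q^k}{1 - q}\, d(x_1, x_0).
\]
```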