Neural network for multi-class classification by boosting composite stumps

2015 
We put forward a new model for multi-class classification problems based on the neural network structure. The model employs weighted linear regression for feature selection and uses a boosting algorithm for ensemble learning. Unlike most previous algorithms, which must build a collection of binary classifiers independently, our method constructs a single strong classifier for all classes at once by minimizing the total error in a forward stagewise manner. We also propose a novel weak-learner framework, the composite stump, which improves convergence speed and shares features across classes. With these optimization techniques, the classification problem is solved by a simple but effective classifier. Experiments show that the new method outperforms previous approaches on a number of data sets.
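
The abstract does not spell out the composite-stump learner or the exact loss being minimized, so the following is only a rough sketch of the general idea it describes: forward stagewise boosting in which each round adds one axis-aligned stump whose two leaves emit a vector of per-class scores, so a single split (feature) is shared by all classes rather than training one binary classifier per class. The stump search, the squared-error fit to one-hot targets, and the synthetic demo data are all assumptions made for illustration, not the paper's method.

    # Minimal sketch (assumed, not the paper's algorithm): multi-class boosting
    # of shared-split stumps in a forward stagewise manner.
    import numpy as np


    def fit_stump(X, R):
        """Pick the (feature, threshold) stump minimizing the total squared error
        of the residual matrix R (n_samples x n_classes).
        Each leaf outputs the mean residual vector of the samples it contains."""
        best, best_err = None, np.inf
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                left = X[:, j] <= thr
                right = ~left
                if not left.any() or not right.any():
                    continue
                vl = R[left].mean(axis=0)    # per-class output of the left leaf
                vr = R[right].mean(axis=0)   # per-class output of the right leaf
                err = ((R[left] - vl) ** 2).sum() + ((R[right] - vr) ** 2).sum()
                if err < best_err:
                    best_err, best = err, (j, thr, vl, vr)
        return best


    def boost(X, y, n_rounds=50, lr=0.5):
        """Forward stagewise ensemble: scores start at zero and each round adds a
        shared-split stump fitted to the current residuals of the one-hot targets."""
        classes = np.unique(y)
        Y = (y[:, None] == classes[None, :]).astype(float)   # one-hot targets
        F = np.zeros_like(Y)                                  # additive class scores
        ensemble = []
        for _ in range(n_rounds):
            j, thr, vl, vr = fit_stump(X, Y - F)
            pred = np.where((X[:, j] <= thr)[:, None], vl, vr)
            F += lr * pred
            ensemble.append((j, thr, lr * vl, lr * vr))
        return classes, ensemble


    def predict(X, classes, ensemble):
        """Sum the stump outputs and return the class with the largest score."""
        F = np.zeros((X.shape[0], len(classes)))
        for j, thr, vl, vr in ensemble:
            F += np.where((X[:, j] <= thr)[:, None], vl, vr)
        return classes[F.argmax(axis=1)]


    if __name__ == "__main__":
        # Hypothetical 3-class demo data, only to show the API.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 4))
        y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 0.5).astype(int)
        classes, ens = boost(X, y)
        print("train accuracy:", (predict(X, classes, ens) == y).mean())

Because every stump's split is shared by all classes and only its leaf vectors differ per class, the ensemble grows one classifier for the whole multi-class problem, in contrast to one-vs-rest schemes that train a separate binary booster per class.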