Neural network for multi-class classification by boosting composite stumps

2015 
We put forward a new model for multi-class classification problems based on the neural network structure. The model employs weighted linear regression for feature selection and uses a boosting algorithm for ensemble learning. Unlike most previous algorithms, which must build a collection of binary classifiers independently, the method constructs a single strong classifier for all classes at once by minimizing the total error in a forward stagewise manner. In this work, a novel weak-learner framework called the composite stump is proposed to improve convergence speed and share features. With these optimization techniques, the classification problem is solved by a simple but effective classifier. Experiments show that the new method outperforms previous approaches on a number of data sets.

Highlights
    • A novel structure is proposed to improve convergence speed and share features.
    • An adaptive neural network model is presented for multi-class classification.
    • Linear functions are employed as the activation functions in the model.
    • A weighted linear regression with sparsity constraints is used for feature selection.
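As a rough illustration of the ideas in the abstract (not the authors' exact algorithm), the sketch below fits a multi-class additive model in a forward stagewise manner: each round selects one stump whose feature and threshold are shared across all classes, with per-class output values, so that a single classifier handles all classes at once. The function names (`fit_shared_stump`, `boost`, `predict`) and the squared-error fitting criterion are assumptions for this sketch, not details from the paper.

```python
# Hypothetical sketch: forward-stagewise multi-class boosting where each round
# fits ONE stump whose split (feature, threshold) is shared across all K
# classes, with per-class outputs -- a simplified analogue of the paper's
# "composite stump" feature-sharing idea.
import numpy as np

def fit_shared_stump(X, R):
    """Pick the (feature, threshold) pair minimizing squared error against the
    per-class residual matrix R (n x K); return per-class branch outputs."""
    n, d = X.shape
    best = None
    for j in range(d):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # degenerate split
            cl = R[left].mean(axis=0)   # per-class output on the left branch
            cr = R[~left].mean(axis=0)  # per-class output on the right branch
            pred = np.where(left[:, None], cl, cr)
            err = ((R - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, cl, cr)
    return best[1:]

def boost(X, y, K, rounds=20, lr=0.5):
    """Fit an additive model F (n x K) to one-hot targets, one stump per round."""
    Y = np.eye(K)[y]                 # one-hot encoding of class labels
    F = np.zeros_like(Y)
    stumps = []
    for _ in range(rounds):
        j, t, cl, cr = fit_shared_stump(X, Y - F)  # fit current residuals
        F += lr * np.where((X[:, j] <= t)[:, None], cl, cr)
        stumps.append((j, t, cl, cr))
    return stumps

def predict(stumps, X, lr=0.5):
    """Sum the stump outputs and take the argmax over classes."""
    F = sum(lr * np.where((X[:, j] <= t)[:, None], cl, cr)
            for j, t, cl, cr in stumps)
    return F.argmax(axis=1)
```

Because every stump shares one split across all classes, each round adds a single feature test rather than K independent ones, which is the feature-sharing benefit the abstract attributes to composite stumps.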