Regression and Multiclass Classification Using Sparse Extreme Learning Machine via Smoothing Group L1/2 Regularizer

2020 
Extreme learning machine (ELM) is a simple feedforward neural network that has been used extensively in applications for its extremely fast learning speed and good generalization performance. Nevertheless, it is normally implemented under the empirical risk minimization scheme, so the model trained by ELM is prone to overfitting. In addition, ELM uses more hidden nodes than it actually needs, which means that the network structure is not sparse enough. To solve these problems, two efficient algorithms for training ELM are proposed in this article: a group $L_{1/2}$ regularization method and a smoothing group $L_{1/2}$ regularization method. However, the basic group $L_{1/2}$ regularizer is nondifferentiable at the origin, which causes oscillation during training. We therefore modify the basic group $L_{1/2}$ regularizer by smoothing it at the origin. Simulation results show that ELM with smoothing group $L_{1/2}$ regularization can effectively prune redundant nodes and redundant weights of the surviving nodes, and that it performs better than traditional ELM, ELM with $L_1$ regularization, and ELM with group $L_{1/2}$ regularization.
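The idea in the abstract can be sketched in a few lines: train an ELM's output weights by gradient descent while penalizing the norm of each hidden node's outgoing weight vector (one group per node) raised to the power 1/2, with the norm smoothed near the origin so the penalty is differentiable everywhere. The sketch below uses the generic smoothing $\sqrt{\|\beta_j\|^2 + \varepsilon}$, which is an assumption for illustration; the paper's exact smoothing function, learning rates, and data sets are not given here.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_smooth_group_l12(X, Y, n_hidden=40, lam=1e-3, eps=1e-4,
                         lr=0.01, n_iter=3000):
    """ELM with a smoothed group L1/2 penalty on the output weights.
    One group = one hidden node's row of output weights, so a group
    driven to (near) zero prunes that node. Generic sketch, not the
    paper's exact formulation."""
    n, d = X.shape
    # Standard ELM: input weights and biases are random and stay fixed.
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix
    beta = 0.1 * rng.standard_normal((n_hidden, Y.shape[1]))
    for _ in range(n_iter):
        err = H @ beta - Y
        # Smoothed group norms g_j = sqrt(||beta_j||^2 + eps); the eps
        # term removes the nondifferentiability at the origin.
        g = np.sqrt((beta ** 2).sum(axis=1) + eps)
        # Gradient of lam * sum_j g_j^(1/2) w.r.t. beta_j:
        # lam * 0.5 * g_j^(-3/2) * beta_j
        pen_grad = 0.5 * g[:, None] ** (-1.5) * beta
        beta -= lr * (H.T @ err / n + lam * pen_grad)
    return W, b, beta

# Toy regression: fit y = sin(x) with a deliberately oversized ELM.
X = np.linspace(-3.0, 3.0, 200)[:, None]
Y = np.sin(X)
W, b, beta = elm_smooth_group_l12(X, Y)
mse = np.mean((np.tanh(X @ W + b) @ beta - Y) ** 2)
group_norms = np.linalg.norm(beta, axis=1)   # small norms flag prunable nodes
print(mse)
print(group_norms.min(), group_norms.max())
```

Because the penalty acts on whole groups rather than individual weights, nodes whose groups shrink toward zero can be removed outright, which is the structural sparsity the abstract refers to.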