Feature Selection for Neural Networks Using Group Lasso Regularization

2019 
We propose an embedded feature selection method based on neural networks with a Group Lasso penalty. Group Lasso regularization is used to produce sparsity on the inputs to the network, i.e., to select useful features. Lasso-based feature selection using a multi-layer perceptron usually requires an additional set of weights, while our Group Lasso formulation does not. However, the Group Lasso penalty is non-differentiable at the origin, which may cause oscillations in numerical experiments and complicates theoretical analysis. To address this issue, four smoothing Group Lasso penalties are discussed. A rigorous proof of the convergence of the proposed algorithm is presented under suitable assumptions. To verify its effectiveness, a three-step algorithmic architecture is adopted in the implementation. Experimental results on several datasets validate the theoretical results and demonstrate the good performance of the proposed method.
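The abstract does not give the penalty formulas, but the idea can be sketched as follows: group the first-layer weights of an MLP by input feature (one group per input neuron) and penalize the sum of the groups' Euclidean norms, driving whole rows to zero so the corresponding features are deselected. A minimal NumPy sketch, with a hypothetical smoothed variant of the form sqrt(||w_g||^2 + eps) standing in for the paper's (unspecified) four smoothing penalties:

```python
import numpy as np

def group_lasso_penalty(W, lam=0.1):
    """Group Lasso penalty on a first-layer weight matrix W of shape
    (n_features, n_hidden); each row is the group of all outgoing
    weights of one input feature. Zero rows contribute nothing, so
    sparsity at the row level performs feature selection."""
    return lam * np.sum(np.linalg.norm(W, axis=1))

def smoothed_group_lasso_penalty(W, lam=0.1, eps=1e-4):
    """Hypothetical smoothed variant: replacing ||w_g|| with
    sqrt(||w_g||^2 + eps) makes the penalty differentiable at the
    origin, avoiding the oscillations the abstract mentions. The
    paper discusses four such smoothings; this is only one common
    choice, shown for illustration."""
    return lam * np.sum(np.sqrt(np.sum(W**2, axis=1) + eps))

# Example: a feature whose row is entirely zero adds no penalty.
W = np.array([[3.0, 4.0],    # active feature, group norm 5
              [0.0, 0.0]])   # deselected feature, group norm 0
print(group_lasso_penalty(W, lam=1.0))  # row norms 5 + 0
```

In training, this penalty term would be added to the network's loss; with the smoothed variant, ordinary gradient descent applies without subgradient handling.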