Robust Joint Feature Weights Learning Framework

2016 
Feature selection, i.e., selecting the most informative subset of features, is an important research direction in dimensionality reduction. The combinatorial search in feature selection is essentially a binary optimization problem, which is NP-hard; this difficulty can be alleviated by learning feature weights instead. Traditional feature-weighting algorithms rely on heuristic search paths. These approaches neglect the interaction and dependency between different features, and thus provide no guarantee of optimality. In this paper, we propose a novel joint feature weights learning framework, which imposes both nonnegativity and $\ell_{2,1}$-norm constraints on the feature weights matrix. The nonnegativity ensures the physical significance of the learned feature weights, while $\ell_{2,1}$-norm minimization achieves joint selection of the most relevant features by exploiting the whole feature space. More importantly, an efficient iterative algorithm with proven convergence is designed to optimize the convex objective function. Using this framework as a platform, we propose new supervised and unsupervised joint feature selection methods. In particular, in the proposed unsupervised method, nonnegative graph embedding is developed to exploit the intrinsic structure of the weighted space. Comparative experiments on seven real-world data sets indicate that our framework is both effective and efficient.
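To make the idea concrete, the sketch below illustrates one common way to optimize an objective of the kind the abstract describes: a least-squares fitting term plus an $\ell_{2,1}$-norm regularizer on the weight matrix, subject to nonnegativity. This is a generic illustration, not the paper's exact algorithm: the objective `||XW - Y||_F^2 + lam * ||W||_{2,1}` with `W >= 0`, the reweighted-surrogate trick for the $\ell_{2,1}$ term, and all function and parameter names (`joint_feature_weights`, `lam`, `lr`, etc.) are assumptions introduced here for exposition.

```python
import numpy as np

def l21_norm(W):
    # l2,1 norm: sum of the Euclidean norms of the rows of W;
    # penalizing it drives entire rows (features) to zero jointly.
    return np.sum(np.linalg.norm(W, axis=1))

def joint_feature_weights(X, Y, lam=0.5, lr=2e-3, n_iter=2000, eps=1e-8):
    """Illustrative sketch (not the paper's algorithm): minimize
    ||XW - Y||_F^2 + lam * ||W||_{2,1}  subject to  W >= 0,
    by projected gradient descent with the standard diagonal
    reweighting surrogate for the l2,1 term."""
    n, d = X.shape
    k = Y.shape[1]
    W = np.full((d, k), 0.01)  # small positive start inside the feasible set
    for _ in range(n_iter):
        # reweighting: the subgradient of ||W||_{2,1} at row i is
        # w_i / ||w_i||, smoothed as w_i / (2 * ||w_i|| + eps) * 2
        row_norms = np.linalg.norm(W, axis=1)
        D = 1.0 / (2.0 * row_norms + eps)
        grad = X.T @ (X @ W - Y) + lam * (D[:, None] * W)
        # gradient step, then projection onto the nonnegative orthant
        W = np.maximum(0.0, W - lr * grad)
    return W

# Usage sketch: features are then ranked by the row norms of W,
# so jointly irrelevant features receive (near-)zero rows.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Y = X[:, :2] @ np.array([[1.0], [2.0]])  # only features 0 and 1 are informative
W = joint_feature_weights(X, Y)
scores = np.linalg.norm(W, axis=1)
```

Ranking features by the learned row norms is what turns the weight matrix into a joint feature-selection rule: because the $\ell_{2,1}$ penalty couples all columns of a row, a feature is kept or discarded as a whole rather than per output dimension.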