Robust Sparse Coding via Self-Paced Learning for Data Representation

2020 
Abstract Sparse coding (SC), owing to its thorough theoretical grounding and outstanding effectiveness, is attracting growing attention in various data representation and data mining applications. However, the optimization of most existing sparse coding algorithms is non-convex and thus prone to getting stuck in bad local minima under the framework of alternating optimization, especially when there are many outliers and noisy data. To enhance learning robustness, in this study we present a unified framework named Self-Paced Sparse Coding (SPSC), which gradually includes data into the learning process of SC, from easy samples to complex ones, by incorporating the self-paced learning methodology. It implements a soft instance selection rather than a heuristic hard sample-selection strategy. We also generalize the self-paced learning scheme to different levels of dynamic selection, on instances, features, and elements respectively. Further, we present an optimization algorithm to solve the model and a theoretical analysis of its effectiveness. Extensive experimental results on real-world clean image datasets and on images with two kinds of corruption demonstrate the remarkable robustness of the proposed method for high-dimensional data representation on image clustering and reconstruction tasks over state-of-the-art methods.
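The abstract describes alternating between a (weighted) sparse coding step and a soft self-paced weight update that admits samples from easy to hard as an "age" parameter grows. The following is a minimal illustrative sketch of that idea, not the paper's actual SPSC algorithm: the ridge-regularized coding step stands in for the true sparse (e.g. l1) step, the linear soft weighting is one common self-paced regularizer, and all parameter names (`lam0`, `growth`, `alpha`) are assumptions for illustration.

```python
import numpy as np

def soft_spl_weights(losses, lam):
    # Linear soft self-paced weighting: easy samples (low loss) get a weight
    # near 1, samples with loss >= lam are excluded (weight 0).
    return np.clip(1.0 - losses / lam, 0.0, 1.0)

def self_paced_sparse_coding(X, D, lam0=1.0, growth=1.5, n_rounds=4, alpha=0.1):
    # Alternate between (a) a per-sample weighted coding step and (b) a
    # self-paced weight update, gradually raising the age parameter lam so
    # that harder samples are admitted in later rounds.
    # X: (d, n) data matrix; D: (d, k) dictionary (assumed fixed here).
    n = X.shape[1]
    k = D.shape[1]
    v = np.ones(n)               # soft instance weights, start with all data
    lam = lam0
    Z = np.zeros((k, n))         # codes
    for _ in range(n_rounds):
        # (a) weighted ridge coding step (stand-in for the sparse step):
        #     z_i = argmin_z v_i * ||x_i - D z||^2 + alpha * ||z||^2
        for i in range(n):
            A = v[i] * D.T @ D + alpha * np.eye(k)
            Z[:, i] = np.linalg.solve(A, v[i] * D.T @ X[:, i])
        # per-sample reconstruction losses drive the easiness measure
        losses = np.sum((X - D @ Z) ** 2, axis=0)
        # (b) soft instance selection, then "age" the model pace
        v = soft_spl_weights(losses, lam)
        lam *= growth
    return Z, v
```

Samples whose loss exceeds the current `lam` receive zero weight and are effectively excluded from the next coding round, which is how outliers are kept from dominating the non-convex optimization early on.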