Integrated Sparse Coding With Graph Learning for Robust Data Representation

2020 
Sparse coding is a popular technique for achieving compact data representations and has been used in many applications. However, its instability often degrades performance in practice and has therefore attracted considerable study. While traditional graph sparse coding preserves the neighborhood structure of the data, this study integrates low-rank representation (LRR) to remedy the inconsistency of sparse coding by preserving the subspace structure of the high-dimensional observations. The proposed method, dubbed low-rank graph regularized sparse coding (LogSC), learns sparse codes and low-rank representations jointly rather than in the traditional two-step manner. Since the two data representations share a dictionary matrix, the resulting sparse representation on this dictionary benefits from the LRR. We solve the optimization problem of LogSC using the linearized alternating direction method with adaptive penalty (LADMAP). Experimental results show that the proposed method learns discriminative features and is robust to various types of noise. This work provides a one-step approach to integrating graph embedding into representation learning.
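The abstract does not state the exact objective, but a joint formulation of this kind typically combines a reconstruction term, a sparsity penalty, a nuclear-norm (low-rank) penalty, and a graph Laplacian regularizer over a shared dictionary. A minimal sketch under these assumptions (the symbols X, D, S, Z, E, L and the weights are illustrative and not necessarily the paper's notation):

\[
\min_{D,\,S,\,Z,\,E}\;\; \|X - DS\|_F^2 \;+\; \lambda_1 \|S\|_1 \;+\; \lambda_2 \|Z\|_* \;+\; \lambda_3 \|E\|_{2,1} \;+\; \gamma\,\mathrm{tr}\!\left(S L S^{\top}\right)
\quad \text{s.t.}\quad X = DZ + E,
\]

where X holds the observations, D is the shared dictionary, S the sparse codes, Z the low-rank representation, E a sparse error term, and L the graph Laplacian built from the data's neighborhood structure. Under such a formulation, a LADMAP-style solver would update one block of variables at a time, linearizing the smooth terms and adapting the penalty parameter across iterations.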