Self-Taught Semi-Supervised Dictionary Learning with Non-Negative Constraint
2019
This paper investigates classification by dictionary learning. A novel unified framework termed self-taught semi-supervised dictionary learning with non-negative constraint (NNST-SSDL) is proposed for simultaneously optimizing the components of a dictionary and a graph Laplacian. Specifically, an atom graph Laplacian regularization is built from the sparse coefficients to effectively capture the underlying manifold structure. Because atoms are more concise and representative than training samples, this regularization is more robust to noisy samples and outliers. A non-negative constraint imposed on the sparse coefficients guarantees that each sample is represented as a non-negative combination of its associated atoms, which makes the dependency between samples and atoms explicit. Furthermore, a self-taught mechanism is introduced to feed back both the manifold structure induced by the atom graph Laplacian regularization and the supervised information hidden in unlabeled samples, in order to learn a better dictionary. An efficient algorithm, combining a block coordinate descent method with the alternating direction method of multipliers, is derived to optimize the unified framework. Experimental results on several benchmark datasets show the effectiveness of the proposed model.
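To make the two core ingredients concrete, the following is a minimal sketch of (a) sparse coding under a non-negative constraint and (b) building a graph Laplacian over atoms from the resulting codes. This is an illustration only: it uses a simple projected proximal-gradient solver as a stand-in for the paper's ADMM/block-coordinate-descent algorithm, and the co-activation affinity `A @ A.T` is an assumed reading of the atom-graph construction, not the paper's exact formula.

```python
import numpy as np

def nn_sparse_code(X, D, lam=0.01, n_iter=500):
    """Approximately solve  min_{A >= 0}  0.5*||X - D A||_F^2 + lam*||A||_1
    by projected proximal gradient (a simple stand-in for the paper's ADMM solver).

    X: (d, n) data matrix, D: (d, k) dictionary with unit-norm atoms.
    Returns A: (k, n) non-negative sparse coefficients.
    """
    k, n = D.shape[1], X.shape[1]
    A = np.zeros((k, n))
    # Step size 1/L, where L = ||D^T D||_2 is the Lipschitz constant
    # of the gradient of the smooth (least-squares) term.
    step = 1.0 / np.linalg.norm(D.T @ D, 2)
    for _ in range(n_iter):
        grad = D.T @ (D @ A - X)
        # Gradient step, l1 shrinkage, and projection onto the
        # non-negative orthant, all fused into one max().
        A = np.maximum(A - step * (grad + lam), 0.0)
    return A

def atom_graph_laplacian(A):
    """Combinatorial graph Laplacian over atoms, with affinity taken as
    co-activation of atoms in the sparse codes (an assumed construction)."""
    W = A @ A.T                      # (k, k) atom co-activation affinity
    np.fill_diagonal(W, 0.0)         # no self-loops
    return np.diag(W.sum(axis=1)) - W
```

With a synthetic dictionary and non-negative sparse ground-truth codes, `nn_sparse_code` recovers non-negative coefficients that substantially reduce the reconstruction residual, and the returned Laplacian is symmetric with zero row sums, as expected for a graph Laplacian.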