Full Representation Data Embedding via Nonoverlapping Historical Features
2018
Data recycling, which reuses historical data to help present data achieve better performance, is an emerging and important research topic. A common case is that historical examples have features from only one source, whereas more data collection channels are now available, so several types of features are extracted simultaneously for new examples. Previous studies assume either that historical data appear in all sources or that at least one type of representation is available for all data. In this paper, we study the challenging problem posed by the above common case and propose a novel semisupervised approach that leverages nonoverlapping historical features (NHFs). It learns full representations of both historical features and present features in a latent subspace. We exploit the intrinsic geometrical structure of all data and impose the label information of the historical data as a hard constraint to discover this latent subspace; classification is then performed on the new representations. Moreover, we provide an efficient algorithm for the resulting optimization problem with proved convergence behavior, together with insightful discussions on parameter determination. Experimental results on real-world data sets examine the effectiveness of our algorithm, and we further evaluate our method on face recognition. All results demonstrate the effectiveness of the proposed approach in recycling NHFs.
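The abstract describes a latent-subspace construction that couples a graph-based smoothness term over all samples with a hard label constraint on the historical portion. The sketch below is a minimal illustration of that general idea, assuming a matrix-factorization objective with a graph Laplacian regularizer and a constrained-NMF-style label constraint; the function `learn_latent_subspace`, all variable names, and the specific update rules are assumptions made for illustration, not the paper's actual formulation.

```python
# Illustrative sketch: historical samples carry only view-1 features, new samples
# only view-2 features; a shared latent code V couples the two views.
# The objective (reconstruction + graph smoothness + hard label constraint) is an
# assumed stand-in for the paper's formulation, not a reproduction of it.
import numpy as np

def learn_latent_subspace(X_hist, X_new, W, F_labels, k=10, lam=1.0,
                          step=1e-3, n_iter=200, seed=0):
    """
    X_hist   : (d1, n1) historical samples, view-1 features only
    X_new    : (d2, n2) new samples, view-2 features only
    W        : (n, n)   affinity graph over all n = n1 + n2 samples
    F_labels : (n1, c)  one-hot labels of historical samples (hard constraint)
    Returns V of shape (n, k): full latent representations of all samples.
    """
    n1, n2 = X_hist.shape[1], X_new.shape[1]
    n = n1 + n2
    rng = np.random.default_rng(seed)
    U1 = rng.random((X_hist.shape[0], k))   # view-1 basis
    U2 = rng.random((X_new.shape[0], k))    # view-2 basis
    V = rng.random((n, k))                  # shared latent codes for all samples
    L = np.diag(W.sum(axis=1)) - W          # graph Laplacian encoding geometry

    for _ in range(n_iter):
        V1, V2 = V[:n1], V[n1:]
        # least-squares updates of the per-view bases given the current codes
        U1 = X_hist @ V1 @ np.linalg.pinv(V1.T @ V1)
        U2 = X_new @ V2 @ np.linalg.pinv(V2.T @ V2)
        # gradient step on V: reconstruction error per view + tr(V^T L V) smoothness
        grad = lam * (L @ V)
        grad[:n1] += (U1 @ V1.T - X_hist).T @ U1
        grad[n1:] += (U2 @ V2.T - X_new).T @ U2
        V -= step * grad
        # hard label constraint: pin the label-aligned coordinates of labeled
        # historical samples to their one-hot labels (simplified illustration)
        c = F_labels.shape[1]
        if k >= c:
            V[:n1, :c] = F_labels
    return V
```

With full representations V in hand, any off-the-shelf classifier (e.g., a linear SVM trained on the historical rows of V) could be applied to the rows corresponding to new samples, which mirrors the abstract's statement that classification is performed on the learned representations.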