Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
2022
Existing low-rank tensor completion methods rely on various tensor decompositions and their associated tensor ranks to reconstruct missing information, exploiting the inherent low-rank structure under the assumption that the data is low-rank under one of these decompositions. However, this assumption is easily violated for real-world data, e.g., color images and multispectral images, whose low-rank structure is not significant. To better exploit the global correlation structure, we propose a kernel low-rank tensor completion model in which the original data is mapped into a feature space by a kernel mapping. Although the original data is high-rank, it is low-rank in the feature space owing to the kernel mapping; the proposed model can therefore exploit this implicitly low-rank structure in the feature space to estimate the missing entries well. Since it is difficult to kernelize the tensor explicitly, we reformulate the model in inner-product form and introduce the kernel trick to solve the resulting model efficiently. Extensive experiments on color images and multispectral images show that the proposed method outperforms state-of-the-art low-rank tensor completion methods.
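The "inner-product form plus kernel trick" idea mentioned in the abstract can be illustrated generically: inner products in a high-dimensional feature space can be computed directly in the input space, without ever forming the feature map explicitly. The sketch below is a minimal, generic demonstration with a degree-2 polynomial kernel (not the paper's actual model; the function names and the choice of kernel are illustrative assumptions):

```python
import numpy as np

def poly_kernel(X, Y, degree=2):
    """Polynomial kernel k(x, y) = (x . y + 1)^degree, evaluated in input space."""
    return (X @ Y.T + 1.0) ** degree

def explicit_poly2_features(X):
    """Explicit degree-2 feature map Phi such that Phi(x) . Phi(y) = (x . y + 1)^2."""
    n, d = X.shape
    feats = [np.ones(n)]                                   # constant term
    feats += [np.sqrt(2.0) * X[:, i] for i in range(d)]    # linear terms
    feats += [X[:, i] ** 2 for i in range(d)]              # squared terms
    feats += [np.sqrt(2.0) * X[:, i] * X[:, j]             # cross terms
              for i in range(d) for j in range(i + 1, d)]
    return np.stack(feats, axis=1)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

# Kernel trick: Gram matrix without the explicit feature map ...
K_trick = poly_kernel(X, X)
# ... matches the Gram matrix of the explicit high-dimensional features.
K_explicit = explicit_poly2_features(X) @ explicit_poly2_features(X).T
print(np.allclose(K_trick, K_explicit))  # → True
```

This is why a model written purely in terms of inner products of the (implicit) feature-space data can be solved with kernel evaluations alone, even when the feature space is very high-dimensional.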