Low‐rank approximation of tensors via sparse optimization

2018 
The goal of this paper is to find a low-rank approximation of a given tensor. Specifically, we give a computable strategy for estimating the rank of a given tensor by approximating the solution to an NP-hard problem. We formulate a sparse optimization problem via $l_1$-regularization to find a low-rank approximation of tensors. To solve this sparse optimization problem, we propose a rescaling algorithm based on proximal alternating minimization and study its theoretical convergence. Furthermore, we discuss the probabilistic consistency of the sparsity result and suggest a way to choose the regularization parameter for practical computation. Simulation experiments show that our method provides an efficient estimate of the number of rank-one components in a given tensor. We also apply the algorithm to low-rank approximation of surveillance videos.
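As a rough illustration of the kind of formulation described above (the exact objective, constraints, and notation here are assumptions, not taken from the paper), an $l_1$-regularized low-rank approximation of a third-order tensor $\mathcal{T}$ can be written as
$$
\min_{\lambda \in \mathbb{R}^R,\ \|u_r\|=\|v_r\|=\|w_r\|=1}\;
\frac{1}{2}\Bigl\| \mathcal{T} - \sum_{r=1}^{R} \lambda_r\, u_r \otimes v_r \otimes w_r \Bigr\|_F^2
+ \mu\,\|\lambda\|_1 ,
$$
where $R$ is an over-estimate of the rank, $\mu > 0$ is the regularization parameter, and the number of nonzero weights $\lambda_r$ at a solution serves as an estimate of the number of rank-one components. In an alternating scheme of this type, the update of $\lambda$ typically reduces to a soft-thresholding (proximal) step, which is what drives components to exactly zero and yields the sparsity used to read off the rank.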