Robust Low-rank Tensor Decomposition with the L2 Criterion
0 Citations · 0 References · 10 Related Papers
Abstract:
The growing prevalence of tensor data, or multiway arrays, in science and engineering applications motivates the need for tensor decompositions that are robust against outliers. In this paper, we present a robust Tucker decomposition estimator based on the L2 criterion, called the Tucker-L2E. Our numerical experiments demonstrate that Tucker-L2E has empirically stronger recovery performance in more challenging high-rank scenarios compared with existing alternatives. The appropriate Tucker-rank can be selected in a data-driven manner with cross-validation or hold-out validation. The practical effectiveness of Tucker-L2E is validated on real data applications in fMRI tensor denoising, PARAFAC analysis of fluorescence data, and feature extraction for classification of corrupted images.
Keywords: Rank (graph theory)
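The Tucker-L2E estimator itself replaces the usual least-squares fit with the robust L2 criterion; that algorithm is described in the paper, not here. Purely as a point of reference, the sketch below is a minimal NumPy illustration of the (non-robust) Tucker format, computed with a truncated higher-order SVD on synthetic data with an assumed Tucker rank of (3, 3, 3).

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining axes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Mode-n product T x_n M, where M has shape (new_dim, T.shape[mode])."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T, ranks):
    """Truncated higher-order SVD: a standard, non-robust Tucker approximation."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for mode, U in enumerate(factors):
        core = mode_multiply(core, U.T, mode)
    return core, factors

def reconstruct(core, factors):
    T = core
    for mode, U in enumerate(factors):
        T = mode_multiply(T, U, mode)
    return T

rng = np.random.default_rng(0)
# Synthetic 20 x 20 x 20 tensor with Tucker rank (3, 3, 3) plus dense Gaussian noise.
G = rng.standard_normal((3, 3, 3))
Us = [np.linalg.qr(rng.standard_normal((20, 3)))[0] for _ in range(3)]
X = reconstruct(G, Us) + 0.01 * rng.standard_normal((20, 20, 20))

core, factors = hosvd(X, ranks=(3, 3, 3))
rel_err = np.linalg.norm(X - reconstruct(core, factors)) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.3e}")
```

In the spirit of the hold-out validation mentioned in the abstract, one could rerun `hosvd` over a grid of candidate ranks and keep the rank that best reconstructs held-out entries.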
The low rank tensor approximation problem (LRTAP) is to find a tensor whose rank is small and that is close to a given one. This paper studies the LRTAP when the tensor to be approximated is close to a low rank one. Both symmetric and nonsymmetric tensors are discussed. We propose a new approach for solving the LRTAP. It consists of three major stages: i) Find a set of linear relations that are approximately satisfied by the tensor; such linear relations can be expressed by polynomials and can be found by solving linear least squares. ii) Compute a set of points that are approximately common zeros of the obtained polynomials; they can be found by computing Schur decompositions. iii) Construct a low rank approximating tensor from the obtained points; this can be done by solving linear least squares. Our main conclusion is that if the given tensor is sufficiently close to a low rank one, then the computed tensor is a good enough low rank approximation. This approach can also be applied to efficiently compute low rank tensor decompositions, especially for large scale tensors.
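As a rough illustration of stage i) only, the sketch below recovers approximate linear relations among the slices of a noisy low-CP-rank tensor from the trailing left singular vectors of its mode-1 unfolding; the polynomial bookkeeping and the Schur-decomposition stage ii) are not reproduced, and the SVD here merely stands in for the least-squares formulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 8, 3
# Order-3 tensor of CP rank 3, mildly perturbed by noise.
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C) + 1e-3 * rng.standard_normal((n, n, n))

# Mode-1 unfolding: row i is the vectorized slice T[i, :, :].
T1 = T.reshape(n, -1)
U, s, _ = np.linalg.svd(T1, full_matrices=True)
print("singular values:", np.round(s, 3))

# A left singular vector c with a tiny singular value satisfies
# sum_i c[i] * T[i, :, :] ~ 0 -- the kind of approximate linear relation
# that stage i) collects.
c = U[:, -1]
relation = np.einsum('i,ijk->jk', c, T)
print("norm of the approximate relation:", np.linalg.norm(relation))
```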
In this paper, we propose a new approach to solving low-rank tensor completion and robust tensor PCA. Our approach is based on several novel notions of (even-order) tensor rank, called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank. We discuss the connections between these new tensor ranks and the CP-rank and symmetric CP-rank of an even-order tensor, and we show that the M-rank provides a reliable and easily computable approximation to the CP-rank. As a result, we propose to replace the CP-rank with the M-rank in low-CP-rank tensor completion and robust tensor PCA. Numerical results suggest that our M-rank-based approach outperforms existing methods based on the low-n-rank, t-SVD, and KBR approaches for solving low-rank tensor completion and robust tensor PCA when the underlying tensor has low CP-rank.
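The precise definitions of the M-rank and its symmetric variants are given in the paper and are not reproduced here. The sketch below only illustrates the underlying observation that a square matricization of an even-order tensor with low CP-rank has low, and cheaply computable, matrix rank, which is what makes matricization-based surrogates for the CP-rank attractive.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 6, 4
# Fourth-order tensor of CP rank at most r: a sum of r rank-one terms.
factors = [rng.standard_normal((n, r)) for _ in range(4)]
T = np.einsum('ir,jr,kr,lr->ijkl', *factors)

# Square matricization grouping modes {1, 2} against modes {3, 4}.
M = T.reshape(n * n, n * n)

# Each rank-one term a⊗b⊗c⊗d matricizes to the rank-one matrix (a⊗b)(c⊗d)^T,
# so rank(M) <= CP-rank(T). The matrix rank is cheap to compute, whereas
# determining the CP-rank itself is NP-hard in general.
print("matrix rank of the square unfolding:", np.linalg.matrix_rank(M))  # 4 for generic factors
```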
We develop a systematic way to solve linear equations involving tensors of arbitrary rank. We start with the case of a rank-$3$ tensor, which appears in many applications, and, after finding the condition for a unique solution, we derive this solution. Subsequently, we generalize our result to tensors of arbitrary rank. Finally, we consider a generalized version of the rank-$3$ case and extend the result to the setting where tensor traces are also included.
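The abstract does not spell out the exact form of the equations, so the sketch below uses a hypothetical system $A_{ijk} x_k = b_{ij}$ with a rank-$3$ coefficient tensor purely for illustration; it takes the brute-force numerical route of flattening the free indices rather than the analytic solution derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
# Hypothetical linear equation A_{ijk} x_k = b_{ij}: a rank-3 (order-3)
# coefficient tensor A and an unknown vector x.
A = rng.standard_normal((n, n, n))
x_true = rng.standard_normal(n)
b = np.einsum('ijk,k->ij', A, x_true)

# Flattening the free index pair (i, j) turns this into an ordinary
# overdetermined n^2 x n linear system, solved here by least squares.
A_mat = A.reshape(n * n, n)
x, *_ = np.linalg.lstsq(A_mat, b.reshape(n * n), rcond=None)
print("recovery error:", np.linalg.norm(x - x_true))
```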
The tensor rank and the border rank of the $3 \times 3$ determinant tensor are known to be $5$ when the characteristic of the field is not two. In this paper, we show that the tensor rank remains $5$ for fields of characteristic two as well. We also include an analysis of the $5 \times 5$ and $7 \times 7$ determinant and permanent tensors, as well as the symmetric $3 \times 3$ permanent and determinant tensors. We end with some remarks on binary tensors.
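For a concrete picture of the object in question: writing $\det(A)$ as a trilinear form in the rows of a $3 \times 3$ matrix, the coefficient tensor is the Levi-Civita symbol $\varepsilon_{ijk}$. The snippet below constructs it and checks the trilinear expansion against a numerical determinant; it does not, of course, verify the rank-$5$ statement, which is the content of the paper.

```python
import numpy as np
from itertools import permutations

# The 3x3 determinant, as a trilinear form in the rows of A, has the
# Levi-Civita symbol as its coefficient tensor:
#   det(A) = sum_{i,j,k} eps[i, j, k] * A[0, i] * A[1, j] * A[2, k]
eps = np.zeros((3, 3, 3))
for perm in permutations(range(3)):
    # Parity via inversion count: +1 for even permutations, -1 for odd.
    inversions = sum(perm[a] > perm[b] for a in range(3) for b in range(a + 1, 3))
    eps[perm] = -1.0 if inversions % 2 else 1.0

A = np.random.default_rng(4).standard_normal((3, 3))
det_via_tensor = np.einsum('ijk,i,j,k->', eps, A[0], A[1], A[2])
print(det_via_tensor, np.linalg.det(A))  # the two values agree
```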
The components of a symmetric second-rank tensor (e.g. the strain tensor) can be calculated from measurements of that tensor property in at least six directions. A system of simultaneous equations, $t^{(m)} = \sum_{i=1}^{6} a_{mi} t_i$, must be solved. The coefficient matrix $a_{mi}$ is determined by the measurement directions, and it is therefore of great importance to choose these directions properly. Part 1 deals with the necessary condition that the coefficient matrix $a_{mi}$ be nonsingular; rules are given by which it is easy to decide whether this condition holds without explicitly computing the rank of $a_{mi}$. In part 2, methods are developed to find optimized sets of measurement directions: an optimized set is one for which the inevitable measurement errors produce the smallest possible errors in the calculated tensor components. Special attention is given to the case of X-ray strain measurement of single crystals.
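A minimal numerical version of this setup is sketched below, assuming the quantity measured along a unit direction $n$ is the normal component $n^{T} t\, n$ (as in strain analysis) and that the tensor components are ordered $(t_{11}, t_{22}, t_{33}, t_{12}, t_{13}, t_{23})$. The condition number of the $6 \times 6$ coefficient matrix is a simple proxy for how strongly measurement errors are amplified, which is what an optimized direction set seeks to minimize.

```python
import numpy as np

def coeff_row(n):
    """Row of the coefficient matrix a_mi for a unit direction n, assuming the
    measurement is the normal component n^T t n of a symmetric tensor t with
    components ordered as (t11, t22, t33, t12, t13, t23)."""
    n1, n2, n3 = n
    return [n1**2, n2**2, n3**2, 2 * n1 * n2, 2 * n1 * n3, 2 * n2 * n3]

rng = np.random.default_rng(5)
# Six randomly chosen unit directions (an optimized set would instead be
# chosen to keep the condition number of A small, the concern of part 2).
dirs = rng.standard_normal((6, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
A = np.array([coeff_row(n) for n in dirs])

# Ground-truth symmetric tensor and its six directional measurements.
t_true = rng.standard_normal(6)
m = A @ t_true

# The tensor components follow from the 6x6 linear system; the condition
# number of A indicates how strongly measurement noise is amplified.
t_hat = np.linalg.solve(A, m)
print("condition number:", np.linalg.cond(A))
print("recovery error:", np.linalg.norm(t_hat - t_true))
```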