Decomposition of a tensor into multilinear rank-$(1, L_r,L_r)$ terms.

2018 
Canonical Polyadic Decomposition (CPD) represents a third-order tensor as a minimal sum of rank-$1$ terms. Because of its uniqueness properties, the CPD has found many concrete applications in telecommunications, array processing, machine learning, etc. On the other hand, in several applications the rank-$1$ constraint on the terms is too restrictive. A multilinear rank-$(M,N,L)$ constraint (of which a rank-$1$ term is the special case $M=N=L=1$) can be more realistic, while still yielding a decomposition with attractive uniqueness properties. In this paper we focus on the decomposition of a tensor $\mathcal T$ into a sum of multilinear rank-$(1,L_r,L_r)$ terms, $r=1,\dots,R$. This decomposition type has already found applications in wireless communication, chemometrics and the blind separation of signals that can be modelled as exponential polynomials or rational functions. We derive conditions on the terms which guarantee that the decomposition is unique and can be computed by means of the eigenvalue decomposition of a matrix. We consider both the case where the decomposition is exact and the case where it holds only approximately. We show that in both cases the number of terms $R$ and their "sizes" $L_1,\dots,L_R$ do not have to be known a priori and can be estimated as well. The conditions for uniqueness are easy to verify, especially for terms that can be considered "generic". In particular, in the case $L_1=\dots=L_R=:L$, we show that the multilinear rank-$(1,L,L)$ decomposition of an $I\times J\times K$ tensor is generically unique if $R\leq \min((J-L)(K-L),I)$, which generalizes a well-known result on generic uniqueness of the CPD (i.e., the case $L=1$).
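To make the decomposition model concrete, the following NumPy sketch builds a tensor as a sum of multilinear rank-$(1,L,L)$ terms, each the outer product of a mode-1 vector $a_r$ with a rank-$L$ matrix $B_r C_r^T$, and checks the generic uniqueness bound $R\leq \min((J-L)(K-L),I)$ stated in the abstract. The dimensions chosen here are illustrative assumptions, not taken from the paper's experiments, and the code only constructs the model; it does not implement the paper's eigenvalue-based computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed, not from the paper).
I, J, K = 5, 8, 9
R, L = 3, 2

# Generic uniqueness bound from the abstract: R <= min((J-L)(K-L), I).
assert R <= min((J - L) * (K - L), I)

# Each term is a_r (outer) (B_r @ C_r.T): multilinear rank (1, L, L).
A = rng.standard_normal((I, R))
B = [rng.standard_normal((J, L)) for _ in range(R)]
C = [rng.standard_normal((K, L)) for _ in range(R)]

T = sum(np.einsum('i,jk->ijk', A[:, r], B[r] @ C[r].T) for r in range(R))

# Sanity checks: tensor shape and the rank-L structure of each term.
assert T.shape == (I, J, K)
for r in range(R):
    assert np.linalg.matrix_rank(B[r] @ C[r].T) == L
```

Setting $L=1$ reduces each term to a rank-$1$ outer product, recovering the CPD model as the special case mentioned in the abstract.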