Space Alternating Variational Estimation and Kronecker Structured Dictionary Learning

2019 
In this paper, we address the fundamental problem of Sparse Bayesian Learning (SBL) in the setting where the received signal is a high-order tensor. We furthermore consider the problem of dictionary learning (DL), where the tensor observations are assumed to be generated from a Kronecker structured (KS) dictionary matrix multiplied by the sparse coefficients. Exploiting the tensorial structure reduces the number of degrees of freedom in the learning problem, since the dimensions of each factor matrix are significantly smaller than those of the matricized dictionary obtained by vectorizing the observations. We propose a novel fast algorithm called space alternating variational estimation with dictionary learning (SAVED-KS), a version of variational Bayes (VB) SBL pushed to the scalar level. Just as SAGE (space-alternating generalized expectation maximization) improves on EM, the component-wise approach of SAVED-KS compared to standard SBL makes it less likely to get stuck in bad local optima, and its inherent damping (more cautious progression) typically also leads to faster convergence of the non-convex optimization process. Simulation results show that the proposed algorithm achieves a faster convergence rate and lower mean squared error (MSE) than the alternating least squares (ALS) based method for tensor decomposition.
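To make the degrees-of-freedom argument concrete, here is a minimal sketch (not the authors' code; all dimensions and variable names are assumed for illustration) of the KS observation model y = (A1 ⊗ A2 ⊗ A3) x + w and the parameter count of the factor matrices versus the full matricized dictionary:

```python
import numpy as np

# Illustrative sketch (assumed dimensions, not the paper's code):
# a 3rd-order Kronecker structured (KS) dictionary applied to a
# sparse coefficient vector.
rng = np.random.default_rng(0)

dims_obs = [8, 8, 8]      # observation size per tensor mode (assumed)
dims_coef = [16, 16, 16]  # dictionary atoms per mode (assumed)

# One factor dictionary per tensor mode
factors = [rng.standard_normal((m, n)) for m, n in zip(dims_obs, dims_coef)]

# Full matricized dictionary A = A1 (x) A2 (x) A3 via Kronecker products
A = factors[0]
for F in factors[1:]:
    A = np.kron(A, F)

# Sparse coefficient vector with 10 nonzero entries
x = np.zeros(A.shape[1])
support = rng.choice(A.shape[1], size=10, replace=False)
x[support] = rng.standard_normal(10)

# Vectorized noisy observation y = A x + w
y = A @ x + 0.01 * rng.standard_normal(A.shape[0])

# Degrees of freedom: sum of factor sizes vs. the full 512 x 4096 dictionary
ks_params = sum(F.size for F in factors)  # 3 * (8 * 16) = 384
full_params = A.size                      # 512 * 4096 = 2,097,152
print(f"KS factors: {ks_params} parameters; unstructured: {full_params}")
```

Learning the three small factor matrices thus involves orders of magnitude fewer parameters than learning an unstructured dictionary of the same overall size, which is the reduction the abstract refers to.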