An inexact alternating proximal gradient algorithm for nonnegative CP tensor decomposition

2021 
Nonnegative tensor decomposition has become increasingly important for multiway data analysis in recent years. The alternating proximal gradient (APG) method is a popular optimization approach for nonnegative tensor decomposition within the block coordinate descent framework. In this study, we propose an inexact version of the APG algorithm for nonnegative CANDECOMP/PARAFAC decomposition, in which each factor matrix is updated with only a finite number of inner iterations. We also propose a parameter warm-start method that avoids the frequent parameter resetting of conventional APG methods and improves convergence performance. Experiments show that limiting the number of inner iterations to around 10 to 20 accelerates convergence significantly without sacrificing the low relative error. We evaluate our method on both synthetic and real-world tensors. The results demonstrate that the proposed inexact APG algorithm outperforms existing popular algorithms in both convergence speed and computational precision.
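
The abstract describes the scheme only at a high level. Below is a minimal NumPy sketch of what an inexact alternating-proximal-gradient update for a 3-way nonnegative CP model can look like. The function names (inexact_apg_ncp, khatri_rao, unfold) and parameter choices (max_inner, the per-factor extrapolation parameters t) are illustrative assumptions, not the authors' implementation; the warm-start idea is approximated here by carrying the extrapolation parameter across outer sweeps instead of resetting it at every block update.

    # Sketch only: an inexact APG-style update for 3-way nonnegative CP
    # decomposition. Not the paper's reference code.
    import numpy as np

    def unfold(T, mode):
        # Mode-n unfolding of a 3-way tensor (NumPy C-order convention).
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def khatri_rao(U, V):
        # Column-wise Khatri-Rao product, consistent with unfold() above.
        return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

    def inexact_apg_ncp(X, rank, n_outer=100, max_inner=15, seed=0):
        rng = np.random.default_rng(seed)
        factors = [np.abs(rng.standard_normal((dim, rank))) for dim in X.shape]
        prev = [F.copy() for F in factors]
        # Warm-started extrapolation parameters, one per factor matrix,
        # carried across outer sweeps rather than reset on every block update
        # (an assumption standing in for the paper's warm-start strategy).
        t = [1.0, 1.0, 1.0]
        for _ in range(n_outer):
            for mode in range(3):
                others = [factors[m] for m in range(3) if m != mode]
                KR = khatri_rao(others[0], others[1])    # Khatri-Rao of the other factors
                gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
                MTTKRP = unfold(X, mode) @ KR
                L = np.linalg.norm(gram, 2)              # Lipschitz constant of the block gradient
                A, A_old = factors[mode], prev[mode]
                # Inexact block update: only a small, fixed number of inner
                # proximal-gradient steps (around 10-20 in the paper's experiments).
                for _ in range(max_inner):
                    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t[mode] ** 2))
                    Y = A + ((t[mode] - 1.0) / t_new) * (A - A_old)   # extrapolation
                    grad = Y @ gram - MTTKRP
                    A_old, A, t[mode] = A, np.maximum(0.0, Y - grad / L), t_new
                factors[mode], prev[mode] = A, A_old
        return factors

    # Example: recover a synthetic rank-5 nonnegative tensor.
    I, J, K, R = 30, 25, 20, 5
    rng = np.random.default_rng(1)
    true = [np.abs(rng.standard_normal(s)) for s in ((I, R), (J, R), (K, R))]
    X = np.einsum('ir,jr,kr->ijk', *true)
    A, B, C = inexact_apg_ncp(X, rank=R)
    X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    print('relative error:', np.linalg.norm(X - X_hat) / np.linalg.norm(X))

With max_inner set to 1 this reduces to a single proximal-gradient step per block, as in a standard APG sweep; larger values trade more work per block update for fewer outer sweeps, which is the trade-off the abstract reports being favorable at roughly 10 to 20 inner iterations.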