An Improved Non-negative Latent Factor Model via Momentum-Based Additive Gradient Descent Method

2021 
The nonnegative latent factor (NLF) model has a strong ability to acquire useful knowledge from symmetric, high-dimensional, sparse (SHiDS) matrices. However, the traditional NLF model based on the double factorization (DF) technique is somewhat conservative, since it does not adequately consider the different cases of NLFs. To address this issue, we propose an improved DF-SNLF (IDF-SNLF) model to extract NLF matrices in various scenarios. Meanwhile, the single NLF-dependent and multiplicative update (SNL-MU) learning method is commonly employed to build an NLF model on SHiDS matrices, yet it suffers from a fairly low convergence rate. To accelerate convergence, a momentum-based additive gradient descent (MAGD) method is adopted to train the model. Empirical studies on two SHiDS matrices demonstrate that the proposed IDF-SNLF model with MAGD achieves desirable performance in both prediction accuracy for missing data and convergence rate.
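To make the general idea concrete, the following is a minimal sketch of a momentum-based additive gradient step with a non-negativity projection, applied to a symmetric factorization R ≈ X·Xᵀ on observed entries only. The function name `magd_factorize`, the plain squared loss, and the hyperparameters (`lr`, `beta`) are illustrative assumptions, not the paper's exact IDF-SNLF/MAGD formulation.

```python
import numpy as np

def magd_factorize(R, mask, rank=2, lr=0.005, beta=0.7, epochs=1000, seed=0):
    """Illustrative sketch (not the paper's exact method): fit a non-negative
    factor matrix X so that X @ X.T approximates the symmetric matrix R on
    the observed entries indicated by the binary matrix `mask`."""
    rng = np.random.default_rng(seed)
    n = R.shape[0]
    X = rng.uniform(0.1, 0.5, size=(n, rank))  # non-negative initialization
    V = np.zeros_like(X)                       # momentum (velocity) accumulator
    for _ in range(epochs):
        E = mask * (X @ X.T - R)               # residual on observed entries only
        G = 2.0 * (E + E.T) @ X                # gradient of the squared loss w.r.t. X
        V = beta * V + lr * G                  # momentum accumulation
        X = np.maximum(X - V, 0.0)             # additive update + non-negativity projection
    return X
```

The key difference from multiplicative updates is the last two lines: the step is additive (momentum-smoothed gradient subtracted from X), with non-negativity restored by a simple projection rather than by the update rule's structure.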