Efficient Asynchronous Semi-Stochastic Block Coordinate Descent Methods for Large-Scale SVD

2021 
Eigenvector computation, such as singular value decomposition (SVD), is one of the most fundamental problems in machine learning, optimization, and numerical linear algebra. In recent years, many stochastic variance reduction algorithms and randomized coordinate descent algorithms have been developed to solve the leading eigenvalue problem efficiently. By taking full advantage of both techniques, this paper proposes a novel Semi-stochastic Block Coordinate Descent algorithm (SBCD-SVD) that is better suited than existing algorithms to large-scale leading eigenvalue problems of SVD and attains linear convergence. Unlike existing stochastic variance reduction and randomized coordinate descent methods, our algorithm inherits the advantages of both. Moreover, we propose a new Asynchronous parallel Semi-stochastic Block Coordinate Descent algorithm (ASBCD-SVD) and a new Asynchronous parallel Sparse approximated Variance Reduction algorithm (ASVR-SVD) for large-scale dense and sparse datasets, respectively. Finally, we prove that both the dense and sparse asynchronous parallel variants converge linearly. Extensive experimental results show that our algorithms attain high parallel speedup and achieve nearly the same accuracy in significantly less time, and thus they can be widely used in various practical applications.
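The paper's own algorithm details are not reproduced here, but the following minimal Python sketch illustrates the general idea the abstract describes: a variance-reduced (VR-PCA-style) stochastic update for the leading right singular vector, restricted to a random coordinate block at each step. The function name, hyperparameters, and step-size heuristic are illustrative assumptions, not the authors' SBCD-SVD specification.

```python
import numpy as np

def sbcd_svd(X, n_epochs=50, n_blocks=10, step=None, seed=0):
    """Illustrative semi-stochastic block coordinate descent sketch for the
    leading right singular vector of X (n samples x d features).
    Hyperparameters are placeholders and may need tuning per dataset."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, ord="fro")  # heuristic step size
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    blocks = np.array_split(rng.permutation(d), n_blocks)
    for _ in range(n_epochs):
        w_snap = w.copy()                        # snapshot for variance reduction
        full_grad = X.T @ (X @ w_snap) / n       # full gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)                  # random sample
            B = blocks[rng.integers(n_blocks)]   # random coordinate block
            xi = X[i]
            # variance-reduced stochastic direction, restricted to block B
            g = xi[B] * (xi @ w) - xi[B] * (xi @ w_snap) + full_grad[B]
            w[B] += step * g                     # ascent on the Rayleigh quotient
            w /= np.linalg.norm(w)               # project back to the unit sphere
    return w

# Usage sketch: compare against the leading right singular vector from NumPy.
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 50))
v = sbcd_svd(X)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
print(abs(v @ Vt[0]))  # close to 1.0 when the leading direction is recovered
```

Per-iteration cost here scales with the block size rather than the full dimension, which is the motivation the abstract gives for combining block coordinate descent with variance reduction; the asynchronous parallel variants (ASBCD-SVD, ASVR-SVD) run such updates concurrently across workers.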