On the convergence of a randomized block coordinate descent algorithm for a matrix least squares problem

2021 
Abstract Randomized block coordinate descent methods have proven efficient for solving large-scale optimization problems, and linear convergence to the unique solution is well established when the objective function is strongly convex. In this paper we propose a randomized block coordinate descent algorithm for the matrix least squares problem $\min_{X \in \mathbb{R}^{m \times n}} \|C - AXB\|_F$, where $A \in \mathbb{R}^{p \times m}$, $B \in \mathbb{R}^{n \times q}$, and $C \in \mathbb{R}^{p \times q}$. We prove that the proposed algorithm converges linearly to the unique minimum norm least squares solution, i.e., $A^{\dagger} C B^{\dagger}$, without the strong convexity assumption; instead, we only require that $B$ have full row rank. Numerical experiments illustrate the theoretical results.
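The sketch below is a minimal, hypothetical illustration of a randomized block coordinate descent for $\min_{X} \|C - AXB\|_F$, using columns of $X$ as blocks with exact minimum-norm block minimization and comparing the iterate against $A^{\dagger} C B^{\dagger}$. The block choice, update rule, dimensions, and random data are assumptions for illustration and are not claimed to be the paper's exact algorithm or experimental setup.

```python
# Hypothetical sketch only: randomized exact coordinate minimization over the
# columns of X for min_X ||C - A X B||_F, compared against the minimum norm
# least squares solution A^+ C B^+. Not claimed to be the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions; B is n x q with n <= q, so Gaussian B has full
# row rank with probability 1, matching the abstract's assumption.
p, m, n, q = 40, 8, 6, 50
A = rng.standard_normal((p, m))
B = rng.standard_normal((n, q))
C = rng.standard_normal((p, q))

A_pinv = np.linalg.pinv(A)                  # computed once, reused below
X = np.zeros((m, n))                        # start from the zero matrix
X_star = A_pinv @ C @ np.linalg.pinv(B)     # minimum norm solution A^+ C B^+

for _ in range(2000):
    j = rng.integers(n)                     # pick a random column (block) of X
    b = B[j, :]                             # matching row of B
    # Residual with column j's contribution removed: C - A X B + A x_j b^T
    R = C - A @ X @ B + np.outer(A @ X[:, j], b)
    # Exact block minimization (minimum norm): x_j = A^+ R b / ||b||^2
    X[:, j] = A_pinv @ R @ b / (b @ b)

print("||X - A^+ C B^+||_F :", np.linalg.norm(X - X_star))
print("||C - A X B||_F     :", np.linalg.norm(C - A @ X @ B))
```

Since each update keeps the columns of $X$ in the row space of $A$, the iterates stay in the subspace containing the minimum norm solution, which is why the distance to $A^{\dagger} C B^{\dagger}$ (rather than only the residual) shrinks in this toy run.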