Symmetry, Saddle Points, and Global Optimization Landscape of Nonconvex Matrix Factorization

2019 
We propose a general theory for studying the landscape of nonconvex optimization problems with underlying symmetric structures, for a class of machine learning problems (e.g., low-rank matrix factorization, phase retrieval, and deep linear neural networks). In particular, we characterize the locations of stationary points and the null space of the Hessian matrices of the objective function via the lens of invariant groups. As a major motivating example, we apply the proposed general theory to characterize the global landscape of the nonconvex optimization in the low-rank matrix factorization problem. We illustrate how the rotational symmetry group gives rise to infinitely many nonisolated strict saddle points and equivalent global minima of the objective function. By explicitly identifying all stationary points, we divide the entire parameter space into three regions: ($\mathcal{R}_1$) the region containing the neighborhoods of all strict saddle points, where the objective has negative curvature; ($\mathcal{R}_2$) the region containing the neighborhoods of all global minima, where the objective enjoys strong convexity along certain directions; and ($\mathcal{R}_3$) the complement of the above regions, where the gradient has sufficiently large magnitude. We further extend our results to the matrix sensing problem. This global landscape implies strong global convergence guarantees for popular iterative algorithms with arbitrary initial solutions.
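A minimal numerical sketch of two claims from the abstract, using the standard factorization objective $f(U) = \frac{1}{4}\|UU^\top - M\|_F^2$ with $M = U^* {U^*}^\top$ of rank $r$ (an assumed form consistent with this literature, not code from the paper; the names `f`, `n`, `r`, `U_star` are illustrative): (1) the objective is invariant under the rotational symmetry $U \mapsto UQ$ for orthogonal $Q$, so stationary points come in nonisolated orbits, and (2) the origin is a strict saddle, with negative curvature along the direction $U^*$.

```python
# Sketch, not the paper's code: rotational symmetry and a strict saddle
# for f(U) = 1/4 * ||U U^T - M||_F^2 with M = U* U*^T of rank r.
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 2
U_star = rng.standard_normal((n, r))
M = U_star @ U_star.T  # rank-r target matrix

def f(U):
    """Nonconvex low-rank factorization objective."""
    return 0.25 * np.linalg.norm(U @ U.T - M, "fro") ** 2

# (1) Rotational symmetry: f(U Q) = f(U) for any orthogonal Q, since
#     (UQ)(UQ)^T = U U^T. Hence global minima form a nonisolated orbit {U* Q}.
Q, _ = np.linalg.qr(rng.standard_normal((r, r)))  # random orthogonal Q
U = rng.standard_normal((n, r))
assert np.isclose(f(U), f(U @ Q))

# (2) Strict saddle at U = 0: along D = U*, g(t) = f(t * U*) equals
#     (1/4)(t^2 - 1)^2 ||M||_F^2, so g''(0) = -||M||_F^2 < 0, i.e. the
#     Hessian at the origin has a strictly negative eigenvalue.
t = 1e-4
curvature = (f(t * U_star) - 2 * f(0 * U_star) + f(-t * U_star)) / t**2
print("directional curvature at U = 0:", curvature)  # strictly negative
assert curvature < 0
```

The symmetry check in (1) is exactly the invariant-group structure the abstract refers to: because the orbit $\{U^*Q\}$ is a connected set of minimizers, the Hessian at any minimizer has a nontrivial null space along the orbit, which is why the saddle points and minima are nonisolated.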