A Subspace Method for Large Scale Eigenvalue Optimization

2015 
We consider minimizing or maximizing the $J$th largest eigenvalue of an analytic, Hermitian matrix-valued function, building on [Mengi et al. 2014]. This work addresses the setting where the matrix-valued function is very large. We describe a subspace approach that converts the original problem into a small-scale one by means of orthogonal projections onto, and restrictions to, certain subspaces, and that gradually expands these subspaces based on the optimal solutions of the small-scale problems. We present global convergence results, as well as rapid-convergence results with respect to the dimensions of the subspaces. The convergence analysis is carried out in an infinite-dimensional setting, where the matrix-valued function is replaced by a compact operator depending on parameters. In practice, it suffices to solve eigenvalue optimization problems involving matrices of size on the order of tens, instead of the original problems involving matrices of size on the order of thousands.
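The projection-and-expansion loop the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it assumes a hypothetical one-parameter affine Hermitian family $A(\omega) = A_0 + \omega A_1$ (so $J = 1$, minimizing the largest eigenvalue), and a crude grid search stands in for the small-scale optimization. All variable names and tolerances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # "large" dimension (stands in for sizes in the thousands)
B0 = rng.standard_normal((n, n)); A0 = (B0 + B0.T) / 2  # Hermitian A0
B1 = rng.standard_normal((n, n)); A1 = (B1 + B1.T) / 2  # Hermitian A1

def A(w):
    # Hypothetical analytic Hermitian family: affine in one parameter.
    return A0 + w * A1

grid = np.linspace(-1.0, 1.0, 101)  # crude stand-in for the parameter search

# Initialize the subspace with the top eigenvector of A at one parameter value.
_, vecs = np.linalg.eigh(A(grid[0]))
V = vecs[:, -1:].copy()

for _ in range(n):
    # Small-scale problem: minimize the largest eigenvalue of the
    # orthogonally projected family V^* A(w) V over the parameter grid.
    reduced = [np.linalg.eigvalsh(V.T @ A(w) @ V)[-1] for w in grid]
    w_star = grid[int(np.argmin(reduced))]
    # Expand the subspace with the top eigenvector of the full A(w_star).
    _, vecs = np.linalg.eigh(A(w_star))
    v_new = vecs[:, -1].copy()
    v_new -= V @ (V.T @ v_new)        # orthogonalize against current basis
    nrm = np.linalg.norm(v_new)
    if nrm < 1e-8:                    # nothing new to add: converged
        break
    V = np.hstack([V, (v_new / nrm).reshape(-1, 1)])

# At convergence the reduced minimum matches the full problem on the grid,
# even though V has far fewer columns than n.
reduced_final = np.array([np.linalg.eigvalsh(V.T @ A(w) @ V)[-1] for w in grid])
full = np.array([np.linalg.eigvalsh(A(w))[-1] for w in grid])
```

The key structural property this relies on is that the projected largest eigenvalue is a lower bound on the full one, with equality at every parameter value whose top eigenvector has been added to the subspace, so the reduced minima increase monotonically toward the true minimum.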