A Fast Randomized Algorithm for Orthogonal Projection

2011 
We describe an algorithm that, given any full-rank matrix $A$ having fewer rows than columns, can rapidly compute the orthogonal projection of any vector onto the null space of $A$, as well as the orthogonal projection onto the row space of $A$, provided that both $A$ and its adjoint $A^*$ can be applied rapidly to arbitrary vectors. As an intermediate step, the algorithm solves the overdetermined linear least-squares regression involving $A^*$ and may therefore be used for this purpose as well. In many circumstances, the technique can accelerate interior-point methods for convex optimization, including linear programming (see, for example, Chapter 11 of [S. J. Wright, Primal-Dual Interior-Point Methods, SIAM, Philadelphia, 1997]). The basis of the algorithm is an obvious but numerically unstable scheme (typically known as the method of normal equations); suitable use of a preconditioner yields numerical stability. We generate the preconditioner rapidly via a randomized procedure that succeeds with extremely high probability. We provide numerical examples demonstrating the superior accuracy of the randomized method over direct use of the normal equations.
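The abstract only outlines the approach, so the following is a minimal illustrative sketch of the general sketch-and-precondition idea it describes, not the paper's implementation. It assumes NumPy/SciPy, substitutes a dense Gaussian test matrix for the fast randomized transform the paper would use, and employs LSQR as the preconditioned iterative least-squares solver; the function name, oversampling factor, and tolerances are all illustrative assumptions.

```python
# Hedged sketch: given a full-rank m x n matrix A with m < n, project a vector b
# orthogonally onto the row space of A and onto the null space of A by solving the
# overdetermined least-squares problem  min_x || A^* x - b ||_2  with a randomly
# generated preconditioner, then forming  P_row b = A^* x  and  P_null b = b - A^* x.
import numpy as np
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def randomized_projections(A, b, oversampling=4, rng=None):
    """Return (P_row b, P_null b) for a full-rank A of shape (m, n), m < n."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    At = A.conj().T                       # A^*, shape (n, m): tall and thin

    # Sketch A^* with a random test matrix.  A structured random transform would
    # make this step fast; a dense Gaussian matrix is used here only for clarity.
    k = min(n, oversampling * m)
    S = rng.standard_normal((k, n)) / np.sqrt(k)
    Y = S @ At                            # k x m sketch of A^*

    # QR of the sketch yields an upper-triangular right preconditioner R.
    _, R = np.linalg.qr(Y, mode="reduced")

    # Solve  min_y || (A^* R^{-1}) y - b ||_2  iteratively, then recover x = R^{-1} y.
    def matvec(y):
        return At @ solve_triangular(R, y, lower=False)

    def rmatvec(z):
        return solve_triangular(R, A @ z, lower=False, trans="C")

    Aop = LinearOperator((n, m), matvec=matvec, rmatvec=rmatvec, dtype=A.dtype)
    y = lsqr(Aop, b, atol=1e-14, btol=1e-14)[0]
    x = solve_triangular(R, y, lower=False)

    p_row = At @ x                        # orthogonal projection onto the row space of A
    return p_row, b - p_row               # and onto the null space of A
```

The point of the right preconditioner is that the sketched R factor makes A^* R^{-1} nearly orthonormal with high probability, so the iterative solver converges quickly and the result avoids the loss of accuracy incurred by forming and solving the normal equations directly.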