
QR decomposition

In linear algebra, a QR decomposition, also known as a QR factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R. QR decomposition is often used to solve the linear least squares problem and is the basis for a particular eigenvalue algorithm, the QR algorithm.

Any real square matrix A may be decomposed as A = QR, where Q is an orthogonal matrix (its columns are orthogonal unit vectors, meaning QᵀQ = QQᵀ = I) and R is an upper triangular matrix (also called a right triangular matrix). If A is invertible, then the factorization is unique if we require the diagonal elements of R to be positive. If instead A is a complex square matrix, then there is a decomposition A = QR where Q is a unitary matrix (so Q*Q = QQ* = I).

If A has n linearly independent columns, then the first n columns of Q form an orthonormal basis for the column space of A. More generally, the first k columns of Q form an orthonormal basis for the span of the first k columns of A for any 1 ≤ k ≤ n. The fact that any column k of A depends only on the first k columns of Q is responsible for the triangular form of R.

More generally, we can factor a complex m×n matrix A, with m ≥ n, as the product of an m×m unitary matrix Q and an m×n upper triangular matrix R. As the bottom (m − n) rows of an m×n upper triangular matrix consist entirely of zeros, it is often useful to partition R, or both R and Q:

A = QR = Q [R1; 0] = [Q1 Q2] [R1; 0] = Q1R1,

where R1 is an n×n upper triangular matrix, 0 is the (m − n)×n zero matrix stacked below R1, Q1 is m×n, Q2 is m×(m − n), and Q1 and Q2 both have orthogonal columns. Golub & Van Loan (1996, §5.2) call Q1R1 the thin QR factorization of A; Trefethen and Bau call this the reduced QR factorization. If A is of full rank n and we require that the diagonal elements of R1 be positive, then R1 and Q1 are unique, but in general Q2 is not. R1 is then equal to the upper triangular factor of the Cholesky decomposition of A*A (= AᵀA if A is real).

Analogously, we can define QL, RQ, and LQ decompositions, with L being a lower triangular matrix.
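
To make the factorizations above concrete, the following is a minimal NumPy sketch; the random 5×3 matrix, the sign-fixing step, and the particular routines used (numpy.linalg.qr, numpy.linalg.cholesky, numpy.linalg.lstsq) are illustrative choices, not prescribed by the text above. It computes both the full and the thin (reduced) factorization, normalizes R1 to have a positive diagonal, checks that R1 then matches the upper triangular Cholesky factor of AᵀA, and uses the thin factors to solve a least squares problem.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 3))  # m x n with m >= n; full column rank (almost surely)

    # Full factorization: Q is m x m orthogonal, R is m x n upper triangular.
    Q, R = np.linalg.qr(A, mode="complete")
    assert np.allclose(Q @ R, A)
    assert np.allclose(Q.T @ Q, np.eye(5))

    # Thin / reduced factorization: Q1 is m x n with orthonormal columns, R1 is n x n.
    Q1, R1 = np.linalg.qr(A, mode="reduced")
    assert np.allclose(Q1 @ R1, A)

    # Fix the sign convention so that R1 has a positive diagonal; the thin factors are then unique.
    signs = np.sign(np.diag(R1))
    Q1, R1 = Q1 * signs, signs[:, None] * R1

    # R1 now equals the upper triangular Cholesky factor of A^T A.
    L = np.linalg.cholesky(A.T @ A)          # lower triangular, A^T A = L L^T
    assert np.allclose(R1, L.T)

    # Least squares: solve A x ~ b via the triangular system R1 x = Q1^T b.
    b = rng.standard_normal(5)
    x = np.linalg.solve(R1, Q1.T @ b)
    assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])

The last step illustrates why QR is the standard route to linear least squares: once A = Q1R1 is available, the normal equations never need to be formed, and only a small triangular solve remains.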

[ "Matrix (mathematics)", "QR algorithm", "Givens rotation", "Householder transformation", "RRQR factorization" ]