Polar decomposition

In mathematics, particularly in linear algebra and functional analysis, the polar decomposition of a matrix or linear operator is a factorization analogous to the polar form of a nonzero complex number z as z = re^{iθ}, where r is the absolute value of z (a positive real number) and e^{iθ} is an element of the circle group.

An analogous decomposition exists in the plane of split-complex numbers, where j² = +1 and the arithmetic of split-complex numbers is used. There the branch of the unit hyperbola through (1, 0) is traced by e^{aj} = cosh a + j sinh a, while the branch through (−1, 0) is traced by −e^{aj}. Since the operation of multiplying by j reflects a point across the line y = x, the second hyperbola has branches traced by je^{aj} or −je^{aj}. Therefore a point in one of the quadrants determined by the asymptotes has a polar decomposition in one of the forms ρe^{aj}, −ρe^{aj}, ρje^{aj}, or −ρje^{aj}, with ρ > 0.

The polar decomposition of a square complex matrix A is a matrix decomposition of the form

A = UP,

where U is a unitary matrix and P is a positive-semidefinite Hermitian matrix. Intuitively, the polar decomposition separates A into a component that stretches the space along a set of orthogonal axes, represented by P, and a rotation (with possible reflection), represented by U. The decomposition of the complex conjugate of A is given by conjugating each factor: A̅ = U̅P̅.

This decomposition always exists, and so long as A is invertible, it is unique, with P positive-definite. Note that

det A = det U · det P = e^{iθ} r

gives the corresponding polar decomposition of the determinant of A, since det P = r = |det A| and det U = e^{iθ}. In particular, if A has determinant 1, then both U and P have determinant 1.

The matrix P is always unique, even if A is singular, and is given by

P = (A*A)^{1/2},

where A* denotes the conjugate transpose of A. This expression is well-defined, since A*A is a positive-semidefinite Hermitian matrix and therefore has a unique positive-semidefinite Hermitian square root; indeed, A*A = (UP)*(UP) = PU*UP = P². If A is invertible, then the matrix U is uniquely determined by

U = AP^{−1}.

Moreover, if A is invertible, then P is strictly positive-definite, and thus has a unique self-adjoint logarithm X = log P. Every invertible matrix A can therefore be written uniquely in the form A = Ue^{X}, with U unitary and X self-adjoint.
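As a concrete illustration of the formulas above, here is a minimal numerical sketch (not part of the original article) that computes a polar decomposition with NumPy and SciPy; the example matrix A and the use of scipy.linalg.sqrtm and scipy.linalg.polar are illustrative assumptions.

```python
# Minimal sketch: compute the polar decomposition A = U P numerically.
# The example matrix and the library choices are illustrative assumptions.
import numpy as np
from scipy.linalg import sqrtm, polar

A = np.array([[1.0 + 2.0j, 0.5],
              [0.0,        3.0 - 1.0j]])   # invertible, so U and P are unique

# P is the unique positive-semidefinite Hermitian square root of A*A.
P = sqrtm(A.conj().T @ A)

# For invertible A, the unitary factor is U = A P^{-1}.
U = A @ np.linalg.inv(P)

assert np.allclose(A, U @ P)                                      # A = U P
assert np.allclose(U.conj().T @ U, np.eye(2))                     # U is unitary
assert np.isclose(np.linalg.det(P).real, abs(np.linalg.det(A)))   # det P = |det A|

# SciPy also provides a direct routine; polar(A) returns the same (U, P) pair.
U2, P2 = polar(A)
assert np.allclose(U, U2) and np.allclose(P, P2)
```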

[ "Matrix (mathematics)" ]
Parent Topic
Child Topic
    No Parent Topic