
Change of basis

In linear algebra, a basis for a vector space of dimension n is a set of n vectors (α1, …, αn), called basis vectors, with the property that every vector in the space can be expressed as a unique linear combination of the basis vectors. The matrix representations of operators are also determined by the chosen basis. Since it is often desirable to work with more than one basis for a vector space, it is of fundamental importance in linear algebra to be able to easily transform coordinate-wise representations of vectors and operators taken with respect to one basis to their equivalent representations with respect to another basis. Such a transformation is called a change of basis.

Although the terminology of vector spaces is used below and the symbol R can be taken to mean the field of real numbers, the results discussed hold whenever R is a commutative ring and "vector space" is everywhere replaced with "free R-module". The standard basis for R^n is the ordered sequence E_n = {e_1, …, e_n}, where e_j is the element of R^n with 1 in the j-th place and 0s elsewhere.
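The coordinate transformation described above can be sketched numerically. This is a minimal illustration, assuming NumPy and an arbitrarily chosen basis for R^2: the new basis vectors form the columns of a matrix P, and the coordinates of a vector v with respect to that basis are obtained by solving P c = v.

```python
import numpy as np

# A basis for R^2 other than the standard one: the columns of P.
# (Illustrative choice: a1 = (1, 0), a2 = (1, 1).)
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A vector given in standard coordinates.
v = np.array([3.0, 2.0])

# Its coordinates c with respect to the new basis satisfy P @ c = v,
# i.e. v = c[0]*a1 + c[1]*a2.
c = np.linalg.solve(P, v)
print(c)      # [1. 2.]

# Multiplying by P changes basis back to standard coordinates.
print(P @ c)  # [3. 2.]
```

The uniqueness of the linear combination corresponds to P being invertible: a set of n vectors is a basis for R^n exactly when the matrix with those vectors as columns is invertible.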
For example, the standard basis for R^2 would be

  e_1 = (1, 0),  e_2 = (0, 1).

If T : R^n → R^m is a linear transformation, the m × n matrix associated with T is the matrix M_T whose j-th column is T(e_j) ∈ R^m, for j = 1, …, n; that is,

  M_T = [ T(e_1) | T(e_2) | ⋯ | T(e_n) ].

In this case we have T(x) = M_T · x for all x ∈ R^n, where we regard x as a column vector and the multiplication on the right side is matrix multiplication. It is a basic fact in linear algebra that the vector space Hom(R^n, R^m) of all linear transformations from R^n to R^m is naturally isomorphic to the space R^(m×n) of m × n matrices over R; that is, a linear transformation T : R^n → R^m is for all intents and purposes equivalent to its matrix M_T.
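The column-by-column construction of M_T can be carried out directly. In this sketch (assuming NumPy, with an illustrative transformation T : R^3 → R^2 chosen for the example), the j-th column of M_T is computed as T(e_j), and the identity T(x) = M_T · x is then checked on a sample vector.

```python
import numpy as np

# An illustrative linear transformation T : R^3 -> R^2.
def T(x):
    return np.array([x[0] + 2 * x[1], 3 * x[2]])

n = 3
E = np.eye(n)  # the columns of the identity matrix are e_1, ..., e_n

# Build M_T column by column: the j-th column is T(e_j).
M_T = np.column_stack([T(E[:, j]) for j in range(n)])
print(M_T)
# [[1. 2. 0.]
#  [0. 0. 3.]]

# The matrix reproduces the transformation: T(x) = M_T @ x for all x.
x = np.array([1.0, -1.0, 2.0])
print(T(x), M_T @ x)  # [-1.  6.] [-1.  6.]
```

This is exactly the isomorphism Hom(R^n, R^m) ≅ R^(m×n) made concrete: the map T ↦ M_T is linear, invertible, and turns composition of transformations into matrix multiplication.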
