Uncorrelated

In probability theory and statistics, two real-valued random variables, $X$ and $Y$, are said to be uncorrelated if their covariance, $\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$, is zero. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero, except in the trivial case when either variable has zero variance (is a constant); in that case the correlation is undefined.

In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In that case the covariance is the expectation of the product, and $X$ and $Y$ are uncorrelated if and only if $\operatorname{E}[XY] = 0$.

If $X$ and $Y$ are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.

Two real random variables $X, Y$ are called uncorrelated if their covariance $\operatorname{cov}[X,Y] = \operatorname{E}[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])]$ is zero. Formally:

$$X, Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]$$

Two complex random variables $Z, W$ are called uncorrelated if both their covariance $\operatorname{K}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}]$ and their pseudo-covariance $\operatorname{J}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])]$ are zero, i.e.

$$Z, W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \ \text{ and } \ \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]$$

A set of two or more random variables $X_1, \ldots, X_n$ is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the off-diagonal elements of the autocovariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{X}}$ of the random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\mathrm{T}}$ are all zero. The autocovariance matrix is defined as

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}\mathbf{X}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm{T}}$$

A standard counterexample constructs two random variables $U$ and $X$ that have zero covariance (and are thus uncorrelated) but are not independent; see the numerical sketch below.
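As a minimal illustration of that counterexample, the following sketch (in Python with NumPy, which is an assumption since the article gives no code) uses one standard construction of this kind: a symmetric random variable $X$ and $U = X^2$. This is illustrative only and not necessarily the exact pair the claim above refers to.

import numpy as np

rng = np.random.default_rng(0)

# X is symmetric about 0, so E[X] = 0 and E[X^3] = 0.
x = rng.standard_normal(1_000_000)
u = x ** 2            # U is a deterministic function of X, hence dependent on X

# cov[U, X] = E[X^3] - E[X^2] * E[X] = 0 for a symmetric X.
sample_cov = np.cov(u, x)[0, 1]
print("sample cov(U, X):", round(sample_cov, 4))      # close to 0

# Dependence is still obvious: knowing X fixes U exactly, and the
# conditional mean of U differs sharply between |X| > 1 and |X| <= 1.
print(np.mean(u[np.abs(x) > 1]), np.mean(u[np.abs(x) <= 1]))

The sample covariance is near zero, yet $U$ is completely determined by $X$, so the two variables are far from independent.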

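For the multivariate definition above, a short sketch (again Python/NumPy, as an assumed environment) estimates the autocovariance matrix of a random vector with independent, hence pairwise uncorrelated, components and checks that its off-diagonal entries are close to zero. The particular component distributions are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent components are pairwise uncorrelated, so the estimated
# autocovariance matrix K_XX should be close to diagonal.
X = np.column_stack([
    rng.standard_normal(n),       # X1 ~ N(0, 1)
    rng.uniform(-1.0, 1.0, n),    # X2 ~ Uniform(-1, 1)
    rng.exponential(2.0, n),      # X3 ~ Exponential(mean 2)
])

# K_XX = E[(X - E[X])(X - E[X])^T], estimated from the sample.
Xc = X - X.mean(axis=0)
K = (Xc.T @ Xc) / (n - 1)
print(np.round(K, 3))             # off-diagonal entries are near 0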
[ "Algorithm", "Statistics" ]
Parent Topic
Child Topic
    No Parent Topic