Orthogonality principle

In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator (in a mean square error sense) is orthogonal to any possible estimator. The orthogonality principle is most commonly stated for linear estimators, but more general formulations are possible. Since the principle is a necessary and sufficient condition for optimality, it can be used to find the minimum mean square error estimator.

The orthogonality principle is most commonly used in the setting of linear estimation. In this context, let x be an unknown random vector which is to be estimated based on the observation vector y. One wishes to construct a linear estimator $\hat{x} = Hy + c$ for some matrix H and vector c. Then, the orthogonality principle states that the estimator $\hat{x}$ achieves minimum mean square error if and only if

$E\{(\hat{x} - x)\,y^T\} = 0$, and
$E\{\hat{x} - x\} = 0$.

If x and y have zero mean, then it suffices to require the first condition.

Suppose x is a Gaussian random variable with mean m and variance $\sigma_x^2$. Also suppose we observe a value $y = x + w$, where w is Gaussian noise which is independent of x and has mean 0 and variance $\sigma_w^2$. We wish to find a linear estimator $\hat{x} = hy + c$ minimizing the MSE. Substituting the expression $\hat{x} = hy + c$ into the two requirements of the orthogonality principle, we obtain

$0 = E\{(\hat{x} - x)\,y\} = E\{(hy + c - x)(x + w)\} = h(\sigma_x^2 + \sigma_w^2 + m^2) + cm - (\sigma_x^2 + m^2)$, and
$0 = E\{\hat{x} - x\} = E\{hy + c - x\} = hm + c - m$.

Solving these two linear equations gives $h = \sigma_x^2 / (\sigma_x^2 + \sigma_w^2)$ and $c = \sigma_w^2 m / (\sigma_x^2 + \sigma_w^2)$, so the minimum mean square error linear estimator is $\hat{x} = (\sigma_x^2 y + \sigma_w^2 m) / (\sigma_x^2 + \sigma_w^2)$.
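To make the example concrete, the following is a minimal sketch that checks the two orthogonality conditions and the optimality of the derived coefficients by Monte Carlo simulation. It assumes NumPy is available; the values chosen for m, σ_x and σ_w are arbitrary illustrative choices, not taken from the text.

```python
# Numerical check of the scalar Gaussian example (illustrative values of
# m, sigma_x, sigma_w chosen arbitrarily; requires NumPy).
import numpy as np

rng = np.random.default_rng(0)
m, sigma_x, sigma_w = 2.0, 1.5, 0.8          # prior mean/std of x, noise std
n = 1_000_000                                 # Monte Carlo sample size

x = rng.normal(m, sigma_x, n)                 # unknown quantity
w = rng.normal(0.0, sigma_w, n)               # independent noise
y = x + w                                     # observation

# Coefficients from the orthogonality principle:
#   h = sigma_x^2 / (sigma_x^2 + sigma_w^2),  c = (1 - h) * m
h = sigma_x**2 / (sigma_x**2 + sigma_w**2)
c = (1.0 - h) * m
x_hat = h * y + c

err = x_hat - x
print("E[(x_hat - x) y] ~", np.mean(err * y))   # ~0: error orthogonal to the data
print("E[x_hat - x]     ~", np.mean(err))       # ~0: estimator is unbiased
print("MSE of optimal estimator:", np.mean(err**2))

# Any other linear estimator h'y + c' should do no better than (h, c).
for dh, dc in [(0.1, 0.0), (0.0, 0.2), (-0.05, 0.1)]:
    alt_err = (h + dh) * y + (c + dc) - x
    print(f"MSE with (h{dh:+}, c{dc:+}):", np.mean(alt_err**2))
```

Perturbing h or c away from the values given by the orthogonality conditions should only increase the empirical MSE, which is what the necessary-and-sufficient character of the principle predicts.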

[ "Efficient estimator", "Minimax estimator", "Consistent estimator", "Bias of an estimator" ]