
Stein's unbiased risk estimate

In statistics, Stein's unbiased risk estimate (SURE) is an unbiased estimator of the mean-squared error of "a nearly arbitrary, nonlinear biased estimator." In other words, it provides an indication of the accuracy of a given estimator. This is important since the true mean-squared error of an estimator is a function of the unknown parameter to be estimated, and thus cannot be determined exactly. The technique is named after its discoverer, Charles Stein.

Let $\mu \in \mathbb{R}^d$ be an unknown parameter and let $x \in \mathbb{R}^d$ be a measurement vector whose components are independent and normally distributed with mean $\mu_i$ and variance $\sigma^2$. Suppose $h(x)$ is an estimator of $\mu$ from $x$, and can be written $h(x) = x + g(x)$, where $g$ is weakly differentiable. Then Stein's unbiased risk estimate is given by

$$\operatorname{SURE}(h) = d\sigma^2 + \|g(x)\|^2 + 2\sigma^2 \sum_{i=1}^d \frac{\partial}{\partial x_i} g_i(x),$$

where $g_i(x)$ is the $i$th component of the function $g(x)$, and $\|\cdot\|$ is the Euclidean norm.

The importance of SURE is that it is an unbiased estimate of the mean-squared error (or squared error risk) of $h(x)$, i.e.

$$\operatorname{E}_\mu\{\operatorname{SURE}(h)\} = \operatorname{MSE}(h) = \operatorname{E}_\mu \|h(x) - \mu\|^2,$$

so SURE can be computed from the data alone even though the MSE itself depends on the unknown $\mu$.
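As an illustration (not part of the original text), the following minimal NumPy sketch evaluates SURE for a soft-thresholding estimator, a standard example of a weakly differentiable $h(x) = x + g(x)$, and checks by Monte Carlo that its average matches the empirical MSE. The dimension, threshold level, and sparse mean vector are arbitrary choices made only for the demonstration.

import numpy as np

rng = np.random.default_rng(0)

d = 200          # dimension of the parameter
sigma = 1.0      # known noise standard deviation
t = 1.5          # soft-threshold level (illustrative choice)
mu = np.concatenate([rng.normal(0.0, 3.0, size=20), np.zeros(d - 20)])  # sparse true mean

def sure_soft_threshold(x, t, sigma):
    """SURE for the soft-thresholding estimator h(x)_i = sign(x_i) * max(|x_i| - t, 0).

    Here g(x) = h(x) - x, so g_i(x) = -sign(x_i) * min(|x_i|, t), and
    d g_i / d x_i = -1 when |x_i| < t and 0 otherwise (almost everywhere).
    """
    d = x.size
    g = -np.sign(x) * np.minimum(np.abs(x), t)
    divergence = -np.sum(np.abs(x) < t)        # sum_i d g_i / d x_i
    return d * sigma**2 + np.sum(g**2) + 2 * sigma**2 * divergence

# Monte Carlo check that E[SURE(h)] equals the true MSE of h.
n_trials = 20000
sure_vals = np.empty(n_trials)
sq_errors = np.empty(n_trials)
for k in range(n_trials):
    x = mu + sigma * rng.standard_normal(d)
    h = np.sign(x) * np.maximum(np.abs(x) - t, 0.0)   # soft-thresholded estimate
    sure_vals[k] = sure_soft_threshold(x, t, sigma)
    sq_errors[k] = np.sum((h - mu)**2)

print("average SURE :", sure_vals.mean())
print("empirical MSE:", sq_errors.mean())

Because SURE depends only on the observed data (and the known $\sigma^2$), it can be minimized over a tuning parameter such as the threshold $t$ as a data-driven surrogate for minimizing the unavailable MSE.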

[ "Efficient estimator", "Minimax estimator", "Consistent estimator", "Bias of an estimator" ]
Parent Topic
Child Topic
    No Parent Topic