
Method of moments (statistics)

In statistics, the method of moments is a method of estimation of population parameters. It starts by expressing the population moments (i.e., the expected values of powers of the random variable under consideration) as functions of the parameters of interest. Those expressions are then set equal to the sample moments. The number of such equations is the same as the number of parameters to be estimated. Those equations are then solved for the parameters of interest; the solutions are estimates of those parameters. The method of moments was introduced by Pafnuty Chebyshev in 1887.

Suppose that the problem is to estimate $k$ unknown parameters $\theta_1, \theta_2, \dots, \theta_k$ characterizing the distribution $f_W(w; \theta)$ of the random variable $W$. Suppose the first $k$ moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$s:

$$\mu_j \equiv \mathbb{E}[W^j] = g_j(\theta_1, \theta_2, \dots, \theta_k), \qquad j = 1, \dots, k.$$

Suppose a sample of size $n$ is drawn, resulting in the values $w_1, \dots, w_n$. For $j = 1, \dots, k$, let

$$\widehat{\mu}_j = \frac{1}{n} \sum_{i=1}^{n} w_i^{\,j}$$

be the $j$-th sample moment, an estimate of $\mu_j$. The method of moments estimator for $\theta_1, \theta_2, \dots, \theta_k$, denoted by $\widehat{\theta}_1, \widehat{\theta}_2, \dots, \widehat{\theta}_k$, is defined as the solution (if there is one) to the equations:

$$\widehat{\mu}_j = g_j(\widehat{\theta}_1, \widehat{\theta}_2, \dots, \widehat{\theta}_k), \qquad j = 1, \dots, k.$$

The method of moments is fairly simple and yields consistent estimators (under very weak assumptions), though these estimators are often biased.
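As an illustrative sketch (not part of the original article), the procedure above can be carried out for a gamma distribution with shape $\alpha$ and scale $\beta$, for which $\mathbb{E}[W] = \alpha\beta$ and $\operatorname{Var}(W) = \alpha\beta^2$; matching the first two sample moments and solving gives $\widehat{\alpha} = \bar{w}^2 / s^2$ and $\widehat{\beta} = s^2 / \bar{w}$. The true parameter values and sample size below are arbitrary choices for the demonstration:

```python
import random

random.seed(0)

# True parameters of a gamma distribution (chosen for this demo).
shape_true, scale_true = 3.0, 2.0
n = 100_000
sample = [random.gammavariate(shape_true, scale_true) for _ in range(n)]

# First two sample moments.
m1 = sum(sample) / n                    # estimates E[W] = shape * scale
m2 = sum(w * w for w in sample) / n     # estimates E[W^2]
var = m2 - m1 ** 2                      # estimates Var(W) = shape * scale^2

# Solve the two moment equations for the two parameters.
shape_hat = m1 ** 2 / var
scale_hat = var / m1

print(f"shape: true={shape_true}, estimate={shape_hat:.3f}")
print(f"scale: true={scale_true}, estimate={scale_hat:.3f}")
```

With this many draws the moment estimates land close to the true values, though (as the article notes) they are consistent rather than unbiased, so a small systematic error remains at any finite sample size.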
In some respects, when estimating parameters of a known family of probability distributions, this method was superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have higher probability of being close to the quantities to be estimated and are more often unbiased.
