Least Squares Estimation for the Gauss–Markov Model

2020 
Now that we have determined which functions cᵀβ in a linear model {y, Xβ} can be estimated unbiasedly, we can consider how we might actually estimate them. This chapter presents the estimation method known as least squares. Least squares estimation involves finding a value of β that minimizes the distance between y and Xβ, as measured by the squared length of the vector y − Xβ. Although this seems like it should lead to reasonable estimators of the elements of Xβ, it is not obvious that it will lead to estimators of all estimable functions cᵀβ that are optimal in any sense. It is shown in this chapter that the least squares estimator of any estimable function cᵀβ associated with the model {y, Xβ} is linear and unbiased under that model, and that it is, in fact, the “best” (minimum variance) linear unbiased estimator of cᵀβ under the Gauss–Markov model {y, Xβ, σ²I}. It is also shown that if the mean structure of the model is reparameterized, the least squares estimators of estimable functions are materially unaffected.
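As a rough illustration of the ideas summarized above, the following sketch (Python with NumPy; the one-way layout, design matrix, and variable names are illustrative assumptions, not taken from the chapter) solves the normal equations for a rank-deficient model via a generalized inverse, checks that a chosen c lies in the row space of X (i.e., that cᵀβ is estimable), and evaluates its least squares estimate.

```python
# A minimal sketch, assuming a small one-way layout with two groups;
# the data and function names here are illustrative, not from the chapter.
import numpy as np

rng = np.random.default_rng(0)

# Design matrix with an intercept and two group indicators: three columns
# but rank 2, so individual elements of beta are not estimable.
X = np.array([
    [1, 1, 0],
    [1, 1, 0],
    [1, 0, 1],
    [1, 0, 1],
], dtype=float)
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + 0.1 * rng.standard_normal(4)

# Any solution b of the normal equations X'X b = X'y yields the same fitted
# values X b; here a Moore-Penrose pseudoinverse supplies one such solution.
b = np.linalg.pinv(X.T @ X) @ (X.T @ y)

# c'beta is estimable iff c lies in the row space of X; projecting c onto
# that row space and checking the residual tests this condition.
c = np.array([1.0, 1.0, 0.0])  # the mean of group 1, which is estimable
proj_residual = c - X.T @ np.linalg.pinv(X.T) @ c
print("estimable:", np.allclose(proj_residual, 0))
print("least squares estimate of c'beta:", c @ b)
```

Because cᵀβ is estimable, the value of cᵀb does not depend on which solution of the normal equations is used; a different generalized inverse would give a different b but the same estimate cᵀb.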