
Polynomial least squares

In mathematical statistics, polynomial least squares comprises a broad range of statistical methods for estimating an underlying polynomial that describes observations. These methods include polynomial regression, curve fitting, linear regression, least squares, ordinary least squares, simple linear regression, linear least squares, approximation theory, and the method of moments. Polynomial least squares has applications in radar trackers, estimation theory, signal processing, statistics, and econometrics.

Two common applications of polynomial least squares methods are generating a low-degree polynomial that approximates a complicated function and estimating an assumed underlying polynomial from corrupted (also known as "noisy") observations. The former is commonly used in statistics and econometrics to fit a scatter plot with a first-degree polynomial (that is, a linear expression). The latter is commonly used in target tracking in the form of Kalman filtering, which is effectively a recursive implementation of polynomial least squares. Estimating an assumed underlying deterministic polynomial can be used in econometrics as well. In effect, both applications produce average curves as generalizations of the common average of a set of numbers, which is equivalent to zero-degree polynomial least squares.
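As a concrete illustration of the second application, the following is a minimal sketch (not from the source) of fitting a polynomial to noisy observations by ordinary least squares in Python with NumPy; the underlying line y = 2 + 0.5t, the noise level, and the helper name polynomial_least_squares are illustrative assumptions. With degree 0 the fit reduces to the ordinary average of the observations, matching the remark above.

import numpy as np

def polynomial_least_squares(t, y, degree):
    """Return coefficients [a_0, ..., a_d] minimizing
    sum_i (y_i - a_0 - a_1*t_i - ... - a_d*t_i**d)**2."""
    # Vandermonde design matrix: column j holds t**j.
    A = np.vander(t, N=degree + 1, increasing=True)
    # Least-squares solution of A @ coeffs ~= y.
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Noisy samples of an assumed underlying line y = 2 + 0.5*t.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * t + rng.normal(scale=0.3, size=t.size)

line_fit = polynomial_least_squares(t, y, degree=1)  # roughly [2, 0.5]
mean_fit = polynomial_least_squares(t, y, degree=0)  # equals [np.mean(y)]
print(line_fit, mean_fit, np.mean(y))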

[ "Polynomial", "Least squares" ]
Parent Topic
Child Topic
    No Parent Topic