
Kernel regression

Kernel regression is a non-parametric technique in statistics for estimating the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables X and Y.

In any nonparametric regression, the conditional expectation of a variable $Y$ relative to a variable $X$ may be written

$$\operatorname{E}(Y \mid X) = m(X),$$

where $m$ is an unknown function. Nadaraya and Watson, both in 1964, proposed to estimate $m$ as a locally weighted average, using a kernel as a weighting function. The Nadaraya–Watson estimator is

$$\widehat{m}_h(x) = \frac{\sum_{i=1}^{n} K_h(x - x_i)\, y_i}{\sum_{j=1}^{n} K_h(x - x_j)},$$

where $K_h$ is a kernel with a bandwidth $h$. The denominator normalizes the kernel weights so that they sum to 1.

The estimator follows from the definition of the conditional expectation,

$$\operatorname{E}(Y \mid X = x) = \int y\, f(y \mid x)\, dy = \int y\, \frac{f(x, y)}{f(x)}\, dy.$$

Substituting kernel density estimates (with a kernel $K$) for the joint density $f(x, y)$ and for the marginal density $f(x)$, and integrating out $y$, yields the Nadaraya–Watson estimator given above.
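To make the formula concrete, below is a minimal sketch of the Nadaraya–Watson estimator with a Gaussian kernel, using only NumPy. The function name nadaraya_watson, the synthetic sine data, and the bandwidth value are illustrative assumptions and not part of the original text.

import numpy as np

def gaussian_kernel(u):
    # Standard Gaussian kernel K(u) = exp(-u^2 / 2) / sqrt(2 * pi).
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def nadaraya_watson(x_query, x_train, y_train, h):
    # Nadaraya-Watson estimate m_h(x) at each point of x_query:
    #   m_h(x) = sum_i K_h(x - x_i) y_i / sum_j K_h(x - x_j),
    # with K_h(u) = K(u / h) / h for bandwidth h.
    x_query = np.asarray(x_query, dtype=float).reshape(-1, 1)   # shape (m, 1)
    x_train = np.asarray(x_train, dtype=float).reshape(1, -1)   # shape (1, n)
    y_train = np.asarray(y_train, dtype=float)                  # shape (n,)

    # Kernel weights K_h(x - x_i) for every query/training pair.
    weights = gaussian_kernel((x_query - x_train) / h) / h      # shape (m, n)

    # Locally weighted average; the denominator normalizes the weights to sum to 1.
    return (weights @ y_train) / weights.sum(axis=1)

# Example usage with noisy samples from a non-linear relation y = sin(x) + noise.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=200)
y = np.sin(x) + 0.2 * rng.standard_normal(200)

x_grid = np.linspace(0.0, 2.0 * np.pi, 5)
print(nadaraya_watson(x_grid, x, y, h=0.3))

With a smaller bandwidth the estimate tracks the data more closely, while a larger bandwidth gives a smoother curve; choosing $h$ is the usual bias-variance trade-off.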

[ "Kernel (statistics)", "Kernel method", "Regression", "Nonparametric statistics", "Estimator" ]