
Bussgang theorem

In mathematics, the Bussgang theorem is a theorem of stochastic analysis. The theorem states that the cross-correlation of a Gaussian signal before and after it has passed through a nonlinear operation is equal to the signal's autocorrelation up to a constant. It was first published by Julian J. Bussgang in 1952 while he was at the Massachusetts Institute of Technology.

Let {X(t)} be a zero-mean stationary Gaussian random process and {Y(t)} = g(X(t)), where g(·) is a nonlinear amplitude distortion. If R_X(τ) is the autocorrelation function of {X(t)}, then the cross-correlation function of {X(t)} and {Y(t)} is

R_XY(τ) = C · R_X(τ),

where C is a constant that depends only on g(·).
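The statement can be checked numerically: for a stationary Gaussian process pushed through a nonlinearity, the ratio R_XY(τ)/R_X(τ) should be (nearly) the same at every lag. The sketch below, assuming a hard limiter g(x) = sign(x) as the distortion and a simple FIR-filtered noise process (both chosen here purely for illustration), estimates that ratio empirically and compares it with the known closed form for the limiter, C = E[X g(X)]/E[X²] = √(2/π)/σ_X.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean stationary Gaussian process: white noise through a short FIR filter
# (an illustrative choice; any stationary Gaussian process would do).
n = 200_000
h = 0.9 ** np.arange(20)
x = np.convolve(rng.standard_normal(n), h, mode="same")

# Illustrative nonlinear distortion: a hard limiter g(x) = sign(x).
y = np.sign(x)

def corr(a, b, lag):
    """Empirical correlation E[a(t) b(t+lag)] for a stationary pair."""
    if lag == 0:
        return np.mean(a * b)
    return np.mean(a[:-lag] * b[lag:])

# Bussgang: R_XY(tau) = C * R_X(tau), so this ratio is constant across lags.
ratios = [corr(x, y, k) / corr(x, x, k) for k in range(6)]

# For the hard limiter, C = E[X g(X)] / E[X^2] = sqrt(2/pi) / sigma_X.
C_theory = np.sqrt(2 / np.pi) / x.std()
print(ratios, C_theory)
```

The estimated ratios agree with C_theory up to sampling error, which is the content of the theorem: the nonlinearity rescales the cross-correlation but leaves its shape in τ unchanged.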
