# Wiener–Khinchin theorem

The Wiener–Khinchin theorem (also known as the Wiener–Khintchine theorem and sometimes as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem) states that the autocorrelation function of a wide-sense-stationary random process has a spectral decomposition given by the power spectrum of that process.[1][2][3][4][5][6][7]

## History

Norbert Wiener first published this theorem in 1930;[8] Aleksandr Khinchin independently discovered the result[9] and published it in 1934.[10] Albert Einstein had probably anticipated the idea in a brief two-page memo in 1914.[11]


## The case of a continuous time process

For continuous time, the Wiener–Khinchin theorem[12][13] says that if $x$ is a wide-sense-stationary process whose autocorrelation function (sometimes called the autocovariance), defined in terms of the statistical expected value $\operatorname{E}$ as $r_{xx}(\tau) = \operatorname{E}\big[\, x(t)x^*(t-\tau) \,\big]$, exists and is finite at every lag $\tau$, then there exists a monotone function $F(f)$ in the frequency domain $-\infty < f < \infty$ such that

$r_{xx} (\tau) = \int_{-\infty}^\infty e^{2\pi i\tau f} dF(f)$

where the integral is a Stieltjes integral. This is a kind of spectral decomposition of the auto-correlation function. F is called the power spectral distribution function, and is a statistical distribution function. It is sometimes called the integrated spectrum.

(The asterisk denotes complex conjugate, and of course it can be omitted if the random process is real-valued.)

Note that the Fourier transform of $x(t)\,$ does not exist in general, because stationary random functions are not generally either square integrable or absolutely integrable. Nor is $r_{xx}$ assumed to be absolutely integrable, so it need not have a Fourier transform, either.

But if $F(f)$ is absolutely continuous, for example if the process is purely indeterministic, then one can define the power spectral density of $x(t)\,$ by taking the derivative of $F$, putting $S_{xx}(f) = F'(f)$ almost everywhere,[14] and the theorem simplifies to

$r_{xx} (\tau) = \int_{-\infty}^\infty S_{xx}(f) e^{2\pi i\tau f} df.$

If now one assumes that $r$ and $S$ satisfy the necessary conditions for Fourier inversion to be valid, the Wiener–Khinchin theorem takes the simple form of saying that $r$ and $S$ are a Fourier-transform pair, and

$S_{xx}(f) = \int_{-\infty}^\infty r_{xx} (\tau) e^{-2\pi if\tau} d\tau.$
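To make the transform pair concrete, here is a small numerical sketch. The pair $r(\tau) = e^{-a|\tau|}$ and $S_{xx}(f) = 2a/\big(a^2 + (2\pi f)^2\big)$ is a standard textbook example of an exponentially decaying autocorrelation and its power spectral density; it is an assumed illustration, not a case discussed in the text. Numerically inverting $S_{xx}$ should recover $r$:

```python
import numpy as np

# Assumed example pair (not from the text): an exponentially decaying
# autocorrelation r(tau) = exp(-a|tau|) has power spectral density
# S(f) = 2a / (a^2 + (2*pi*f)^2), a standard Fourier-transform pair.
a = 1.0
f = np.linspace(-200.0, 200.0, 400001)       # frequency grid for the inversion
S = 2 * a / (a**2 + (2 * np.pi * f)**2)      # power spectral density S_xx(f)

# Recover r(tau) by numerical Fourier inversion; since S is real and even,
# the imaginary part of the integral vanishes and a cosine transform suffices.
taus = np.array([0.0, 0.5, 1.0, 2.0])
r_numeric = np.array([np.trapz(S * np.cos(2 * np.pi * f * t), f) for t in taus])
r_exact = np.exp(-a * np.abs(taus))
print(np.round(r_numeric, 4), np.round(r_exact, 4))
```

The small residual discrepancy comes entirely from truncating the frequency grid; widening it shrinks the error.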


## The case of a discrete time process

For a discrete-time process $x[n]\,$, the power spectral density is

$S_{xx}(f)=\sum_{k=-\infty}^\infty r_{xx}[k]e^{-i(2\pi f) k}$,

where

$r_{xx}[k] = \operatorname{E}\big[ \, x[n] x^*[n-k] \, \big]$

is the discrete autocorrelation function of $x[n]\,$, provided this sequence is absolutely summable. Because $x[n]\,$ is a sampled, discrete-time sequence, its spectral density is periodic in the frequency domain, with period 1 in the normalized frequency $f$.
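As a hedged illustration of the discrete-time formula (the AR(1) model below is an assumed example, not one discussed in the text), one can check numerically that summing $r_{xx}[k]e^{-i2\pi fk}$ reproduces a known closed-form spectrum and that the result is periodic with period 1. For $x[n] = \varphi\,x[n-1] + w[n]$ with white noise $w$ of variance $\sigma^2$, the standard results are $r_{xx}[k] = \sigma^2\varphi^{|k|}/(1-\varphi^2)$ and $S_{xx}(f) = \sigma^2/\,|1-\varphi e^{-i2\pi f}|^2$:

```python
import numpy as np

# Assumed AR(1) example: r[k] = sigma2 * phi**|k| / (1 - phi**2),
# an absolutely summable autocorrelation sequence.
phi, sigma2 = 0.6, 1.0
k = np.arange(-200, 201)                       # truncate the bilateral sum
r = sigma2 * phi**np.abs(k) / (1 - phi**2)

# S_xx(f) = sum_k r[k] exp(-i 2 pi f k); real because r is even,
# so a cosine sum suffices. Evaluate over two periods of f.
f = np.linspace(-1.0, 1.0, 2001)
S = np.cos(2 * np.pi * np.outer(f, k)) @ r

# Closed form for comparison: sigma2 / |1 - phi exp(-i 2 pi f)|^2.
S_exact = sigma2 / np.abs(1 - phi * np.exp(-2j * np.pi * f))**2
print(np.max(np.abs(S - S_exact)))             # truncation error only
print(np.max(np.abs(S[:1001] - S[1000:])))     # periodicity: S(f) = S(f + 1)
```

The truncation at $|k| \le 200$ is harmless here because $\varphi^{|k|}$ decays geometrically; both printed discrepancies are at round-off level.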


## Application

The theorem is useful for analyzing linear time-invariant (LTI) systems when the inputs and outputs are not square integrable, so their Fourier transforms do not exist. A corollary is that the Fourier transform of the autocorrelation function of the output of an LTI system is equal to the product of the Fourier transform of the autocorrelation function of the input and the squared magnitude of the Fourier transform of the system's impulse response.[15] This holds even when the Fourier transforms of the input and output signals do not exist because these signals are not square integrable, so the system's input and output cannot be related directly through the Fourier transform of the impulse response.

Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output is equal to the power spectrum of the input times the power transfer function.
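A numerical sketch of this corollary, using assumed illustrative values (a white-noise input and a short FIR impulse response, neither taken from the text): averaging periodograms of the filtered output over many realizations should approach $|H(f)|^2$ times the input power spectrum:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example system: a short FIR filter driven by white noise,
# so the input power spectrum is flat, S_in(f) = sigma2.
h = np.array([0.5, 0.3, 0.2])          # impulse response of the LTI system
N, trials = 1024, 2000
sigma2 = 1.0

# Average periodograms of the filtered output over many realizations.
psd = np.zeros(N)
for _ in range(trials):
    x = rng.normal(0.0, np.sqrt(sigma2), N)
    y = np.convolve(x, h)[:N]                   # system output (zero initial state)
    psd += np.abs(np.fft.fft(y))**2 / N
psd /= trials

H = np.fft.fft(h, N)                    # frequency response on the same grid
predicted = np.abs(H)**2 * sigma2       # power transfer function times S_in
print(np.max(np.abs(psd - predicted)))  # small, up to averaging noise
```

The averaged periodogram converges to the predicted output spectrum as the number of realizations grows; the residual here is the usual $O(1/\sqrt{\text{trials}})$ estimation noise plus a negligible start-up transient.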

This corollary is used in the parametric method for power spectrum estimation.


## Discrepancies in terminology

In many textbooks and in much of the technical literature it is tacitly assumed that Fourier inversion of the autocorrelation function and the power spectral density is valid, and the Wiener–Khinchin theorem is stated, very simply, as if it said that the Fourier transform of the autocorrelation function equals the power spectral density, ignoring all questions of convergence[16] (Einstein is an example). But the theorem (as stated here) was applied by Norbert Wiener and Aleksandr Khinchin to the sample functions (signals) of wide-sense-stationary random processes, signals whose Fourier transforms do not exist. The whole point of Wiener's contribution was to make sense of the spectral decomposition of the autocorrelation function of a sample function of a wide-sense-stationary random process even when the integrals for the Fourier transform and Fourier inversion do not make sense.

Some authors refer to R as the autocovariance function. They then proceed to normalise it, by dividing by R(0), to obtain what they refer to as the autocorrelation function.


## Notes

Brockwell, Peter J., and Davis, Richard A., Introduction to Time Series and Forecasting, Second Edition, Springer-Verlag, New York, 2002.

Chatfield, C., The Analysis of Time Series—An Introduction, fourth ed., Chapman and Hall, London, 1989.

Fuller, Wayne, Introduction to Statistical Time Series (Wiley Series in Probability and Statistics) second ed., Wiley, New York, 1996.

Wiener, Norbert, Extrapolation, Interpolation, and Smoothing of Stationary Time Series, Technology Press and Johns Hopkins Univ. Press, Cambridge, Massachusetts, 1949 (a classified document written for the Dept. of War in 1943).

Yaglom, A.M. An Introduction to the Theory of Stationary Random Functions, Prentice-Hall, Englewood Cliffs, New Jersey, 1962.

1. ^ C. Chatfield (1989). The Analysis of Time Series—An Introduction (fourth ed.). Chapman and Hall, London. pp. 94–95. ISBN 0-412-31820-2.
2. ^ Norbert Wiener (1964). Time Series. M.I.T. Press, Cambridge, Massachusetts. p. 42.
3. ^ Hannan, E.J., "Stationary Time Series", in: John Eatwell, Murray Milgate, and Peter Newman, editors, The New Palgrave: A Dictionary of Economics. Time Series and Statistics, Macmillan, London, 1990, p. 271.
4. ^ Dennis Ward Ricker (2003). Echo Signal Processing. Springer. ISBN 1-4020-7395-X.
5. ^ Leon W. Couch II (2001). Digital and Analog Communications Systems (sixth ed.). Prentice Hall, New Jersey. pp. 406–409. ISBN 0-13-522583-3.
6. ^ Krzysztof Iniewski (2007). Wireless Technologies: Circuits, Systems, and Devices. CRC Press. ISBN 0-8493-7996-2.
7. ^ Joseph W. Goodman (1985). Statistical Optics. Wiley-Interscience. ISBN 0-471-01502-4.
8. ^ Wiener, Norbert (1930). "Generalized Harmonic Analysis". Acta Mathematica 55: 117–258.
9. ^ Nahin, Paul J. (2011). Dr. Euler's Fabulous Formula: Cures Many Mathematical Ills. Princeton University Press. p. 225. ISBN 9780691150376.
10. ^ Khintchine, A. (1934). "Korrelationstheorie der stationären stochastischen Prozesse". Mathematische Annalen 109: 604–615.
11. ^ Jerison, David; Singer, Isadore Manuel; Stroock, Daniel W. (1997). The Legacy of Norbert Wiener: A Centennial Symposium (Proceedings of Symposia in Pure Mathematics). American Mathematical Society. p. 95. ISBN 0-8218-0415-4.
12. ^ Hannan, E.J., "Stationary Time Series", in: John Eatwell, Murray Milgate, and Peter Newman, editors, The New Palgrave: A Dictionary of Economics. Time Series and Statistics, Macmillan, London, 1990, p. 271.
13. ^ C. Chatfield (1989). The Analysis of Time Series—An Introduction (fourth ed.). Chapman and Hall, London. pp. 94–95. ISBN 0-412-31820-2.
14. ^ C. Chatfield (1989). The Analysis of Time Series—An Introduction (fourth ed.). Chapman and Hall, London. p. 96. ISBN 0-412-31820-2.
15. ^ Shlomo Engelberg (2007). Random signals and noise: a mathematical introduction. CRC Press. p. 130. ISBN 978-0-8493-7554-5.
16. ^ C. Chatfield (1989). The Analysis of Time Series—An Introduction (fourth ed.). Chapman and Hall, London. p. 98. ISBN 0-412-31820-2.