
Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies,

$$H(|f|^2) + H(|g|^2) \geq \log \frac{e}{2}\,,$$

where H denotes the differential Shannon entropy, H(p) = -\int p(x) \log p(x)\, dx. This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function f and its Fourier transform g such that

$$g(y) \approx \int_{-\infty}^{\infty} \exp(-2\pi i x y)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^{\infty} \exp(2\pi i x y)\, g(y)\, dy,$$

where the "≈" indicates convergence in L^2, normalized so that (by Plancherel's theorem)

$$\int_{-\infty}^{\infty} |f(x)|^2\, dx = \int_{-\infty}^{\infty} |g(y)|^2\, dy = 1.$$
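The bound is saturated by Gaussians: under the e^{-2\pi i x y} convention above, f(x) = 2^{1/4} e^{-\pi x^2} is its own Fourier transform, and its entropy sum equals log(e/2) exactly. The sketch below checks this numerically; it approximates the continuous Fourier transform with a centered FFT, and the grid size N, window L, and the Gaussian test function are illustrative choices, not part of the original text.

```python
import numpy as np

# Spatial grid; the FFT fixes the matching frequency grid.
N = 4096
L = 20.0                       # window [-L/2, L/2); Gaussian tails are negligible here
dx = L / N
x = (np.arange(N) - N // 2) * dx

# Normalized Gaussian, f(x) = 2^{1/4} exp(-pi x^2), its own Fourier
# transform under the exp(-2*pi*i*x*y) convention, with int |f|^2 dx = 1.
f = 2**0.25 * np.exp(-np.pi * x**2)

# Continuous transform g(y) ~ dx * sum_n f(x_n) exp(-2*pi*i*x_n*y),
# approximated by an FFT; ifftshift/fftshift center x = 0 and y = 0.
g = dx * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f)))
y = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dy = y[1] - y[0]

def entropy(p, step):
    """Differential Shannon entropy -int p log p, as a Riemann sum."""
    p = p[p > 1e-300]          # drop zeros to avoid log(0)
    return -np.sum(p * np.log(p)) * step

Hf = entropy(np.abs(f)**2, dx)
Hg = entropy(np.abs(g)**2, dy)

print(f"H(|f|^2) + H(|g|^2) = {Hf + Hg:.6f}")
print(f"log(e/2)            = {np.log(np.e / 2):.6f}")  # Hirschman bound
```

Both printed values should agree to several decimal places (about 0.306853 in natural-log units), confirming that the Gaussian attains the lower bound with equality.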

[ "Uncertainty principle" ]
Parent Topic
Child Topic
    No Parent Topic