In physics, the optical theorem is a general law of wave scattering theory which relates the forward scattering amplitude to the total cross section of the scatterer. It is usually written in the form

\sigma_{\mathrm{tot}} = \frac{4\pi}{k}\,\operatorname{Im} f(0),

where f(0) is the scattering amplitude at a scattering angle of zero, that is, the amplitude of the wave scattered to the center of a distant screen, and k is the wave vector in the incident direction. Because the optical theorem is derived using only conservation of energy, or in quantum mechanics from conservation of probability, it is widely applicable and, in quantum mechanics, \sigma_{\mathrm{tot}} includes both elastic and inelastic scattering. Note that the above form is for an incident plane wave; a more general form, involving arbitrary outgoing directions \mathbf{k}', was derived by Werner Heisenberg.

The optical theorem implies that an object that scatters any light at all (or electrons, neutrons, etc.) will have a nonzero forward scattering amplitude f(0). However, the physically observed field in the forward direction is the sum of the (nonzero) scattered field and the incident field, which may add to zero.

The optical theorem was originally developed independently by Wolfgang von Sellmeier and Lord Rayleigh in 1871. Lord Rayleigh recognized the forward scattering amplitude in terms of the index of refraction as

n = 1 + \frac{2\pi N f(0)}{k^2}

(where N is the number density of scatterers), which he used in a study of the color and polarization of the sky. The equation was later extended to quantum scattering theory by several individuals, and came to be known as the Bohr–Peierls–Placzek relation after an unpublished 1939 paper. It was first referred to as the 'optical theorem' in print in 1955 by Hans Bethe and Frederic de Hoffmann, after it had been known as a 'well known theorem of optics' for some time.

The theorem can be derived rather directly from a treatment of a scalar wave. If a plane wave is incident along the positive z axis on an object, then the wave amplitude a great distance away from the scatterer is approximately given by

\psi(\mathbf{r}) \approx e^{ikz} + f(\theta)\,\frac{e^{ikr}}{r}.

All higher terms, when squared, vanish more quickly than 1/r^2, and so are negligible a great distance away. For large values of z and for small angles, a Taylor expansion gives us

r = \sqrt{x^2 + y^2 + z^2} \approx z + \frac{x^2 + y^2}{2z}.
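As a quick numerical illustration of the relation \sigma_{\mathrm{tot}} = (4\pi/k)\,\operatorname{Im} f(0), the short Python sketch below builds a scattering amplitude from a partial-wave expansion and checks that the total elastic cross section equals (4\pi/k) times the imaginary part of the forward amplitude. The wave number and phase-shift values are arbitrary assumptions chosen purely for illustration; they do not come from the text or from any particular scatterer.

import numpy as np

# Minimal numerical sketch of the optical theorem, sigma_tot = (4*pi/k) * Im f(0),
# using a partial-wave expansion. The wave number k and the phase shifts delta_l
# below are assumed illustrative values, not taken from any particular scatterer.

k = 1.7                                   # wave number (arbitrary units)
delta = np.array([0.9, 0.5, 0.2, 0.05])   # assumed phase shifts for l = 0, 1, 2, 3
l = np.arange(len(delta))

# Forward scattering amplitude: f(0) = (1/k) * sum_l (2l+1) e^{i delta_l} sin(delta_l),
# using P_l(1) = 1 for the Legendre polynomials in the forward direction.
f0 = np.sum((2 * l + 1) * np.exp(1j * delta) * np.sin(delta)) / k

# Total (elastic) cross section: sigma_tot = (4*pi/k^2) * sum_l (2l+1) sin^2(delta_l)
sigma_tot = 4 * np.pi / k**2 * np.sum((2 * l + 1) * np.sin(delta) ** 2)

print("sigma_tot          =", sigma_tot)
print("(4*pi/k) Im f(0)   =", 4 * np.pi / k * f0.imag)   # agrees with sigma_tot

Running the sketch prints the same number twice, as the theorem requires for purely elastic scattering; with inelastic channels present, \sigma_{\mathrm{tot}} on the left would also include the inelastic contribution.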