Pinsker's inequality

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors. Pinsker's inequality states that, if $P$ and $Q$ are two probability distributions on a measurable space $(X, \Sigma)$, then

$$\delta(P, Q) \leq \sqrt{\frac{1}{2} D_{\mathrm{KL}}(P \parallel Q)},$$

where $\delta(P, Q) = \sup \{\, |P(A) - Q(A)| : A \in \Sigma \,\}$ is the total variation distance between $P$ and $Q$, and $D_{\mathrm{KL}}(P \parallel Q)$ is the Kullback–Leibler divergence measured in nats.
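To make the statement concrete, the following minimal Python sketch (all names here are illustrative, not from the source) computes both sides of the inequality for two discrete distributions and checks that the total variation distance never exceeds the Pinsker bound:

```python
import numpy as np

def total_variation(p, q):
    # Total variation distance: sup_A |P(A) - Q(A)| = (1/2) * sum_i |p_i - q_i|
    return 0.5 * np.abs(p - q).sum()

def kl_divergence(p, q):
    # Kullback-Leibler divergence D_KL(P || Q) in nats.
    # Assumes q_i > 0 wherever p_i > 0; terms with p_i = 0 contribute 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two example distributions on a three-element alphabet (illustrative values)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

lhs = total_variation(p, q)               # delta(P, Q)
rhs = np.sqrt(0.5 * kl_divergence(p, q))  # sqrt(D_KL(P || Q) / 2)

print(f"total variation: {lhs:.6f}")      # 0.100000
print(f"Pinsker bound:   {rhs:.6f}")      # ~0.112399
assert lhs <= rhs                         # Pinsker's inequality holds
```

Note that the converse direction fails: the Kullback–Leibler divergence cannot be bounded by any function of the total variation distance alone, since $D_{\mathrm{KL}}(P \parallel Q)$ is infinite whenever $P$ assigns positive mass to a set on which $Q$ vanishes.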

[ "Inequality", "Information theory", "Upper and lower bounds", "Divergence", "Kullback–Leibler divergence" ]
Parent Topic
Child Topic
    No Parent Topic