In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.

Pinsker's inequality states that, if \(P\) and \(Q\) are two probability distributions on a measurable space \((X, \Sigma)\), then

\[
\delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)},
\]

where \(\delta(P, Q)\) is the total variation distance between \(P\) and \(Q\) and \(D_{\mathrm{KL}}(P \parallel Q)\) is the Kullback–Leibler divergence measured in nats.
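The bound can be checked numerically for discrete distributions. The sketch below is illustrative (the distributions `p` and `q` are arbitrary examples, not from the source): it computes the total variation distance as half the L1 distance between the probability mass functions and the KL divergence in nats, then verifies Pinsker's inequality.

```python
import math

def tv_distance(p, q):
    # Total variation distance: half the L1 distance between the pmfs
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # Kullback-Leibler divergence in nats; assumes q_i > 0 wherever p_i > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distributions on a three-element space (illustrative values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

delta = tv_distance(p, q)                      # 0.1
bound = math.sqrt(0.5 * kl_divergence(p, q))   # ~0.112
print(delta <= bound)  # Pinsker: delta(P,Q) <= sqrt(D_KL(P||Q)/2)
```

Note that the constant 1/2 under the square root is tied to measuring the divergence in nats; stating the divergence in bits changes the constant accordingly.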