
Log sum inequality

The log sum inequality is an inequality that is useful for proving several theorems in information theory. Let $a_1, \ldots, a_n$ and $b_1, \ldots, b_n$ be nonnegative numbers, and denote the sum of all $a_i$'s by $a$ and the sum of all $b_i$'s by $b$. The log sum inequality states that

$$\sum_{i=1}^{n} a_i \log \frac{a_i}{b_i} \;\ge\; a \log \frac{a}{b},$$

with equality if and only if the ratios $\frac{a_i}{b_i}$ are equal for all $i$, in other words $a_i = c\,b_i$ for all $i$.

To prove this, set $f(x) = x \log x$ and note that

$$\sum_{i=1}^{n} a_i \log \frac{a_i}{b_i} \;=\; \sum_{i=1}^{n} b_i\, f\!\left(\frac{a_i}{b_i}\right) \;=\; b \sum_{i=1}^{n} \frac{b_i}{b}\, f\!\left(\frac{a_i}{b_i}\right) \;\ge\; b\, f\!\left(\sum_{i=1}^{n} \frac{b_i}{b} \cdot \frac{a_i}{b_i}\right) \;=\; b\, f\!\left(\frac{a}{b}\right) \;=\; a \log \frac{a}{b},$$

where the inequality follows from Jensen's inequality, since $\frac{b_i}{b} \ge 0$, $\sum_i \frac{b_i}{b} = 1$, and $f$ is convex.

The log sum inequality can be used to prove several inequalities in information theory, such as Gibbs' inequality and the convexity of the Kullback–Leibler divergence. For example, to prove Gibbs' inequality it is enough to substitute $p_i$'s for the $a_i$'s and $q_i$'s for the $b_i$'s: since both distributions sum to $1$, this gives

$$\sum_{i=1}^{n} p_i \log \frac{p_i}{q_i} \;\ge\; 1 \cdot \log \frac{1}{1} \;=\; 0.$$

The inequality remains valid for $n = \infty$, provided that $a < \infty$ and $b < \infty$. The proof above holds for any function $g$ such that $f(x) = x\,g(x)$ is convex, for example any continuous non-decreasing $g$. Generalizations to convex functions other than the logarithm are given in Csiszár, 2004.
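As a numerical sanity check, the two sides of the inequality can be compared directly. The sketch below (function names are illustrative, not from any library) evaluates both sides for strictly positive inputs and verifies the equality case $a_i = c\,b_i$:

```python
import math
import random

def log_sum_lhs(a, b):
    """Left-hand side: sum_i a_i * log(a_i / b_i). Assumes all entries positive."""
    return sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))

def log_sum_rhs(a, b):
    """Right-hand side: (sum_i a_i) * log(sum_i a_i / sum_i b_i)."""
    sa, sb = sum(a), sum(b)
    return sa * math.log(sa / sb)

random.seed(0)
a = [random.uniform(0.1, 5.0) for _ in range(6)]
b = [random.uniform(0.1, 5.0) for _ in range(6)]

# The inequality: LHS >= RHS for nonnegative inputs.
assert log_sum_lhs(a, b) >= log_sum_rhs(a, b)

# Equality case: a_i = c * b_i for all i makes the two sides coincide.
c = 2.5
a_prop = [c * bi for bi in b]
assert abs(log_sum_lhs(a_prop, b) - log_sum_rhs(a_prop, b)) < 1e-9
```

Strictly positive values are used here to avoid $\log 0$; the usual information-theoretic conventions $0 \log 0 = 0$ and $0 \log \frac{0}{0} = 0$ would handle zero entries.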

[ "Inequality", "Rearrangement inequality", "Gronwall's inequality", "Kantorovich inequality", "Hölder's inequality", "Poincaré inequality" ]
Parent Topic
Child Topic
    No Parent Topic