Kolmogorov's three-series theorem

In probability theory, Kolmogorov's three-series theorem, named after Andrey Kolmogorov, gives a criterion for the almost sure convergence of an infinite series of random variables in terms of the convergence of three different series involving properties of their probability distributions. Kolmogorov's three-series theorem, combined with Kronecker's lemma, can be used to give a relatively easy proof of the strong law of large numbers.

Statement of the theorem

Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables. The random series $\sum_{n=1}^{\infty} X_n$ converges almost surely in $\mathbb{R}$ if and only if the following conditions hold for some $A > 0$:

(i) $\sum_{n=1}^{\infty} \mathbb{P}(|X_n| \geq A)$ converges;

(ii) letting $Y_n = X_n \mathbf{1}_{\{|X_n| \leq A\}}$, the series $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$ converges;

(iii) $\sum_{n=1}^{\infty} \operatorname{var}(Y_n)$ converges.

Proof

Sufficiency of conditions. Condition (i) and the Borel–Cantelli lemma give that $X_n = Y_n$ for $n$ large, almost surely. Hence $\sum_{n=1}^{\infty} X_n$ converges if and only if $\sum_{n=1}^{\infty} Y_n$ converges. Conditions (ii) and (iii), together with Kolmogorov's two-series theorem, give the almost sure convergence of $\sum_{n=1}^{\infty} Y_n$.

Necessity of conditions. Suppose that $\sum_{n=1}^{\infty} X_n$ converges almost surely. Without condition (i), by the second Borel–Cantelli lemma there would exist some $A > 0$ such that $|X_n| \geq A$ for infinitely many $n$, almost surely. But then the terms of the series would not tend to zero, so the series would diverge. Therefore, we must have condition (i).

Condition (iii) implies condition (ii): Kolmogorov's two-series theorem, along with condition (i) applied to the case $A = 1$, gives the convergence of $\sum_{n=1}^{\infty} (Y_n - \mathbb{E}[Y_n])$. So, given the convergence of $\sum_{n=1}^{\infty} Y_n$, the series $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$ converges as well, and condition (ii) follows. Thus it only remains to demonstrate the necessity of condition (iii), and we will have obtained the full result.

It is equivalent to check condition (iii) for the symmetrized series $\sum_{n=1}^{\infty} Z_n = \sum_{n=1}^{\infty} (Y_n - Y'_n)$, where for each $n$ the variables $Y_n$ and $Y'_n$ are IID; that is, to employ the assumption that $\mathbb{E}[Y_n] = 0$, since $(Z_n)$ is a sequence of random variables bounded by $2$, converging almost surely, and with $\operatorname{var}(Z_n) = 2\operatorname{var}(Y_n)$. So we wish to check that if $\sum_{n=1}^{\infty} Z_n$ converges, then $\sum_{n=1}^{\infty} \operatorname{var}(Z_n)$ converges as well. This is a special case of a more general result from martingale theory, with summands equal to the increments of a martingale sequence and the same conditions ($\mathbb{E}[Z_n] = 0$, the series of the variances converges, and the summands are bounded).
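To make the three conditions concrete, here is a minimal Monte Carlo sketch in Python. The test family $X_n \sim \mathcal{N}(0, 1/n^2)$, the truncation level $A = 1$, and the sample sizes are illustrative assumptions of ours, not part of the theorem; the sketch simply estimates the $n$-th term of each of the three series and accumulates the partial sums.

import numpy as np

rng = np.random.default_rng(42)

def three_series_terms(sampler, n, A, n_samples=20_000):
    """Monte Carlo estimate of the n-th term of each of the three series:
    P(|X_n| >= A), E[Y_n], and var(Y_n), with Y_n = X_n * 1{|X_n| <= A}."""
    x = sampler(n, n_samples)
    y = np.where(np.abs(x) <= A, x, 0.0)  # the truncated variable Y_n
    return (np.abs(x) >= A).mean(), y.mean(), y.var()

def normal_family(n, m):
    """Illustrative choice only: X_n ~ Normal(0, 1/n^2)."""
    return rng.normal(0.0, 1.0 / n, size=m)

A = 1.0
s_tail = s_mean = s_var = 0.0
for n in range(1, 1001):
    p_tail, mean_y, var_y = three_series_terms(normal_family, n, A)
    s_tail += p_tail
    s_mean += mean_y
    s_var += var_y

# All three running sums stabilize at finite values, so by the theorem
# the random series sum_n X_n converges almost surely for this family.
print(s_tail, s_mean, s_var)

For this family all three running sums settle at finite values, so the theorem predicts almost sure convergence of $\sum_{n=1}^{\infty} X_n$; choosing instead a family whose truncated variances are not summable would make the third sum grow without bound.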
Example

As an illustration of the theorem, consider the example of the harmonic series with random signs:

$$\sum_{n=1}^{\infty} \pm \frac{1}{n}.$$

Here "$\pm$" means that each term $1/n$ is taken with a random sign that is either $1$ or $-1$ with respective probabilities $1/2,\ 1/2$, and all random signs are chosen independently. Let $X_n$ in the theorem denote a random variable that takes the values $1/n$ and $-1/n$ with equal probabilities. With $A = 2$ the summands of the first two series are identically zero and $\operatorname{var}(Y_n) = n^{-2}$. The conditions of the theorem are then satisfied, so it follows that the harmonic series with random signs converges almost surely. On the other hand, the analogous series of (for example) square root reciprocals with random signs, namely

$$\sum_{n=1}^{\infty} \pm \frac{1}{\sqrt{n}},$$

diverges almost surely, since condition (iii) fails: here $\operatorname{var}(Y_n) = n^{-1}$ for all large $n$, and $\sum_{n=1}^{\infty} n^{-1}$ diverges.
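The contrast between the two random-sign series can also be seen numerically. Below is a small simulation sketch (the function name partial_sum_spread and the parameter values are ours, chosen only for illustration) that samples many independent sign sequences and measures how spread out the partial sums $S_N$ are across sample paths.

import numpy as np

def partial_sum_spread(p, n_paths=500, N=20_000, seed=0):
    """Standard deviation, across independent sample paths, of the
    partial sum S_N = sum_{n=1}^{N} (random sign) / n**p."""
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=(n_paths, N))
    terms = signs / np.arange(1, N + 1, dtype=float) ** p
    return terms.sum(axis=1).std()

# p = 1: var(Y_n) = n**-2 is summable, S_N converges almost surely,
# and the spread stays bounded near sqrt(pi**2 / 6) ~ 1.28.
print(partial_sum_spread(p=1.0))

# p = 1/2: var(Y_n) = n**-1 is not summable, condition (iii) fails,
# and the spread grows like sqrt(log N) as N increases.
print(partial_sum_spread(p=0.5))

The bounded spread for $p = 1$ and the slowly growing spread for $p = 1/2$ mirror which side of condition (iii) each series falls on.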

[ "Statistics", "Discrete mathematics", "Mathematical analysis" ]
Parent Topic
Child Topic
    No Parent Topic