Kolmogorov's Three-Series Theorem is a fundamental result in probability theory that gives conditions under which an infinite series of independent random variables converges almost surely.
Suppose that $X_1, X_2, \ldots$ is a sequence of independent random variables, let $S_n = X_1 + \cdots + X_n$, and let $A$ be the set of sample points $\omega$ for which $\sum_{n \ge 1} X_n(\omega)$ converges to a finite limit. It follows from Kolmogorov's zero-one law that $P(A) = 0$ or $1$, i.e. the series $\sum_{n \ge 1} X_n$ converges or diverges almost surely (a.s.). The aim of this article is to give criteria that determine whether a sum of independent random variables converges or diverges.
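The zero-one law says nothing about which alternative actually occurs. As a small numerical sketch (the example $X_n = \varepsilon_n / n$ with fair random signs $\varepsilon_n = \pm 1$ is my own illustration, not from the text), the partial sums of such a series stabilize, consistent with a.s. convergence:

```python
import random

def partial_sums(n_terms, seed):
    """Partial sums of sum_{n>=1} eps_n / n with fair random signs eps_n = ±1."""
    rng = random.Random(seed)
    s, sums = 0.0, []
    for n in range(1, n_terms + 1):
        s += rng.choice((-1.0, 1.0)) / n
        sums.append(s)
    return sums

# The tail of the partial-sum sequence barely moves, consistent with
# a.s. convergence of this particular series.
sums = partial_sums(100_000, seed=0)
tail_spread = max(sums[-1000:]) - min(sums[-1000:])
print(tail_spread)  # small: the last 1000 partial sums are nearly constant
```

Any single run is, of course, only one sample point $\omega$; the theorems below explain why this behavior holds for almost every $\omega$.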
Theorem about convergence of independent centered random variables
The following theorem is Theorem 1, page 6 of [Shiryaev].
This result is due to Kolmogorov and Khinchin.
Theorem (Kolmogorov and Khinchin). Suppose that $X_1, X_2, \ldots$ is a sequence of independent random variables with $E X_n = 0$, $n \ge 1$. If
$$\sum_n E X_n^2 < \infty, \tag{1}$$
then the series $\sum_n X_n$ converges a.s. Moreover, if the random variables $\{X_n, n \ge 1\}$ are uniformly bounded (i.e. $P(|X_n| \le c) = 1$ for some $c < \infty$), the converse is true: the convergence of $\sum_n X_n$ a.s. implies (1).
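As a quick check of the hypothesis for the (assumed, illustrative) example $X_n = \varepsilon_n / n$ with fair signs: $E X_n = 0$ and $E X_n^2 = 1/n^2$, so the variance series in (1) converges to $\pi^2/6$ and the theorem yields a.s. convergence of $\sum_n X_n$:

```python
import math

# Variance series for the assumed example X_n = eps_n / n (fair ±1 signs):
# E X_n = 0 and E X_n^2 = 1/n^2, so condition (1) of the theorem holds.
variance_series = sum(1.0 / n**2 for n in range(1, 200_001))
print(variance_series, math.pi**2 / 6)  # the partial sum approaches pi^2/6
```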
Proof. Sufficiency. The sequence $\{S_n, n \ge 1\}$ converges a.s. if and only if it is fundamental (Cauchy) a.s. The sequence $\{S_n, n \ge 1\}$ is fundamental a.s. if and only if
$$P\Big\{\sup_{k \ge 1} |S_{n+k} - S_n| \ge \varepsilon\Big\} \to 0, \quad n \to \infty. \tag{2}$$
By Kolmogorov's inequality,
$$P\Big\{\sup_{k \ge 1} |S_{n+k} - S_n| \ge \varepsilon\Big\} = \lim_{N \to \infty} P\Big\{\max_{1 \le k \le N} |S_{n+k} - S_n| \ge \varepsilon\Big\} \le \lim_{N \to \infty} \frac{1}{\varepsilon^2} \sum_{k=n+1}^{n+N} E X_k^2 = \frac{1}{\varepsilon^2} \sum_{k=n+1}^{\infty} E X_k^2.$$
Therefore (2) is satisfied whenever $\sum_{k=1}^{\infty} E X_k^2 < \infty$; consequently $(S_n)$ is a.s. a Cauchy sequence, and hence $\lim_{n \to \infty} S_n(\omega)$ exists a.s.
Necessity. Now let $\sum_k X_k$ converge a.s. Then, by (2), for sufficiently large $n$,
$$P\Big\{\sup_{k \ge 1} |S_{n+k} - S_n| \ge \varepsilon\Big\} < \tfrac{1}{2}. \tag{3}$$
By the second part of Kolmogorov's inequality,
$$P\Big\{\sup_{k \ge 1} |S_{n+k} - S_n| \ge \varepsilon\Big\} \ge 1 - \frac{(c + \varepsilon)^2}{\sum_{k=n+1}^{\infty} E X_k^2}.$$
Therefore, if we suppose that $\sum_{k=1}^{\infty} E X_k^2 = \infty$, we obtain
$$P\Big\{\sup_{k \ge 1} |S_{n+k} - S_n| \ge \varepsilon\Big\} = 1,$$
which contradicts (3). This completes the proof of the theorem. $\square$
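The converse direction can also be probed numerically. For the bounded (assumed, illustrative) example $X_n = \varepsilon_n / \sqrt{n}$, we have $\sum_n E X_n^2 = \sum_n 1/n = \infty$, so the theorem says $\sum_n X_n$ diverges a.s.; empirically, the variance of the partial sums keeps growing with $n$ instead of leveling off:

```python
import random
import statistics

def partial_sum(n_terms, seed, exponent):
    """S_n for X_k = eps_k / k**exponent with fair random signs eps_k = ±1."""
    rng = random.Random(seed)
    return sum(rng.choice((-1.0, 1.0)) / k**exponent
               for k in range(1, n_terms + 1))

# For exponent 1/2, Var S_n = sum_{k<=n} 1/k ~ log n, which is unbounded,
# consistent with a.s. divergence of sum eps_n / sqrt(n).
var_short = statistics.pvariance([partial_sum(100, s, 0.5) for s in range(200)])
var_long = statistics.pvariance([partial_sum(10_000, s, 0.5) for s in range(200)])
print(var_short, var_long)  # the second estimate is clearly larger
```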
Kolmogorov's Three-Series Theorem
The following theorem is Theorem 3, page 9 of [Shiryaev].
Let $c$ be a constant and
$$X^c = \begin{cases} X, & |X| \le c, \\ 0, & |X| > c. \end{cases}$$
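The truncation $X \mapsto X^c$ is simple to state in code; a minimal sketch (the function name `truncate` is my own):

```python
def truncate(x, c):
    """The truncated variable X^c: keep x when |x| <= c, else replace by 0."""
    return x if abs(x) <= c else 0.0

print(truncate(0.3, 1.0))   # -> 0.3  (|x| <= c, kept)
print(truncate(-2.5, 1.0))  # -> 0.0  (|x| > c, zeroed)
```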
Theorem (Kolmogorov's Three-Series Theorem). Let $X_1, X_2, \ldots$ be a sequence of independent random variables. A necessary and sufficient condition for the a.s. convergence of $\sum_n X_n$ is that the series
$$\sum_n E X_n^c, \qquad \sum_n V X_n^c, \qquad \sum_n P(|X_n| \ge c)$$
converge for some $c > 0$.
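A worked check of the criterion (the example $X_n = Z_n / n$ with $Z_n$ standard normal and $c = 1$ is my own, not from the text): by symmetry $E X_n^c = 0$, $V X_n^c \le E X_n^2 = 1/n^2$, and $P(|X_n| \ge 1) = P(|Z| \ge n) = \operatorname{erfc}(n/\sqrt{2})$, so all three series converge and the theorem gives a.s. convergence of $\sum_n X_n$:

```python
import math

def three_series_partials(mean_term, var_term, tail_term, n_terms):
    """Partial sums of Kolmogorov's three series up to n_terms."""
    return (sum(mean_term(n) for n in range(1, n_terms + 1)),
            sum(var_term(n) for n in range(1, n_terms + 1)),
            sum(tail_term(n) for n in range(1, n_terms + 1)))

# Assumed example X_n = Z_n / n, Z_n standard normal, truncation level c = 1:
#   E X_n^c = 0 by symmetry; V X_n^c is bounded above by 1/n^2;
#   P(|X_n| >= 1) = P(|Z| >= n) = erfc(n / sqrt(2)).
m, v, t = three_series_partials(lambda n: 0.0,
                                lambda n: 1.0 / n**2,
                                lambda n: math.erfc(n / math.sqrt(2)),
                                n_terms=10_000)
print(m, v, t)  # all three partial sums stay bounded
```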
Let's prove this theorem.
Proof of Sufficiency
Proof. Let $\mu_n = E X_n^c$. The convergence of $\sum_n V X_n^c$ together with the Kolmogorov–Khinchin theorem implies that $\sum_{n=1}^{\infty} (X_n^c - \mu_n)$ converges a.s. Since $\sum_n \mu_n$ converges, it follows that $\sum_n X_n^c$ converges a.s. Moreover, if $\sum_n P(|X_n| \ge c) < \infty$, then by the Borel–Cantelli lemma there is a.s. a number $M$ such that $|X_m| < c$ for all $m > M$. Hence $X_n = X_n^c$ for all $n > M$, and therefore $\sum_n X_n$ also converges a.s.
Proof of Necessity
If $\sum_n X_n$ converges a.s., then $X_n \to 0$ a.s., and therefore, for every $c > 0$, at most a finite number of the events $\{|X_n| \ge c\}$ can occur a.s. Hence $\sum_n I(|X_n| \ge c) < \infty$ a.s., and, by the second part of the Borel–Cantelli lemma, $\sum_n P(|X_n| \ge c) < \infty$. Moreover, the convergence of $\sum_n X_n$ implies the convergence of $\sum_n X_n^c$. To prove the convergence of both series $\sum_n E X_n^c$ and $\sum_n V X_n^c$ we use a symmetrization method. In addition to the sequence $X_1^c, X_2^c, \ldots$, consider an independent sequence $\hat{X}_1^c, \hat{X}_2^c, \ldots$ of independent random variables such that $\hat{X}_n^c$ has the same distribution as $X_n^c$, $n \ge 1$. Then if $\sum_n X_n^c$ converges a.s., the series $\sum_n \hat{X}_n^c$ also converges a.s., and hence so does $\sum_n (X_n^c - \hat{X}_n^c)$. But $E(X_n^c - \hat{X}_n^c) = 0$ and $P(|X_n^c - \hat{X}_n^c| \le 2c) = 1$. Therefore $\sum_n V(X_n^c - \hat{X}_n^c) < \infty$ by the Kolmogorov–Khinchin theorem. In addition,
$$\sum_n V X_n^c = \frac{1}{2} \sum_n V(X_n^c - \hat{X}_n^c) < \infty.$$
Consequently, by the Kolmogorov–Khinchin theorem, $\sum_n (X_n^c - E X_n^c)$ converges a.s., and therefore $\sum_n E X_n^c$ converges. Thus if $\sum_n X_n$ converges a.s. (so that $\sum_n X_n^c$ converges a.s., with $P(|X_n^c| \le c) = 1$, $n \ge 1$), both $\sum_n E X_n^c$ and $\sum_n V X_n^c$ converge. $\square$