Kolmogorov's Three-Series Theorem

Kolmogorov's Three-Series Theorem is a fundamental result in probability theory that gives a criterion, in terms of three auxiliary series, for the almost sure convergence of an infinite series of independent random variables.
Suppose that $X_1, X_2, \ldots$ is a sequence of independent random variables, let $S_n = X_1 + \ldots + X_n$, and let $A$ be the set of sample points $\omega$ for which $\sum_{i > 0} X_i(\omega)$ converges to a finite limit. It follows from Kolmogorov's zero-one law that $P(A) = 0$ or $P(A) = 1$, i.e. the series $\sum_{i > 0} X_i(\omega)$ either converges almost surely (a.s.) or diverges a.s. The aim of this article is to give criteria that determine whether a sum of independent random variables converges or diverges.
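As an illustration (a standard example, not taken from [Shiryaev]), consider the random harmonic series $\sum_{i > 0} \varepsilon_i / i$, where the $\varepsilon_i$ are independent with $P(\varepsilon_i = 1) = P(\varepsilon_i = -1) = 1/2$. For a fixed sample point the convergence of this series is not obvious, and the zero-one law only says that the probability of convergence is $0$ or $1$; the criteria below, applied with $E[\varepsilon_i / i] = 0$ and $\sum_{i > 0} E[(\varepsilon_i / i)^2] = \sum_{i > 0} 1/i^2 < \infty$, show that it converges a.s.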

Theorem on the convergence of series of independent centered random variables

The following theorem is Theorem 1, page 6 in [Shiryaev].
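In its standard formulation (the precise wording in [Shiryaev] may differ slightly), the result reads as follows. Let $X_1, X_2, \ldots$ be independent random variables with $E X_n = 0$ for every $n$. If

$$\sum_{n > 0} E X_n^2 < \infty,$$

then the series $\sum_{n > 0} X_n$ converges almost surely. In particular, this settles the random harmonic series mentioned above.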

Kolmogorov's Three-Series Theorem

The following theorem is Theorem 3, page 9 in [Shiryaev].
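A standard statement of the three-series theorem (again, the wording in [Shiryaev] may differ slightly) is the following. Let $X_1, X_2, \ldots$ be independent random variables and, for a truncation level $c > 0$, set $X_n^c = X_n \mathbf{1}_{\{|X_n| \le c\}}$. For the series $\sum_{n > 0} X_n$ to converge almost surely it is sufficient that, for some $c > 0$, the three series

$$\sum_{n > 0} P(|X_n| > c), \qquad \sum_{n > 0} E X_n^c, \qquad \sum_{n > 0} \operatorname{Var} X_n^c$$

all converge; and it is necessary that they converge for every $c > 0$. For bounded centered variables such as $X_n = \varepsilon_n / n$ with $c = 1$, the first two series vanish term by term and the third reduces to the variance condition of the previous theorem.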

References