This article gives a systematic description of the inequalities and bounds used in probability and statistics.
Kolmogorov’s maximal inequality
Kolmogorov's inequality is one of the fundamental inequalities of probability theory: it gives an upper bound for the probability that the maximum of the partial sums of independent random variables exceeds a given level.
Theorem (Kolmogorov’s maximal inequality)
A. Let $\xi_1,\xi_2,\ldots,\xi_n$ be independent random variables with $E\xi_i = 0$, $E\xi_i^2 < \infty$, $i \le n$. If $S_n = \xi_1 + \cdots + \xi_n$, then

$$P\Big(\max_{1\le k\le n}|S_k| \ge \varepsilon\Big) \le \frac{E S_n^2}{\varepsilon^2}. \qquad (1)$$

B. If also $P(|\xi_i| \le c) = 1$, $i \le n$, then

$$P\Big(\max_{1\le k\le n}|S_k| \ge \varepsilon\Big) \ge 1 - \frac{(c+\varepsilon)^2}{E S_n^2}. \qquad (2)$$

Proof. A. We put

$$A = \Big\{\max_{1\le k\le n}|S_k| \ge \varepsilon\Big\}, \qquad A_k = \{|S_i| < \varepsilon,\ i = 1,\ldots,k-1,\ |S_k| \ge \varepsilon\}, \quad 1 \le k \le n,$$

i.e., we break things down according to the first time that $|S_k|$ reaches $\varepsilon$. Then $A_k \cap A_j = \varnothing$ for $j \ne k$, $A = \bigcup_{k=1}^n A_k$, and

$$E S_n^2 \ge E S_n^2 I_A = \sum_{k=1}^n E S_n^2 I_{A_k}.$$

But

$$E S_n^2 I_{A_k} = E\big(S_k + (\xi_{k+1} + \cdots + \xi_n)\big)^2 I_{A_k} = E S_k^2 I_{A_k} + 2E S_k(\xi_{k+1} + \cdots + \xi_n) I_{A_k} + E(\xi_{k+1} + \cdots + \xi_n)^2 I_{A_k} \ge E S_k^2 I_{A_k},$$

since

$$E S_k(\xi_{k+1} + \cdots + \xi_n) I_{A_k} = E S_k I_{A_k} \cdot E(\xi_{k+1} + \cdots + \xi_n) = 0$$

because of independence and the conditions $E\xi_i = 0$, $i \le n$. Hence

$$E S_n^2 \ge \sum_{k=1}^n E S_k^2 I_{A_k} \ge \varepsilon^2 \sum_{k=1}^n P(A_k) = \varepsilon^2 P\Big(\max_{1\le k\le n}|S_k| \ge \varepsilon\Big),$$

which proves inequality (1).
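Both bounds can be illustrated numerically. The sketch below is an illustration only, assuming symmetric Bernoulli summands $\xi_i = \pm 1$ with equal probability, so that $E\xi_i = 0$, $E\xi_i^2 = 1$, $|\xi_i| \le c = 1$, and $E S_n^2 = n$; the function and parameter names are ours, not from the text.

```python
import random

random.seed(0)

def max_partial_sum_hits(n_steps, eps, trials):
    """Monte Carlo estimate of P(max_{1<=k<=n} |S_k| >= eps),
    where S_k is a sum of independent Rademacher (+/-1) steps."""
    hits = 0
    for _ in range(trials):
        s = 0
        for _ in range(n_steps):
            s += random.choice((-1, 1))
            if abs(s) >= eps:   # first time the walk reaches level eps
                hits += 1
                break
    return hits / trials

n, eps, c = 100, 5, 1                 # E xi_i = 0, E xi_i^2 = 1, |xi_i| <= c = 1
var_sn = n                            # E S_n^2 = sum of the n variances
upper = var_sn / eps**2               # upper bound (1)
lower = 1 - (c + eps)**2 / var_sn     # lower bound (2), valid since the steps are bounded

p_hat = max_partial_sum_hits(n, eps, trials=20000)
print(f"estimate {p_hat:.3f}, bounds [{lower:.3f}, {upper:.3f}]")
```

For these parameters the lower bound is $1 - 36/100 = 0.64$, while the upper bound $100/25 = 4$ is trivial; the two bounds are informative in different regimes of $\varepsilon$.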
B. To prove (2), we observe that

$$E S_n^2 I_A = E S_n^2 - E S_n^2 I_{\bar A} \ge E S_n^2 - \varepsilon^2 P(\bar A) = E S_n^2 - \varepsilon^2 + \varepsilon^2 P(A). \qquad (3)$$

On the other hand, on the set $A_k$,

$$|S_{k-1}| \le \varepsilon, \qquad |S_k| \le |S_{k-1}| + |\xi_k| \le \varepsilon + c,$$

and therefore

$$E S_n^2 I_A = \sum_{k=1}^n E S_k^2 I_{A_k} + \sum_{k=1}^n E\big(I_{A_k}(S_n - S_k)^2\big) \le (\varepsilon + c)^2 \sum_{k=1}^n P(A_k) + \sum_{k=1}^n P(A_k) \sum_{j=k+1}^n E\xi_j^2 \le P(A)\Big[(\varepsilon + c)^2 + \sum_{j=1}^n E\xi_j^2\Big] = P(A)\big[(\varepsilon + c)^2 + E S_n^2\big]. \qquad (4)$$

Subtracting $\varepsilon^2 P(A)$ from both sides of (4), we get

$$E S_n^2 I_A - \varepsilon^2 P(A) \le P(A)\big[(\varepsilon + c)^2 + E S_n^2 - \varepsilon^2\big]. \qquad (5)$$

Finally, using (3) and (5) we obtain

$$P(A) \ge \frac{E S_n^2 - \varepsilon^2}{(\varepsilon + c)^2 + E S_n^2 - \varepsilon^2} = 1 - \frac{(\varepsilon + c)^2}{(\varepsilon + c)^2 + E S_n^2 - \varepsilon^2} \ge 1 - \frac{(\varepsilon + c)^2}{E S_n^2}.$$

This completes the proof of (2). $\Box$

References
- Shiryaev, A. N., *Probability*. Springer-Verlag, Berlin Heidelberg, 1996 (English translation).