Kolmogorov’s Inequality

Statement: Let $X_i$ be independent random variables with mean 0, finite variance and $S_n = X_1 + \ldots + X_n$. Then $P\left( \displaystyle \max_{1 \leq i \leq n} |S_i| \geq a \right) \leq \frac{1}{a^2} Var(S_n)$.

Note that Chebyshev’s inequality already gives us $P\left( |S_n| \geq a \right) \leq \frac{1}{a^2} Var(S_n)$, so this is quite a strengthening.
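The inequality is easy to probe numerically. Here is a small Monte Carlo sketch using a hypothetical choice of $X_i$ (fair $\pm 1$ coin flips, so each $X_i$ has mean 0 and variance 1, and $Var(S_n) = n$); the parameters are illustrative, not canonical.

```python
import random

def kolmogorov_check(n=50, a=10.0, trials=20000, seed=0):
    """Monte Carlo check of Kolmogorov's inequality.

    Assumed toy model: X_i = +/-1 fair coin flips, so Var(S_n) = n and
    the bound is n / a^2. Returns estimates of
    P(max_{1<=i<=n} |S_i| >= a) and P(|S_n| >= a), plus the bound.
    """
    rng = random.Random(seed)
    max_hits = end_hits = 0
    for _ in range(trials):
        s = 0
        running_max = 0
        for _ in range(n):
            s += rng.choice((-1, 1))
            running_max = max(running_max, abs(s))
        max_hits += running_max >= a
        end_hits += abs(s) >= a
    return max_hits / trials, end_hits / trials, n / a ** 2

p_max, p_end, bound = kolmogorov_check()
# p_end <= p_max <= bound: the same bound Var(S_n)/a^2 = 0.5 controls the
# maximum over the whole path, not just the endpoint, as the theorem asserts.
```

Note that $p_{\text{end}} \leq p_{\text{max}}$ holds deterministically, since the running maximum dominates $|S_n|$; the content of the theorem is that the Chebyshev-style bound still applies to the larger quantity.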

Proof:

Let $A_i$ be the event that $|S_i|$ is the first of $|S_1|, \ldots, |S_n|$ to reach $a$, i.e. $A_i = \{ |S_i| \geq a \text{ and } |S_j| < a \text{ for all } j < i \}$. Clearly these events are disjoint, and $P\left( \displaystyle \max_{1 \leq i \leq n} |S_i| \geq a \right) = \sum P(A_i)$.

On $A_i$ we have $S_i^2 \geq a^2$, so $E[I_{A_i} S_i^2] \geq a^2 P(A_i)$. Summing over $i$ and using linearity of expectation, it suffices to prove that $E[S_n^2] \geq E[\sum I_{A_i}S_i^2]$.

First, $S_n^2 \geq \sum I_{A_i} S_n^2$, since the $A_i$ are disjoint and hence $0 \leq \sum I_{A_i} \leq 1$. Second,

$$\begin{eqnarray}
E[I_{A_i} S_n^2] & = & E[I_{A_i} (S_i + (S_n - S_i))^2 ] \\
& = & E[I_{A_i} S_i^2] + 2E[I_{A_i} S_i (S_n - S_i)] + E[I_{A_i} (S_n - S_i)^2] \\
& = & E[I_{A_i} S_i^2] + 0 + E[I_{A_i} (S_n - S_i)^2] \\
& \geq & E[I_{A_i} S_i^2]
\end{eqnarray}$$

where the third equality holds because $I_{A_i} S_i$ is a function of $X_1, \ldots, X_i$ alone, hence independent of $S_n - S_i$, which has mean 0. So $E[S_n^2] \geq E[\sum I_{A_i} S_n^2] \geq E[\sum I_{A_i} S_i^2] \geq a^2 \sum P(A_i)$, and the theorem follows.
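The key step, the vanishing cross term $E[I_{A_i} S_i (S_n - S_i)]$, can also be checked by simulation. A sketch under the same assumed toy model ($\pm 1$ fair coin flips; the parameters $n$, $i$, $a$ are arbitrary illustrative choices):

```python
import random

def cross_term_estimate(n=10, i=4, a=2, trials=100000, seed=0):
    """Estimate E[I_{A_i} S_i (S_n - S_i)] for +/-1 coin flips (assumed toy case).

    A_i is the event that |S_j| first reaches a at step j = i.
    Since I_{A_i} S_i depends only on X_1..X_i, it is independent of
    S_n - S_i, which has mean 0, so the expectation should be 0.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.choice((-1, 1)) for _ in range(n)]
        partial = 0
        first_hit = None
        for j, x in enumerate(xs, start=1):
            partial += x
            if first_hit is None and abs(partial) >= a:
                first_hit = j
        if first_hit == i:  # the event A_i occurred on this path
            s_i = sum(xs[:i])
            total += s_i * (sum(xs) - s_i)
    return total / trials

est = cross_term_estimate()
# est should hover near 0, within Monte Carlo error.
```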

Kolmogorov’s Two-Series Theorem

Statement: Suppose the $X_i$ are independent, with $X_i$ having mean $\mu_i$ and variance $\sigma_i^2$, such that $\sum \mu_i$ and $\sum \sigma_i^2$ converge. Then $\sum X_i$ converges almost surely.

Proof: This follows quickly from Kolmogorov’s inequality. We may assume WLOG that $\mu_i = 0$ by replacing $X_i$ with $X_i - \mu_i$: since $\sum \mu_i$ converges, $\sum X_i$ converges iff $\sum (X_i - \mu_i)$ does.

$S_n = \displaystyle \sum_{i=1}^n X_i$ converges iff it is a Cauchy sequence, i.e. iff for each $\epsilon > 0$, $|S_m - S_n| < \epsilon$ whenever $n, m$ are sufficiently large. This is equivalent to saying $\sup_{m > n} |S_m - S_n| \leq \epsilon$ when $n > N$ (for some sufficiently large $N$). But $S_m - S_n = X_{n+1} + \ldots + X_m$, so by Kolmogorov’s inequality,

$$\begin{eqnarray}
P\left( \sup_{m > n} |S_m - S_n| \geq \epsilon \right) & = & \lim_{m \to \infty} P\left( \max_{n < i \leq m} |S_i - S_n| \geq \epsilon \right) \\
& \leq & \lim_{m \to \infty} \frac{1}{\epsilon^2} Var(X_{n+1} + \ldots + X_m) \\
& = & \frac{1}{\epsilon^2} \sum_{i=n+1}^\infty \sigma_i^2.
\end{eqnarray}$$

By assumption, this last sum is the tail of a convergent series, hence arbitrarily close to zero when $n$ is sufficiently large. So letting $n \to \infty$ on both sides, we see that

$$
P\left( \lim_{n\to \infty} \sup_{m > n} |S_m - S_n| \geq \epsilon \right) = 0.
$$

Since this holds for every $\epsilon > 0$ (take $\epsilon = 1/k$ over all integers $k$), $S_n$ is almost surely Cauchy, so $S_n$ converges almost surely.
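To see the theorem in action, here is a sketch with a hypothetical choice of summands: $X_i$ uniform on $[-1/i, 1/i]$, so $\mu_i = 0$ and $\sigma_i^2 = 1/(3i^2)$, and both series converge. One sample path of the partial sums should stabilize.

```python
import random

def two_series_demo(checkpoints=(1000, 10000, 100000), seed=1):
    """One sample path of S_n = sum X_i with X_i uniform on [-1/i, 1/i].

    Assumed example: each X_i has mean 0 and variance 1/(3 i^2), so both
    series in the two-series theorem converge and S_n converges a.s.
    Returns the partial sums S_n at the requested checkpoints.
    """
    rng = random.Random(seed)
    s = 0.0
    out = {}
    for i in range(1, max(checkpoints) + 1):
        s += rng.uniform(-1.0 / i, 1.0 / i)
        if i in checkpoints:
            out[i] = s
    return out

vals = two_series_demo()
# Successive checkpoint values differ by roughly the tail standard
# deviation sqrt(sum_{i>n} 1/(3 i^2)) ~ sqrt(1/(3n)), which shrinks to 0.
```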
