Strong Law of Large Numbers

Truncation Lemma

Statement: Let $X_n$ be a sequence of IID random variables with finite mean, and let $Y_n = X_n I_{|X_n| \leq n}$. Then $\sum \frac{Var(Y_n)}{n^2} < \infty$.

Proof:

Lemma: If $c \geq 0$, then $c^2 \sum_{n=1}^\infty \frac{I_{c \leq n}}{n^2} \leq 1 + c$.

Proof: The lemma is clearly true for $c = 0$. Otherwise, let $d = \lceil c \rceil$. Then $d \geq 1$ and so

$$
\begin{eqnarray}
\sum_{n=1}^\infty \frac{I_{c \leq n}}{n^2}
& = & \sum_{n=1}^\infty \frac{I_{d \leq n}}{n^2} \\
& = & \sum_{n=d}^\infty \frac{1}{n^2} \\
& \leq & \frac{1}{d^2} + \int_{d}^\infty \frac{1}{x^2} dx \\
& = & \frac{1}{d^2} + \frac{1}{d}.
\end{eqnarray}
$$

Multiplying through by $c^2$ and noting that $\frac{c}{d} \leq 1$ gives us $c^2\left(\frac{1}{d^2} + \frac{1}{d}\right) = \left(\frac{c}{d}\right)^2 + \frac{c}{d} \cdot c \leq 1 + c$, which is the result. $\blacksquare$
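Since the lemma is a concrete inequality, it can be spot-checked numerically. The sketch below is illustrative only; it truncates the infinite tail sum at a finite number of terms, which only shrinks the left-hand side, so a passing check is consistent with (though of course not a proof of) the bound.

```python
import math

def lemma_bound_holds(c, terms=200_000):
    """Numerically check c^2 * sum_{n=1}^inf I_{c <= n} / n^2 <= 1 + c."""
    # the indicator I_{c <= n} is 1 exactly when n >= ceil(c)
    d = max(1, math.ceil(c))
    # truncated tail sum sum_{n=d}^{d+terms-1} 1/n^2 (underestimates the true sum)
    tail = sum(1.0 / n**2 for n in range(d, d + terms))
    return c**2 * tail <= 1 + c

# spot-check the bound across small and large c
assert all(lemma_bound_holds(c) for c in [0.0, 0.5, 1.0, 2.7, 10.0, 123.4])
```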

Thus we have

$$
\begin{eqnarray}
\sum \frac{Var(Y_n)}{n^2}
& \leq & \sum \frac{E(Y_n^2)}{n^2} \\
& = & \sum \frac{E(X_n^2 I_{|X_n| \leq n})}{n^2} \\
& = & \sum \frac{E(X_1^2 I_{|X_1| \leq n})}{n^2} \\
& = & \sum E\left(\frac{X_1^2 I_{|X_1| \leq n}}{n^2}\right) \\
& = & E\left(\sum \frac{X_1^2 I_{|X_1| \leq n}}{n^2}\right) \tag{1}\\
& = & E\left(|X_1|^2 \sum \frac{I_{|X_1| \leq n}}{n^2}\right) \\
& \leq & E( 1 + |X_1| ) < \infty \tag{2}
\end{eqnarray}
$$

where (1) follows by the Monotone Convergence Theorem and (2) follows by the above Lemma. $\blacksquare$
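The point of the theorem is that the truncated variances are summable even when the $X_n$'s themselves have infinite variance. As a hedged illustration (my choice of distribution, not from the original): for $X \sim$ Pareto with shape $\alpha = 1.5$ and scale $1$, the mean is $\alpha/(\alpha-1) = 3$ but the variance is infinite, and a direct computation gives $E[X^2 I_{X \leq n}] = \int_1^n x^2 \cdot 1.5\, x^{-2.5}\, dx = 3(\sqrt{n} - 1)$. The partial sums of $E[Y_n^2]/n^2$ should then stay below the theorem's bound $E[1 + |X_1|] = 4$:

```python
import math

def truncated_second_moment(n, alpha=1.5):
    # closed form of E[X^2 I_{X <= n}] for Pareto(shape=1.5, scale=1)
    return 3.0 * (math.sqrt(n) - 1.0)

# partial sums of E[Y_n^2]/n^2, which dominate Var(Y_n)/n^2
partial = 0.0
for n in range(1, 100_001):
    partial += truncated_second_moment(n) / n**2

# bounded by E[1 + |X_1|] = 1 + 3 = 4, as the proof predicts
assert 0.0 < partial <= 4.0
```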

Kolmogorov’s Strong Law

Statement: Suppose $X_i$ are independent with finite mean $\mu$ and $\sum \frac{Var(X_n)}{n^2} < \infty$. Define $S_n = \frac{X_1 + \ldots + X_n}{n}$. Then $S_n \to \mu$ almost surely.

Proof: We may assume WLOG that $\mu = 0$ by subtracting it from the $X_i$’s. We must thus show $S_n \to 0$ almost surely.

Lemma (Kronecker): If $a_1, a_2, \ldots$ is a sequence of real numbers such that $\sum_{k=1}^\infty a_k/k$ converges, then $\frac{a_1 + \ldots + a_n}{n} \to 0$. (We allow negative $a_i$'s here, since the lemma will be applied to the $X_i$'s, which may take negative values.)

Proof: Let $b_0 = 0$ and $b_n = \sum_{k=1}^n a_k/k$, so that $b_n \to b$ for some finite $b$ and $a_k = k(b_k - b_{k-1})$. Summing by parts,

$$
\begin{eqnarray}
\frac{a_1 + \ldots + a_n}{n}
&=& \frac{1}{n} \sum_{k=1}^n k (b_k - b_{k-1}) \\
&=& \frac{1}{n} \left( n b_n - \sum_{k=1}^{n-1} b_k \right) \tag{1} \\
&=& b_n - \frac{1}{n} \sum_{k=1}^{n-1} b_k
\end{eqnarray}
$$

where (1) follows by collecting the coefficient of each $b_k$ (the $b_0$ term vanishes). The first term converges to $b$, and the second term is a Cesàro average of a sequence converging to $b$, hence it also converges to $b$. Their difference therefore converges to $0$. $\blacksquare$
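The lemma's conclusion can be checked numerically for a concrete sequence (my choice, for illustration): with $a_k = 1/\sqrt{k}$, the series $\sum a_k/k = \sum k^{-3/2}$ converges, and the running average $\frac{a_1 + \ldots + a_n}{n}$ is roughly $2/\sqrt{n}$, so it should shrink visibly as $n$ grows.

```python
import math

def running_average(n):
    # (a_1 + ... + a_n) / n for a_k = 1/sqrt(k)
    return sum(1.0 / math.sqrt(k) for k in range(1, n + 1)) / n

avgs = [running_average(n) for n in (10, 1_000, 100_000)]
assert avgs[0] > avgs[1] > avgs[2]   # the averages shrink as n grows
assert avgs[2] < 0.01                # already close to 0 at n = 100000
```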

Now, let $B_n = \sum_{i=1}^n \frac{X_i}{i}$. Then $Var(B_n) = \sum_{i=1}^n \frac{Var(X_i)}{i^2}$, which is bounded by assumption. Since the summands $\frac{X_i}{i}$ are independent with mean $0$ and summable variances, Kolmogorov’s Two-Series Theorem tells us $B_n$ converges almost surely. But on the event that $B_n$ converges, our lemma tells us that $S_n$ converges to zero. $\blacksquare$
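Kolmogorov's strong law applies even when the $X_i$ are not identically distributed, as long as the variance series converges. A minimal Monte Carlo sketch (the distribution and growth rate are my choices for illustration): take independent $X_i \sim N(\mu, i^{1/4})$, so $Var(X_i) = \sqrt{i}$ grows without bound, yet $\sum Var(X_i)/i^2 = \sum i^{-3/2} < \infty$ and the theorem still guarantees $S_n \to \mu$.

```python
import random

random.seed(0)

# Independent X_i ~ Normal(mu, sigma_i) with sigma_i = i**0.25, so
# Var(X_i) = i**0.5 is unbounded but sum Var(X_i)/i^2 = sum i^{-1.5} < inf.
mu, n = 3.0, 200_000
total = 0.0
for i in range(1, n + 1):
    total += random.gauss(mu, i ** 0.25)

# the running average should be near mu (generous tolerance for randomness)
assert abs(total / n - mu) < 0.3
```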

 

Strong Law of Large Numbers

Statement: Suppose $X_i$ are IID with finite mean $\mu$. Then $S_n = \frac{X_1 + \ldots + X_n}{n}$ converges to $\mu$ almost surely.

Proof: Let $Y_i$ be the truncation of $X_i$ as in the first theorem. Define $Y_n^* = Y_n - E[Y_n]$, $T_n = \frac{Y_1 + \ldots + Y_n}{n}$, and $T_n^* = \frac{Y_1^* + \ldots + Y_n^*}{n}$. Since $Var(Y_n^*) = Var(Y_n)$, the truncation lemma tells us that $\sum \frac{Var(Y_i^*)}{i^2}$ is finite, and since the $Y_i^*$’s all have mean $0$, Kolmogorov’s strong law tells us $T_n^*$ converges almost surely to $0$. By dominated convergence (with dominating function $|X_1|$), $E[Y_n] = E[X_1 I_{|X_1| \leq n}] \to E[X_1] = \mu$, so the Cesàro averages $\frac{E[Y_1] + \ldots + E[Y_n]}{n}$ also converge to $\mu$. Since $T_n = T_n^* + \frac{E[Y_1] + \ldots + E[Y_n]}{n}$, $T_n$ converges almost surely to $\mu$.

Now, $P(Y_n \neq X_n) = P(|X_n| > n) \leq P(|X_n| \geq n) = P(|X_1| \geq n)$. But $\sum_n P(|X_1| \geq n) \leq E[|X_1|] < \infty$. Thus by Borel-Cantelli, almost surely $X_n = Y_n$ for all sufficiently large $n$. On that event, $\sum_{i=1}^n (X_i - Y_i)$ is eventually constant in $n$, so $S_n - T_n = \frac{1}{n}\sum_{i=1}^n (X_i - Y_i) \overset{a.s.}{\to} 0$. Hence $S_n \overset{a.s.}{\to} \mu$. $\blacksquare$
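The payoff of all this truncation work is that the strong law needs only a finite mean. A hedged simulation sketch (distribution and tolerance are my choices): IID Pareto samples with shape $\alpha = 1.5$ have mean $\alpha/(\alpha - 1) = 3$ but infinite variance, yet the sample mean should still settle near $3$, if slowly (fluctuations decay only like roughly $n^{-1/3}$ here).

```python
import random

random.seed(0)

# IID Pareto(shape=1.5, scale=1): finite mean 3, infinite variance.
alpha, n = 1.5, 1_000_000
total = 0.0
for _ in range(n):
    u = random.random()                     # u in [0, 1), so 1 - u in (0, 1]
    total += (1.0 - u) ** (-1.0 / alpha)    # inverse-CDF sampling of Pareto

# generous tolerance: heavy tails make convergence slow
assert abs(total / n - 3.0) < 1.0
```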
