Portmanteau Theorem

Statement: $X_n \overset{D}{\to} X$ iff $E[g(X_n)] \to E[g(X)]$ for every bounded, Lipschitz-continuous g. We prove this in two parts. (In fact, Part 1 proves the forward direction for all bounded continuous g, which is a stronger conclusion; Part 2 needs only the bounded Lipschitz case as a hypothesis, so together the two parts give the equivalence.)
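
Before the proof, a quick numerical sanity check of the statement (not part of the proof). This is a minimal sketch assuming a specific example: $X_n$ a standardized mean of n uniforms, so $X_n \overset{D}{\to} N(0,1)$ by the CLT, and the bounded Lipschitz test function $g(x) = \min(1, |x|)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # bounded by 1 and Lipschitz with constant 1
    return np.minimum(1.0, np.abs(x))

def sample_X_n(n, reps=100_000):
    # standardized mean of n Uniform(0,1) draws; converges in
    # distribution to N(0,1) by the CLT
    u = rng.random((reps, n))
    return (u.mean(axis=1) - 0.5) * np.sqrt(12 * n)

x_limit = rng.standard_normal(100_000)  # samples from the limit X ~ N(0,1)
print("E[g(X)]    ~", g(x_limit).mean())
for n in (1, 5, 25, 100):
    print(f"E[g(X_{n})] ~", g(sample_X_n(n)).mean())
```

The Monte Carlo estimates of $E[g(X_n)]$ should drift toward the $N(0,1)$ value as n grows.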

Part 1

Here we show $X_n \overset{D}{\to} X$ implies $E[g(X_n)] \to E[g(X)]$ for any continuous bounded g. Let $F_n$ denote the cdf for $X_n$, and likewise let $F$ be the cdf for $X$.

Pick $\epsilon > 0$, and let g be continuous with $|g| \leq M$. Since $X_n \overset{D}{\to} X$, the sequence is uniformly tight ($X_n = O_P(1)$): there is some $R > 0$ such that $P(|X_n| > R) < \epsilon$ for every n, and also $P(|X| > R) < \epsilon$. Thus $\displaystyle \left| \int_{|x| > R} g(x)\, dF_n(x) \right| \leq M\, P(|X_n| > R) < \epsilon M$, and likewise for X. Since these tail contributions can be made arbitrarily small, we need only look at convergence on the interval $[-R, R]$.
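
To see why one R works for every n (a sketch of the usual tightness argument; the particular choice of R here is my own, not from the original): choose $R$ so that $\pm R$ are continuity points of F and $P(|X| > R) < \epsilon$. Then

$$
P(|X_n| > R) \leq 1 - \bigl(F_n(R) - F_n(-R)\bigr) \longrightarrow 1 - \bigl(F(R) - F(-R)\bigr) = P(|X| > R) < \epsilon
$$

as $n \to \infty$, so $P(|X_n| > R) < \epsilon$ for all sufficiently large n, and enlarging R takes care of the finitely many remaining terms.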

Let’s approximate g on $[-R, R]$. Since g is continuous, it’s uniformly continuous on this compact interval, so we can pick $N+1$ points $t_i$ with $-R = t_0 < t_1 < \ldots < t_N = R$ such that $x, y \in [t_k, t_{k+1}]$ implies $|g(x) - g(y)| < \epsilon$. Furthermore, since F has at most countably many discontinuities, we can choose the points so that F is continuous at each $t_i$. Finally, define a step-function approximation h to g by $h(x) = g(t_k)$ for $x \in [t_k, t_{k+1})$, with $h(t_N) = g(t_N)$ and $h(x) = 0$ everywhere else.
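
To make the construction concrete, here is a small numerical sketch; the choices $g = \cos$, $R = 3$, $\epsilon = 0.05$, and the uniform grid refinement are arbitrary stand-ins, not part of the proof.

```python
import numpy as np

R, eps = 3.0, 0.05
g = np.cos  # stand-in for a bounded continuous g

# Refine a uniform grid until the oscillation of g on every cell
# [t_k, t_{k+1}] (estimated on a dense sub-grid) is below eps;
# this is possible by uniform continuity of g on [-R, R].
N = 1
while True:
    t = np.linspace(-R, R, N + 1)
    osc = max(np.ptp(g(np.linspace(t[k], t[k + 1], 50))) for k in range(N))
    if osc < eps:
        break
    N *= 2

def h(x):
    """Step function: g(t_k) on [t_k, t_{k+1}), g(t_N) at t_N, zero outside [-R, R]."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    inside = (x >= -R) & (x < R)
    k = np.searchsorted(t, x[inside], side="right") - 1
    out[inside] = g(t[k])
    out[x == R] = g(R)
    return out

xs = np.linspace(-R, R, 10_000)
print("N =", N, " sup |g - h| on [-R, R] ~", np.abs(g(xs) - h(xs)).max())
```

The printed supremum comes out below $\epsilon$, which is exactly the bound Lemma 1 relies on.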

We need two lemmas:

Lemma 1: $|E[g(Y)] - E[h(Y)]| < \epsilon M + \epsilon$ for every $Y \in \{X, X_1, X_2, \ldots\}$.

Proof:

$$
\begin{eqnarray}
|E[g(Y)] - E[h(Y)]| &=& |E[g(Y)1_{|Y|>R}] + (E[g(Y)1_{|Y|\leq R}] - E[h(Y)])| \\
&\leq& |E[g(Y)1_{|Y|>R}]| + \left| E[(g(Y) - h(Y))1_{|Y|\leq R}] \right| \\
&<& \epsilon M + \epsilon P(|Y| \leq R)\tag{1}\\
&\leq& \epsilon M + \epsilon
\end{eqnarray}
$$

where (1) bounds the first term by $|E[g(Y)1_{|Y|>R}]| \leq M\, P(|Y| > R) < \epsilon M$, and the second term using $|g(x) - h(x)| < \epsilon$ for $x \in [-R, R]$, which holds by construction of h. (We also used $h = 0$ outside $[-R, R]$ to write $E[h(Y)] = E[h(Y)1_{|Y|\leq R}]$.) $\blacksquare$

Lemma 2: $E[h(X_n)] \to E[h(X)]$.

Proof: It’s really easy to integrate h, since it’s a step function. We have:

$$
\begin{eqnarray}
|E[h(X_n)] - E[h(X)]| &=& \left| \int_{[-R, R]} h(x)\, dF_n - \int_{[-R, R]} h(x)\, dF \right| \\
&=& \left| \left( \sum_{k=0}^{N-1} g(t_k)(F_n(t_{k+1}) - F_n(t_k)) \right) - \left( \sum_{k=0}^{N-1} g(t_k)(F(t_{k+1}) - F(t_k)) \right)\right| \\
&\leq& \sum_{k=0}^{N-1} |g(t_k)| \left( |F_n(t_{k+1}) - F(t_{k+1})| + |F_n(t_k) - F(t_k)| \right).
\end{eqnarray}
$$

Since there are only finitely many $t_i$’s, each of them a continuity point of F, and since $F_n \to F$ at every continuity point, this last quantity tends to 0 as $n \to \infty$. $\blacksquare$

Finally, note that

$$
|E[g(X_n)] - E[g(X)]| \leq |E[g(X_n)] - E[h(X_n)]| + |E[h(X_n)] - E[h(X)]| \\ + |E[h(X)] - E[g(X)]|
$$

and our two lemmas handle the three terms on the right: Lemma 1 bounds the first and third by $\epsilon M + \epsilon$, uniformly in n, while Lemma 2 sends the middle term to 0 as $n \to \infty$. Since $\epsilon$ was arbitrary, $E[g(X_n)] \to E[g(X)]$. $\blacksquare$
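
In other words, for any fixed $\epsilon$ the bookkeeping gives

$$
\limsup_{n\to\infty} |E[g(X_n)] - E[g(X)]| \leq (\epsilon M + \epsilon) + 0 + (\epsilon M + \epsilon) = 2\epsilon(M + 1),
$$

and letting $\epsilon \to 0$ finishes Part 1.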

Part 2

Let $F_n$ and $F$ be the cdf’s as above. Suppose $E[g(X_n)] \to E[g(X)]$ for every bounded, Lipschitz g. We must show that $F_n(x) \to F(x)$ at every continuity point x of F, so let x be such a point.

Pick $\epsilon > 0$. Let’s define two Lipschitz continuous functions $a$ and $b$ as follows (a short numerical sketch of them appears just after the list):

  • Let $a(y) = 1$ for $y \leq x - \epsilon$, $a(y) = 0$ for $y \geq x$, and $a$ is linear on $[x - \epsilon, x]$.
  • Let $b(y) = 1$ for $y \leq x$, $b(y) = 0$ for $y \geq x + \epsilon$, and $b$ is linear on $[x, x + \epsilon]$.
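
Here is a minimal numerical sketch of these two functions and the sandwich they produce; the point $x = 0.3$, the value $\epsilon = 0.25$, and the standard normal stand-in for $X_n$ are arbitrary choices for illustration (both functions have Lipschitz constant $1/\epsilon$, so they qualify as test functions).

```python
import numpy as np

rng = np.random.default_rng(1)
x, eps = 0.3, 0.25  # arbitrary evaluation point and epsilon

def a(y):
    # 1 on (-inf, x - eps], 0 on [x, inf), linear in between
    return np.clip((x - y) / eps, 0.0, 1.0)

def b(y):
    # 1 on (-inf, x], 0 on [x + eps, inf), linear in between
    return np.clip((x + eps - y) / eps, 0.0, 1.0)

# Since a <= 1_{(-inf, x]} <= b pointwise, integrating against the law of
# any random variable sandwiches its cdf at x between E[a] and E[b].
samples = rng.standard_normal(200_000)  # stand-in for X_n
print(a(samples).mean(), "<=", (samples <= x).mean(), "<=", b(samples).mean())
```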

So we have

$$
\int a\, dF_n \leq \int 1_{(-\infty, x]}\, dF_n = F_n(x) \leq \int b\, dF_n.
$$

Letting $n \to \infty$ in the right-hand inequality, the hypothesis applies to $b$ (it is bounded and Lipschitz), so $\int b\, dF_n = E[b(X_n)] \to E[b(X)] = \int b\, dF$; and since $b \leq 1_{(-\infty, x+\epsilon]}$, we have $\int b\, dF \leq F(x + \epsilon)$. Likewise, the left-hand inequality, the hypothesis applied to $a$, and $a \geq 1_{(-\infty, x-\epsilon]}$ give $\int a\, dF \geq F(x - \epsilon)$. Combining,

$$
F(x - \epsilon) \leq \liminf_{n\to\infty} F_n(x) \leq \limsup_{n\to\infty} F_n(x) \leq F(x + \epsilon).
$$

Since F is continuous at x, letting $\epsilon \to 0$ squeezes both ends to $F(x)$. Hence $F_n(x) \to F(x)$. $\blacksquare$

Notes: Why is this called the Portmanteau Theorem? I can’t find any references.
