Let $M$ be a matrix, and suppose $v_1, \ldots, v_n$ form a basis for a subspace that is invariant under $M$ (i.e., $M$ maps the subspace into itself). This means
$$
\begin{aligned}
Mv_1 &= a_{11}v_1 + a_{12}v_2 + \ldots + a_{1n}v_n \\
Mv_2 &= a_{21}v_1 + a_{22}v_2 + \ldots + a_{2n}v_n \\
&\;\;\vdots \\
Mv_n &= a_{n1}v_1 + a_{n2}v_2 + \ldots + a_{nn}v_n
\end{aligned}
$$
or in other words
$$
M [v_1 \ v_2 \ \ldots \ v_n] = [v_1 \ v_2 \ \ldots \ v_n]
\begin{bmatrix}
a_{11} & a_{21} & \ldots & a_{n1} \\
a_{12} & a_{22} & \ldots & a_{n2} \\
\vdots & \vdots & \ddots & \vdots \\
a_{1n} & a_{2n} & \ldots & a_{nn} \\
\end{bmatrix}
$$
That is, $MV = VA$ (where $V$ and $A$ are what you expect). Let $U$ be a matrix such that the columns of $V$ and $U$ together form a basis for the whole space. Then
$$
M[V \ U] = [V \ U]
\begin{bmatrix}
A & B \\
0 & C \\
\end{bmatrix}
$$
for some $B$ and $C$. The first block column just restates $MV = VA$, which is why the lower-left block is zero, and each column of $VB + UC$ is a linear combination of the columns of $V$ and $U$; since those columns form a basis, the combinations can be chosen to match the columns of $MU$.
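To make the bookkeeping concrete, here is a minimal NumPy sketch (the $4 \times 4$ size and the way $M$ is constructed are my own choices for illustration): it builds a matrix with a known two-dimensional invariant subspace, then recovers $A$, $B$, and $C$ numerically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Construct M with a known invariant subspace: take a block upper
# triangular matrix T and conjugate it by a random change of basis S.
S = rng.standard_normal((4, 4))              # invertible with probability 1
T = rng.standard_normal((4, 4))
T[2:, :2] = 0                                # zero out the lower-left block
M = S @ T @ np.linalg.inv(S)

V, U = S[:, :2], S[:, 2:]                    # columns of V span an invariant subspace

# MV = VA: solve for A by least squares (the residual vanishes).
A, *_ = np.linalg.lstsq(V, M @ V, rcond=None)
print(np.allclose(M @ V, V @ A))             # True

# MU = VB + UC: expand each column of MU in the full basis [V U].
BC = np.linalg.solve(np.hstack([V, U]), M @ U)
B, C = BC[:2], BC[2:]
print(np.allclose(M @ U, V @ B + U @ C))     # True
```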
So, if $X$ is an invertible matrix whose first $n$ column vectors span an invariant subspace of $M$, we can write
$$
X^{-1}MX =
\begin{bmatrix}
A & B \\
0 & C \\
\end{bmatrix}
$$
i.e., $M$ is block upper triangular with respect to the basis given by the columns of $X$.
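Continuing the sketch above (same hypothetical $M$, $V$, $U$, and $A$), the similarity transform makes the zero block explicit:

```python
# Reuses M, V, U, A from the previous sketch.
X = np.hstack([V, U])
blocked = np.linalg.inv(X) @ M @ X
print(np.allclose(blocked[2:, :2], 0))       # True: lower-left block is zero
print(np.allclose(blocked[:2, :2], A))       # True: upper-left block is A
```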
Now suppose we can decompose $\mathbb{R}^m$ into a direct sum of invariant subspaces of $M$, and let $C_1, C_2, \ldots, C_n$ be matrices whose columns are bases for these subspaces. As before, we have $MC_i = C_iA_i$ for some $A_i$, so
$$
\begin{aligned}
M[C_1 \ C_2 \ \ldots \ C_n] &= [MC_1 \ MC_2 \ \ldots \ MC_n] \\
&= [C_1A_1 \ C_2A_2 \ \ldots \ C_nA_n] \\
&= [C_1 \ C_2 \ \ldots \ C_n]
\begin{bmatrix}
A_1 & & & \\
& A_2 & & \\
& & \ddots & \\
& & & A_n \\
\end{bmatrix}.
\end{aligned}
$$
So $M$ is block diagonal with respect to $[C_1 \ C_2 \ \ldots \ C_n]$.
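As a final sanity check, here is a hypothetical NumPy construction (again my own example): $\mathbb{R}^4$ is written as a direct sum of two 2-dimensional invariant subspaces, and the corresponding change of basis exposes the block diagonal form.

```python
import numpy as np

rng = np.random.default_rng(1)

# Build M from a block diagonal matrix D = diag(A1, A2) and a random basis S,
# so the two column blocks of S span invariant subspaces.
A1 = rng.standard_normal((2, 2))
A2 = rng.standard_normal((2, 2))
D = np.zeros((4, 4))
D[:2, :2], D[2:, 2:] = A1, A2
S = rng.standard_normal((4, 4))
M = S @ D @ np.linalg.inv(S)

C1, C2 = S[:, :2], S[:, 2:]
print(np.allclose(M @ C1, C1 @ A1))          # M C_1 = C_1 A_1
print(np.allclose(M @ C2, C2 @ A2))          # M C_2 = C_2 A_2

blocked = np.linalg.inv(S) @ M @ S           # block diagonal w.r.t. [C_1 C_2]
print(np.allclose(blocked[2:, :2], 0) and np.allclose(blocked[:2, 2:], 0))  # True
```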