
Let $A$ be a stochastic matrix. Then \begin{align*} \lim_{t \rightarrow\infty} A^t \end{align*}

may not exist. For example: \begin{align*} A &= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \\ A^{2t} &= I \\ A^{2t+1} &= A \end{align*}

Now define the Cesàro limit $A^\infty$ of $A$ to be \begin{align*} \lim_{t \rightarrow \infty} \frac{1}{t} \sum_{k=0}^{t-1} A^k \end{align*}

Then, for the above example, \begin{align*} A^\infty = \begin{bmatrix} \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} \end{bmatrix} \end{align*}

Intuitively, $A^\infty$ represents the long-run average amount of time spent in each state of the Markov chain described by $A$. My question is this: Does every (finite) stochastic matrix have a Cesàro limit? If so, what is the most efficient algorithm for finding this limit?

According to this article, $R^2 = R = RA = AR$ and rank $R \geq $ rank $A^\infty$ implies $R = A^\infty$.

It appears that the rows of $A^\infty$ are the normalized eigenvectors of $A^\top$ that have a corresponding eigenvalue of 1. How does one determine the correct order and repetition of such eigenvectors, algorithmically?

EDIT: According to this article, the Cesàro limit is guaranteed to exist and is equal to the eigenprojection for the eigenvalue 1 of $A$.

EDIT 2: According to this article,

$$A^\infty = X (Y^* X)^{-1} Y^*$$

where $X$ are the eigenvectors of $A$ with eigenvalue 1 and $Y$ are the eigenvectors of $A^\top$ with eigenvalue 1. I generally get the right result with this approach but sometimes numerical errors seem to result in the wrong Cesàro limit. Is there a more numerically stable approach?
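For concreteness, the formula above can be transcribed directly into NumPy. This is a minimal sketch, not a numerically hardened implementation: the function name `cesaro_limit` and the tolerance `tol` used to pick out the eigenvalue-1 eigenvectors are my choices, not from the article.

```python
import numpy as np

def cesaro_limit(A, tol=1e-9):
    # Right eigenvectors of A for eigenvalue 1 (columns of X).
    w, V = np.linalg.eig(A)
    X = V[:, np.abs(w - 1) < tol]
    # Left eigenvectors: right eigenvectors of A^T for eigenvalue 1 (columns of Y).
    u, W = np.linalg.eig(A.T)
    Y = W[:, np.abs(u - 1) < tol]
    # A_inf = X (Y* X)^{-1} Y*
    return (X @ np.linalg.inv(Y.conj().T @ X) @ Y.conj().T).real

A = np.array([[0.0, 1.0], [1.0, 0.0]])
print(cesaro_limit(A))  # ≈ [[0.5, 0.5], [0.5, 0.5]]
```

One plausible source of the instability mentioned above: if rounding splits the eigenvalue-1 cluster, the `tol` filter can select the wrong number of eigenvectors, making $Y^* X$ singular or ill-conditioned.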

user76284

3 Answers


In general, for a matrix $T\in\mathbb{C}^{d\times d}$ the following are equivalent:

  1. $\sup_{n\in\mathbb{N}} \lVert T^n \rVert < \infty$,
  2. $\frac{1}{n} \sum_{j=1}^n T^j$ converges as $n\to\infty$.

This is a consequence of, e.g., the Jordan normal form of $T$ (see Exercise 5 of Operator Theoretic Aspects of Ergodic Theory by Eisner et al., as well as Theorem 8.5 and Theorem 8.22, which show the same for any power-bounded linear operator on a reflexive Banach space).

If $T$ is a stochastic matrix, then so is $T^n$; its entries lie in $[0,1]$ and each row sums to $1$, so the operator norm is bounded by the Frobenius norm $\lVert T^n \rVert_\mathrm{F} \leq \sqrt{d}$. Hence 1. (and therefore 2.) holds.
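To see 2. concretely for the question's example, here is a small NumPy check (the function name `cesaro_average` is mine) that the averages $\frac{1}{n}\sum_{j=1}^n T^j$ settle down even though $T^n$ itself oscillates between $I$ and $T$:

```python
import numpy as np

def cesaro_average(T, n):
    # Accumulate T^1 + T^2 + ... + T^n, then divide by n.
    S = np.zeros_like(T)
    P = np.eye(len(T))
    for _ in range(n):
        P = P @ T  # P is now T^j
        S += P
    return S / n

T = np.array([[0.0, 1.0], [1.0, 0.0]])
print(cesaro_average(T, 1000))  # ≈ [[0.5, 0.5], [0.5, 0.5]]
```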

heiner

We have the following theorem.

Theorem. Let $E$ be a finite-dimensional normed vector space and let $T : E \rightarrow E$ be a linear map satisfying $\|T\| \leq 1$ for the associated operator norm. Then $$ E = \ker(T-\mathrm{id}) \oplus \operatorname{im}(T-\mathrm{id}), $$ $$ {1\over n} \sum_{k=0}^{n-1} T^k(x) \longrightarrow Q(x) \quad \text{for all } x \in E, $$ where $Q$ is the projector onto $\ker(T-\mathrm{id})$ parallel to $\operatorname{im}(T-\mathrm{id})$.

The relevant norm on $E = {\bf R}^n$ for stochastic matrices is $\|(x_1,\dots,x_n)\| = \sum_i |x_i|$, for which the matrix has operator norm equal to $1$.

So to compute the limit $Q$, you only need to find $\ker(T-\mathrm{id})$, which is the eigenspace associated to the eigenvalue $1$, and $\operatorname{im}(T-\mathrm{id})$, which can be described using the row echelon form of the matrix.
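As an illustration of this recipe, one can extract orthonormal bases of $\ker(T-\mathrm{id})$ and $\operatorname{im}(T-\mathrm{id})$ from an SVD of $T - I$ (rather than the row echelon form) and conjugate the obvious diagonal projector. The function name `cesaro_projector` and the SVD-based basis extraction are my choices, not part of the answer:

```python
import numpy as np

def cesaro_projector(T, tol=1e-9):
    n = len(T)
    M = T - np.eye(n)
    # SVD gives orthonormal bases: rows of Vh beyond the rank span ker(M),
    # and the first r columns of U span im(M).
    U, s, Vh = np.linalg.svd(M)
    r = int(np.sum(s > tol))
    K = Vh[r:].T                     # basis of ker(T - id)
    R = U[:, :r]                     # basis of im(T - id)
    B = np.hstack([K, R])            # change of basis: kernel coordinates first
    D = np.diag([1.0] * (n - r) + [0.0] * r)
    return B @ D @ np.linalg.inv(B)  # projector onto ker, parallel to im

T = np.array([[0.0, 1.0], [1.0, 0.0]])
print(cesaro_projector(T))  # ≈ [[0.5, 0.5], [0.5, 0.5]]
```

In the basis given by the columns of $B$, the projector is the diagonal matrix $D$ with ones on the kernel coordinates; conjugating back by $B^{-1}$ yields $Q$ in the standard basis.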

coudy

As any entry of $A^k$ is between $0$ and $1$, the sequence $$S_t=\frac1t \sum_{k=0}^{t-1}A^k$$ consists of matrices with entries bounded between $0$ and $1$, and each entry is monotone (weakly) increasing in $t$. So by the monotone sequence theorem, the Cesàro limit exists for any stochastic matrix $A$.

I have no idea what to do about efficiently computing this limit.

Kusma
  • Can you explain why the entries are monotone increasing in $t$? It seems to me that's not true in general, e.g. for $A = \begin{pmatrix}\varepsilon & 1 - \varepsilon \\ 1 - \varepsilon & \varepsilon\end{pmatrix}$, we have $S_0 = A^0 = E$ the identity matrix but $S_1 = 0.5(E + A) = 0.5\begin{pmatrix}1 + \varepsilon & 1 - \varepsilon \\ 1 - \varepsilon & 1 + \varepsilon\end{pmatrix}$, which is decreasing at e.g. the top left entry for $\varepsilon < 1$? – heiner Feb 17 '22 at 15:38
  • 1
    You are right, I don't know what I was thinking when I wrote that. – Kusma Feb 18 '22 at 15:47
  • Thanks for getting back to this after 4 years. I've added an alternative argument. – heiner Feb 18 '22 at 16:49