Let $A$ be a stochastic matrix. Then \begin{align*} \lim_{t \rightarrow\infty} A^t \end{align*}
may not exist. For example: \begin{align*} A &= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \\ A^{2t} &= I \\ A^{2t+1} &= A \end{align*}
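A quick numerical check of the non-convergence (a sketch using NumPy; the matrix `A` is the permutation matrix from the example above):

```python
import numpy as np

# The 2-state permutation chain from the example above.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Powers of A oscillate between I and A, so A^t has no limit.
assert np.array_equal(np.linalg.matrix_power(A, 10), np.eye(2))
assert np.array_equal(np.linalg.matrix_power(A, 11), A)
```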
Now define the Cesàro limit $A^\infty$ of $A$ to be \begin{align*} \lim_{t \rightarrow \infty} \frac{1}{t} \sum_{k=0}^{t-1} A^k \end{align*}
Then, for the above example, \begin{align*} A^\infty = \begin{bmatrix} \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} \end{bmatrix} \end{align*}
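The partial Cesàro averages can be computed directly; a minimal sketch (the helper `cesaro_average` is my own name, not from any reference):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Partial Cesàro average (1/t) * sum_{k=0}^{t-1} A^k for finite t.
def cesaro_average(A, t):
    total = np.zeros_like(A)
    P = np.eye(len(A))
    for _ in range(t):
        total += P
        P = P @ A
    return total / t

# For this A and even t, the average is exactly the matrix of halves.
print(cesaro_average(A, 1000))
```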
Intuitively, $A^\infty$ represents the long-run fraction of time spent in each state of the Markov chain described by $A$. My question is this: Does every (finite) stochastic matrix have a Cesàro limit? If so, what is the most efficient algorithm for finding this limit?
According to this article, if a matrix $R$ satisfies $R^2 = R = RA = AR$ and rank $R \geq$ rank $A^\infty$, then $R = A^\infty$.
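The identities in that characterization are easy to check numerically for a candidate $R$; a sketch for the example above, where the candidate is the matrix of halves:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
R = np.full((2, 2), 0.5)  # candidate Cesàro limit from the example above

# The cited characterization: R is idempotent and absorbs A on both sides.
assert np.allclose(R @ R, R)
assert np.allclose(R @ A, R)
assert np.allclose(A @ R, R)
```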
It appears that the rows of $A^\infty$ are the normalized eigenvectors of $A^\top$ that have a corresponding eigenvalue of 1. How does one determine the correct order and repetition of such eigenvectors, algorithmically?
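Extracting those eigenvectors is straightforward with `numpy.linalg.eig` (a sketch; this does not answer the ordering/repetition question, it only shows the extraction and normalization step):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Left eigenvectors of A = eigenvectors of A^T.
vals, vecs = np.linalg.eig(A.T)
# Keep the eigenvector columns whose eigenvalue is numerically 1.
v = np.real(vecs[:, np.isclose(vals, 1.0)])
# Normalize each column to sum to 1, giving a probability vector.
v = v / v.sum(axis=0)
print(v.T)  # each row is a stationary distribution of A
```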
EDIT: According to this article, the Cesàro limit is guaranteed to exist and is equal to the eigenprojection for the eigenvalue 1 of $A$.
EDIT 2: According to this article,
$$A^\infty = X (Y^* X)^{-1} Y^*$$
where the columns of $X$ are the eigenvectors of $A$ with eigenvalue 1 and the columns of $Y$ are the eigenvectors of $A^\top$ with eigenvalue 1. I generally get the right result with this approach, but sometimes numerical errors seem to produce the wrong Cesàro limit. Is there a more numerically stable approach?
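For concreteness, here is a direct sketch of that formula in NumPy, applied to the example above (it is a straight transcription of the cited expression, not a numerically hardened implementation):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Columns of X: eigenvectors of A for eigenvalue 1.
vals, vecs = np.linalg.eig(A)
X = np.real(vecs[:, np.isclose(vals, 1.0)])

# Columns of Y: eigenvectors of A^T for eigenvalue 1.
vals_t, vecs_t = np.linalg.eig(A.T)
Y = np.real(vecs_t[:, np.isclose(vals_t, 1.0)])

# The cited formula A_inf = X (Y* X)^{-1} Y*.
A_inf = X @ np.linalg.inv(Y.conj().T @ X) @ Y.conj().T
print(A_inf)
```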