Let $P$ be the stochastic matrix associated with a reversible Markov chain and $\pi$ its stationary distribution. Then the matrix $DPD^{-1}$, where $D=\text{diag}(\sqrt{\pi(x)})$, is symmetric, so its eigenvalues are real and its eigenvectors span $\mathbb{R}^N$, where $N$ is the size of the state space. Now the Perron–Frobenius theorem tells us that $P$ has (up to scaling) a unique left eigenvector with eigenvalue $1$, which after normalizing is $\pi$. This PDF (section 10.3) claims that when we write our initial probability distribution $p^{(0)}$ in terms of the eigenvectors $e_i$ of $P$, we have $p^{(0)}=\sum_i\alpha_ie_i$ where $e_1=\pi$ and $\alpha_1=1$. I don't understand why we should have $\alpha_1=1$.
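For concreteness, here is a minimal numerical sketch of the setup (the 3-state birth-death chain below is my own example, not the one from the PDF): it checks that $DPD^{-1}$ is symmetric and recovers $\pi$ as the eigenvalue-$1$ left eigenvector.

```python
import numpy as np

# Example reversible chain: a 3-state birth-death chain (these always
# satisfy detailed balance). This particular P is just an illustration.
P = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()
print(pi)                   # [0.25 0.5  0.25]

# D P D^{-1} with D = diag(sqrt(pi)) is symmetric, as claimed.
D = np.diag(np.sqrt(pi))
S = D @ P @ np.linalg.inv(D)
print(np.allclose(S, S.T))  # True
```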
1 Answer
Well, $p^{(t)} = p^{(0)}P^t = \sum_i \alpha_i \lambda_i^t e_i$ converges to $\alpha_1 \pi$, since $e_1 = \pi$ has eigenvalue $\lambda_1 = 1$ and the other eigenvalues satisfy $|\lambda_i| < 1$ (for an irreducible, aperiodic chain). As each $p^{(t)}$ is a probability distribution on a finite set (its entries sum to $1$), so is the limit $\alpha_1 \pi$, and since $\pi$ itself sums to $1$ we must have $\alpha_1 = 1$.
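Here is a small numerical check of this claim (a sketch only; the 3-state chain and the uniform choice of $p^{(0)}$ are arbitrary examples, not taken from the PDF):

```python
import numpy as np

# An arbitrary reversible 3-state chain (birth-death, so detailed balance holds).
P = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])

# Left eigenvectors of P (i.e. eigenvectors of P^T), stored as rows of E,
# ordered so that the eigenvalue-1 eigenvector comes first.
w, V = np.linalg.eig(P.T)
order = np.argsort(-np.real(w))
E = np.real(V).T[order]
E[0] = E[0] / E[0].sum()     # normalize so that e_1 = pi

# Any starting distribution; uniform is just an example.
p0 = np.full(3, 1 / 3)

# Solve p0 = sum_i alpha_i e_i, i.e. E^T alpha = p0, for the coefficients.
alpha = np.linalg.solve(E.T, p0)
print(alpha[0])              # 1.0 (up to rounding), as argued above
```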
Matthew Bolan