Here's my understanding of it: assume we have an $n\times n$ stochastic matrix $P$ representing our Markov chain, and suppose $x$ and $y$ are both stationary distributions for $P$. Then
$P(x) = x$
$P(y) = y$
$P(ax+by) = aP(x) + bP(y) = ax + by$ by linearity, where $ax+by$ is a convex combination (i.e. $a, b \ge 0$ and $a + b = 1$, which guarantees $ax+by$ is again a probability vector),
meaning that $ax+by$ is a stationary distribution. So $P$ has infinitely many stationary distributions whenever it has at least two distinct ones.
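The argument above can be checked numerically. Below is a minimal sketch (my own hypothetical example, not from the post) using a reducible 3-state chain with two absorbing states, written in the column-vector convention $Px = x$, so $P$ is column-stochastic:

```python
import numpy as np

# Reducible 3-state chain: states 0 and 1 are absorbing,
# state 2 leaks into both. Columns sum to 1 (column-stochastic).
P = np.array([
    [1.0, 0.0, 0.3],
    [0.0, 1.0, 0.3],
    [0.0, 0.0, 0.4],
])

x = np.array([1.0, 0.0, 0.0])  # stationary: all mass on state 0
y = np.array([0.0, 1.0, 0.0])  # stationary: all mass on state 1

a = 0.25                       # any a in [0, 1] works
z = a * x + (1 - a) * y        # convex combination of x and y

print(np.allclose(P @ x, x))   # True
print(np.allclose(P @ y, y))   # True
print(np.allclose(P @ z, z))   # True: z is also stationary
```

Since $a$ ranges over the whole interval $[0,1]$, this single example already exhibits uncountably many stationary distributions.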
Does this mean a Markov chain either has one or infinitely many stationary distributions?
Assuming we start at a fixed state, can we still end up with multiple stationary distributions?
– mercury0114 Jun 12 '23 at 12:49