
I was learning about hidden Markov models and encountered this result about the convergence of a Markov model.

For example, consider a weather model where on the first day the probability of the weather being sunny is 0.9 and the probability of it being rainy is 0.1. The transition probabilities are: the probability of sun on the current day given that the previous day was sunny is P(sun | sun) = 0.9; similarly, P(sun | rain) = 0.3, P(rain | rain) = 0.7, and P(rain | sun) = 0.1.

Using Bayesian inference, we can compute the probability distribution of the weather on day $X_t$. It was mentioned that, in the limit as $t \to \infty$, the probability distribution of the weather converges to P(sun) = 0.75 and P(rain) = 0.25.
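For reference, writing the distribution on day $t$ as a row vector $p_t = (P(\text{sun}), P(\text{rain}))$, the update rule is the following (assuming I've set up the transition matrix correctly from the probabilities above):

$$
p_{t+1} = p_t P, \qquad
P = \begin{pmatrix} 0.9 & 0.1 \\ 0.3 & 0.7 \end{pmatrix},
\qquad p_1 = (0.9,\ 0.1),
$$

so, for example, $p_2 = (0.9 \cdot 0.9 + 0.1 \cdot 0.3,\ 0.9 \cdot 0.1 + 0.1 \cdot 0.7) = (0.84,\ 0.16)$, which is already moving toward $(0.75,\ 0.25)$.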

Can someone explain this convergence mathematically? How does the chain converge to this distribution? It doesn't matter what the distribution on day 1 is; the model always converges to the same values.

1 Answer


This isn't a hidden Markov model; this is an ordinary Markov model. Take a look at Wikipedia's article on Markov chains and specifically the notion of a steady-state distribution (or stationary distribution), or read about the subject in your favorite textbook -- there are many that cover Markov chains.
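As a quick numerical illustration (a minimal sketch of my own in NumPy, not something the question provided), you can watch the distribution converge to (0.75, 0.25) from any starting point, and recover the stationary distribution directly as the left eigenvector of the transition matrix for eigenvalue 1:

```python
import numpy as np

# Row-stochastic transition matrix from the question:
# rows are today's weather (sun, rain), columns are tomorrow's.
P = np.array([[0.9, 0.1],   # P(sun|sun), P(rain|sun)
              [0.3, 0.7]])  # P(sun|rain), P(rain|rain)

# Iterate the chain from two very different starting distributions.
for p in (np.array([0.9, 0.1]), np.array([0.0, 1.0])):
    for _ in range(50):
        p = p @ P          # one day's update: p_{t+1} = p_t P
    print(p)               # both print approximately [0.75 0.25]

# The stationary distribution pi solves pi = pi P with pi summing to 1,
# i.e. pi is the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print(pi)                  # [0.75 0.25]
```

The reason the starting distribution doesn't matter is that the other eigenvalue of $P$ is $0.6$, so the gap between $p_t$ and the stationary distribution shrinks by a factor of $0.6$ every day.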

D.W.