
I have $n$ birth and death processes. Each is a machine that is either working or failed. When working, the time until the next failure is exponential with rate $\lambda$; when a machine fails, the time until recovery is exponential with rate $\mu$. I observe these $n$ machines over a time period $t$, and I'm interested in $M(t)$, the maximum number of machines that are failed at any instant within the interval $[0,t]$. What is $E(M(t))$? How does it grow with $t$: is it $O(t)$, $O(\log(t))$, or neither? My intuition says it can't be $O(1)$, since over a longer time more of the machines will tend to fail together, but it also shouldn't be as bad as $O(t)$.
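As a sanity check on how $E(M(t))$ might grow, here is a minimal Monte Carlo sketch (my own, not part of the original question). It assumes all $n$ machines start in the working state; the values of $n$, $\lambda$, $\mu$, the horizons $t$, and the number of runs are illustrative choices.

```python
# Minimal Monte Carlo sketch: estimate E[M(t)] for n independent two-state
# machines (working/failed), failure rate lam, repair rate mu.
# Assumption: all machines start working; parameters are illustrative.
import math
import random

def simulate_max_down(n, lam, mu, t, rng):
    """Simulate n independent working/failed machines up to time t and
    return the maximum number that are simultaneously failed."""
    down = 0          # machines currently failed
    max_down = 0
    clock = 0.0
    while True:
        rate = (n - down) * lam + down * mu   # total event rate right now
        clock += rng.expovariate(rate)        # time of the next event
        if clock > t:
            break
        # the next event is a failure with probability (n-down)*lam / rate
        if rng.random() < (n - down) * lam / rate:
            down += 1
            max_down = max(max_down, down)
        else:
            down -= 1
    return max_down

if __name__ == "__main__":
    rng = random.Random(0)
    n, lam, mu = 50, 1.0, 5.0
    for t in (10, 100, 1000, 10000):
        runs = 200
        est = sum(simulate_max_down(n, lam, mu, t, rng) for _ in range(runs)) / runs
        print(f"t={t:6d}  E[M(t)] ~ {est:.2f}   sqrt(log t) = {math.sqrt(math.log(t)):.2f}")
```

The printed $\sqrt{\log t}$ column is only there as a rough yardstick for the growth rate, not as a fitted model.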


EDIT: Thought about it some more. We can divide $t$ into small intervals so that within each interval, every machine is either working or down for the whole interval. In steady state, the probability that a given machine is down is $p=\frac{\lambda}{\lambda+\mu}$, so the number of machines down in any interval is binomial. Now we're talking about taking the max of $k$ binomials and, as shown here: Expected value of the maximum of binomial random variables, this increases as $\sqrt{\log(k)}$. It would be interesting if someone could come up with the exact expression. Also, I didn't understand how @cdipaolo came up with the "asymptotically correct bound" in that case.
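To check the $\sqrt{\log(k)}$ heuristic numerically, here is a small sketch (my own, not taken from the linked answer) that computes $E[\max]$ of $k$ i.i.d. Binomial$(n,p)$ variables exactly via $E[X]=\sum_{m\ge 1}P(\max\ge m)$, using scipy.stats.binom for the binomial CDF; the values of $n$, $p$, and $k$ are illustrative.

```python
# Numerical check of the heuristic: E[max of k i.i.d. Binomial(n, p)]
# grows roughly like sqrt(log k) beyond the mean n*p.
# Assumption: scipy is available; n, p, k values are illustrative.
import math
from scipy.stats import binom

def expected_max_binomial(n, p, k):
    """E[max of k i.i.d. Binomial(n, p)] via E[X] = sum_{m>=1} P(max >= m)."""
    total = 0.0
    for m in range(1, n + 1):
        cdf_below = binom.cdf(m - 1, n, p)   # P(one copy < m)
        total += 1.0 - cdf_below ** k        # P(max >= m)
    return total

if __name__ == "__main__":
    n, p = 50, 1.0 / 6.0   # e.g. lam = 1, mu = 5 gives p = lam/(lam+mu) = 1/6
    for k in (10, 100, 1000, 10000, 100000):
        e = expected_max_binomial(n, p, k)
        scale = n * p + math.sqrt(2 * n * p * (1 - p) * math.log(k))
        print(f"k={k:7d}  E[max] = {e:6.2f}   n*p + sqrt(2*n*p*(1-p)*log k) = {scale:6.2f}")
```

The last column is the usual Gaussian-approximation scale $np+\sqrt{2np(1-p)\log k}$, shown only to make the $\sqrt{\log k}$ growth visible; it is not the exact expression asked for above.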

Rohit Pandey
  • I think using "birth and death processes" is a misnomer: birth and death processes are Markov processes on $\mathbb Z^+$, which in continuous time have jumps of $+1$ and $-1$, and $0$ is an absorbing state. Here you have just two-state Markov chains, which switch between $0$ and $1$. – zhoraster Feb 06 '20 at 07:41
  • Per the question itself: if $X(t)$ denotes the number of machines which are off at time $t$, then $M(t) = \max_{s\le t} X(s)$, is this correct? But then $\mathrm E[M(t)]\to n, t\to\infty$. – zhoraster Feb 06 '20 at 07:49
  • That's right. I want to know how it increases with $t$. Good point that it has to converge to $n$. – Rohit Pandey Feb 06 '20 at 13:05

0 Answers