
Let $S_1,S_2,\dots$ be a sequence of independent random variables, where $S_i$ is exponentially distributed with parameter $\lambda_i>0$. Show that $$ \sum_{i=1}^\infty \frac{1}{\lambda_i} =\infty \Rightarrow P\left(\sum_{i=1}^\infty S_i = \infty\right)=1.$$

I know that $E(S_i)=\frac{1}{\lambda_i}$, so the claim feels intuitively right. My idea was to show that $\lim_{n\rightarrow \infty} P\left(\sum_{i=1}^{n} S_i>M\right)=1$ for every $M>0$, but the distribution of $S_1+\dots+S_n$ is hard to find. Is there a better way to approach this problem?
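As a quick sanity check (not a proof), here is a small Monte Carlo simulation of exactly the probability described above. The choice $\lambda_i=i$ is an arbitrary example with $\sum 1/\lambda_i=\infty$ (harmonic series), and $n$, $M$, and the number of trials are illustrative:

```python
import random

random.seed(0)

def partial_sum(n):
    """One sample of S_1 + ... + S_n, with S_i ~ Exp(rate = i)."""
    return sum(random.expovariate(i) for i in range(1, n + 1))

# Estimate P(S_1 + ... + S_n > M) for a fixed threshold M and large n.
n, M, trials = 2000, 5.0, 500
estimate = sum(partial_sum(n) > M for _ in range(trials)) / trials
print(estimate)  # very close to 1, consistent with the claimed a.s. divergence
```

Of course this only illustrates the statement for one choice of rates; the question is how to prove it in general.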

szymji
  • It could help to take advantage of the fact that $T_n=\sum_{i=0}^nS_i$ are the jump times of an inhomogeneous Poisson process. For the homogeneous case (all $\lambda$ equal) some calculations are done here. – Kurt G. Mar 02 '22 at 02:15

1 Answer


CASE I:

$\frac 1 {\lambda_i}$ is bounded.

$Ee^{-\sum S_i}=\prod \frac {\lambda_i}{\lambda_i+1}$ (by independence and monotone convergence). Conclude that $\prod \frac {\lambda_i}{\lambda_i+1}=\prod \left(1- \frac 1 {\lambda_i+1}\right)$ diverges to $0$, since $\sum \frac 1 {\lambda_i+1}=\infty$. [Here we use the assumption that $\frac 1 {\lambda_i}$ is bounded: if $\frac 1 {\lambda_i} \le M$ for all $i$, then $\frac 1 {\lambda_i+1} \ge \frac 1 {(M+1)\lambda_i}$, so this series diverges together with $\sum \frac 1 {\lambda_i}$.] We have proved that $Ee^{-\sum S_i}=0$, so $\sum S_i=\infty$ a.s.
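A concrete instance of Case I (an assumed example, not part of the answer): with constant rates $\lambda_i=1$, $\frac{1}{\lambda_i}$ is bounded and $\sum \frac{1}{\lambda_i}$ diverges, and the product $\prod \frac{\lambda_i}{\lambda_i+1}=(1/2)^n$ visibly collapses to $0$:

```python
import math

# With lambda_i = 1 for all i, the partial product after n factors is (1/2)^n,
# which tends to 0 -- matching E[e^{-sum S_i}] = 0 and hence sum S_i = inf a.s.
lam = [1.0] * 50
product = math.prod(l / (l + 1.0) for l in lam)
print(product)  # (1/2)^50 ~ 8.9e-16
```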

CASE II:

Suppose $\lambda_i \to 0$ along a subsequence $(\lambda_{i_k})$ (this is the complementary case, where $\frac 1 {\lambda_i}$ is unbounded). If the series $\sum S_i$ converges with positive probability, then it converges with probability $1$ (by Kolmogorov's $0$-$1$ law), and so does the subseries $\sum_k S_{i_k}$, since the terms are nonnegative. Kolmogorov's Three-Series Theorem then shows that $\sum_k P(S_{i_k} >1)$ must converge. But $\sum_k P(S_{i_k}>1) =\sum_k e^{-\lambda_{i_k}}$ is divergent, since $e^{-\lambda_{i_k}} \to 1$. This contradiction shows that $\sum S_i=\infty$ a.s. in this case too.
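To see the divergence in Case II numerically (again a hedged illustration with an assumed choice of rates), take $\lambda_i = 1/i \to 0$; then $P(S_i>1)=e^{-\lambda_i} \to 1$ and the partial sums of $\sum e^{-\lambda_i}$ grow without bound:

```python
import math

# lambda_i = 1/i tends to 0, so each term exp(-lambda_i) tends to 1 and the
# series sum_i exp(-lambda_i) diverges -- the three-series condition fails.
lams = [1.0 / i for i in range(1, 1001)]
terms = [math.exp(-l) for l in lams]
partial = sum(terms)
print(terms[-1], partial)  # last term is near 1; the partial sum is already large
```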