
Let

$S=\sum_{i=1}^{N} X_i$ where

$N \sim \mathrm{Poisson}(\mu)$

$X_i$ iid $\sim \mathrm{Exponential}(\lambda)$ and independent of $N$

I want to determine $\mathbb P[S=0]$ and $\mathbb P[a<S\leq b]$ for $0\leq a<b$.

I don't know how or where to use the following hint, but the author suggests using the series representation of the Bessel function $I_\nu$, i.e.

$$I_\nu(z)=(z/2)^\nu\sum_{k=0}^{\infty}\frac{(z/2)^{2k}}{k!\,\Gamma(\nu+k+1)}$$

I would appreciate any help to solve this problem.

  • Since all $X_i$ are positive with probability $1$, you have $\mathbb P[S=0]=\mathbb P[N=0]$ – Henry Oct 29 '20 at 10:23
  • @Henry Thank you. Do you have an idea how to get the probability $\mathbb P[a<S\leq b]$? (using the Bessel function) – user826130 Oct 29 '20 at 10:28
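The observation in the first comment can be checked by simulation. A minimal sketch (the parameter value $\mu=2$ is my own arbitrary choice, and `sample_poisson` is a hand-rolled sampler using Knuth's method):

```python
# Simulation check of P[S = 0] = P[N = 0] = e^{-mu}; mu = 2.0 is an
# arbitrary test value.
import math, random

random.seed(0)
mu = 2.0

def sample_poisson(m):
    # Knuth's method: count uniforms until their product drops below e^{-m}
    L, k, p = math.exp(-m), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

trials = 100_000
frac = sum(sample_poisson(mu) == 0 for _ in range(trials)) / trials
print(frac, math.exp(-mu))  # both ≈ 0.135
```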

2 Answers


A completely different approach (which is why I thought a new answer was warranted) is to use some properties of conditional expectation (here $F_{\Gamma(k,1/\lambda)}$ and $f_{\text{Pois}(\mu)}(k)$ denote the CDF of the $\Gamma(k,1/\lambda)$ distribution and the PMF of the $\text{Poisson}(\mu)$ distribution, respectively): \begin{align*}\textbf{P}(S\leq x)=\textbf{E}[\textbf{P}(S\leq x|N)]&=\textbf{E}[\textbf{P}(\sum^N_{i=1}X_i\leq x|N)]\\&=\textbf{E}[F_{\Gamma(N,1/\lambda)}(x)]=\sum^\infty_{k=0}F_{\Gamma(k,1/\lambda)}(x)f_{\text{Pois}(\mu)}(k)\end{align*} (For $k=0$, read $F_{\Gamma(0,1/\lambda)}$ as the CDF of the point mass at $0$, so for $x\geq 0$ the $k=0$ term is just $\textbf{P}(N=0)$.)

Here we have used Adam's Law (the tower property) and the fact that a sum of $k$ i.i.d. $\mathrm{Exponential}(\lambda)$ random variables (each being $\Gamma(1,1/\lambda)$) is $\Gamma(k,1/\lambda)$-distributed. This solution requires a bit more background on conditional probability. It does look similar to your hint, though!
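The mixture formula above is easy to evaluate numerically. A sketch in pure Python (the values $\mu=2$, $\lambda=1.5$ and the interval $(1,3]$ are my own test choices; for integer $k\geq 1$ the $\Gamma(k,1/\lambda)$ CDF is the Erlang CDF, and the $k=0$ term is the point mass $\mathbf P[N=0]$ at $0$):

```python
# Truncated series for P(S <= x) = sum_k P(N=k) * F_Gamma(k,1/lam)(x);
# mu = 2.0, lam = 1.5 are arbitrary test values.
import math

mu, lam = 2.0, 1.5

def erlang_cdf(x, k, lam):
    # P(Gamma(k, scale=1/lam) <= x) = 1 - e^{-lam x} * sum_{j<k} (lam x)^j / j!
    return 1.0 - math.exp(-lam * x) * sum((lam * x) ** j / math.factorial(j)
                                          for j in range(k))

def cdf_S(x, terms=80):
    if x < 0:
        return 0.0
    pois = lambda k: math.exp(-mu) * mu ** k / math.factorial(k)
    # k = 0 term is the atom at 0; the rest is the Erlang mixture
    return pois(0) + sum(pois(k) * erlang_cdf(x, k, lam)
                         for k in range(1, terms))

# P[a < S <= b] = F_S(b) - F_S(a)
a, b = 1.0, 3.0
print(cdf_S(0.0))           # = P[S = 0] = e^{-mu} ≈ 0.1353
print(cdf_S(b) - cdf_S(a))  # probability of the interval (1, 3]
```

Truncating at 80 terms is more than enough here since the Poisson tail decays factorially.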


Since $\textbf{P}[S\in(a,b]]=F_S(b)-F_S(a)$, it suffices to determine the distribution (CDF) of $S$.

When dealing with these kinds of sums, some knowledge about moment generating functions (MGFs) can be very useful:

If it exists (in your case it does), the MGF of $S$ is given by $M_S(t):=\textbf{E}[e^{tS}]$, where $M_S$ has to be well defined on some interval around zero. If that is the case, the distribution of $S$ is uniquely determined by its MGF. Since the MGFs of your two random variables are standard (easy to look up), you can use them to calculate: \begin{align*}M_S(t)=\textbf{E}[e^{tS}]&=\textbf{E}[e^{t\sum^N_{i=1}X_i}]=\textbf{E}[\textbf{E}[e^{t\sum^N_{i=1}X_i}|N]]\\&=\textbf{E}[\prod^N_{i=1}\textbf{E}[e^{tX_i}]]=\textbf{E}[M_{X_1}(t)^N]=M_N(\ln(M_{X_1}(t))).\end{align*}

You can now either guess the distribution of $S$ and verify that its MGF matches, or recover the distribution of $S$ from its MGF, although that is a bit more complicated. You will probably need the hint for that.
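The identity $M_S(t)=M_N(\ln M_{X_1}(t))$ can be sanity-checked numerically. A sketch, assuming $M_N(s)=e^{\mu(e^s-1)}$ and $M_{X_1}(t)=\lambda/(\lambda-t)$ for $t<\lambda$ (so $M_S(t)=e^{\mu(\lambda/(\lambda-t)-1)}$); the values $\mu=2$, $\lambda=1.5$, $t=0.5$ are my own choices:

```python
# Monte Carlo check of M_S(t) = exp(mu * (lam/(lam - t) - 1)) for t < lam.
import math, random

random.seed(1)
mu, lam, t = 2.0, 1.5, 0.5  # illustrative values; need t < lam

def sample_poisson(m):
    # Knuth's method: count uniforms until their product drops below e^{-m}
    L, k, p = math.exp(-m), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

trials = 200_000
acc = 0.0
for _ in range(trials):
    n = sample_poisson(mu)
    s = sum(random.expovariate(lam) for _ in range(n))  # compound Poisson draw
    acc += math.exp(t * s)
mc = acc / trials

closed = math.exp(mu * (lam / (lam - t) - 1))  # M_N(ln M_{X_1}(t))
print(mc, closed)  # both ≈ e ≈ 2.718
```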

  • So we have $M_S(t)=e^{\lambda(\frac{\lambda}{\lambda-t})-1}$. Is that the mgf of a well known distribution? At least I can't guess the distribution of S – user826130 Oct 29 '20 at 11:19
  • I think you are missing a bracket, and the Poisson parameter should be $\mu$ rather than $\lambda$: it should be $e^{\mu(\lambda/(\lambda-t)-1)}$. Unfortunately, I don't recognize this distribution either. If you want to recover the distribution of a random variable from its MGF, this could help: https://math.stackexchange.com/questions/343930/calculate-probability-density-function-from-moment-generating-function – MatheMartin Oct 29 '20 at 12:24
  • The MGF of the Erlang distribution is $M(t)=(\frac{\lambda}{\lambda-t})^n$, which looks sort of similar. Does that get us any further? – user826130 Nov 01 '20 at 14:41
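For completeness, here is a sketch of how the Bessel hint enters (worth double-checking the index shift). Differentiating the Gamma-mixture representation term by term gives, for $x>0$, the density of the absolutely continuous part of $S$:

$$f_S(x)=\sum_{k=1}^\infty e^{-\mu}\frac{\mu^k}{k!}\cdot\frac{\lambda^k x^{k-1}e^{-\lambda x}}{(k-1)!}=e^{-\mu-\lambda x}\sqrt{\frac{\mu\lambda}{x}}\,I_1\!\left(2\sqrt{\mu\lambda x}\right),$$

obtained by substituting $j=k-1$ and matching the resulting series $\mu\lambda\sum_{j\geq 0}\frac{(\mu\lambda x)^j}{j!\,(j+1)!}$ term by term with $I_1(z)=(z/2)\sum_{k=0}^\infty\frac{(z/2)^{2k}}{k!\,(k+1)!}$ at $z=2\sqrt{\mu\lambda x}$. Together with the atom $\mathbb P[S=0]=e^{-\mu}$, this gives $\mathbb P[a<S\leq b]=\int_a^b f_S(x)\,dx$ for $0\leq a<b$.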