
I'm a little stuck on how to solve this problem. Suppose I have an ATM that starts with zero dollars. Every morning, a truck deposits a random amount of dollar bills into the ATM so people can withdraw them. The amount of cash deposited is Poisson with mean parameter $\lambda$. The total amount withdrawn during the day is binomially distributed with parameter $p$; that is, every dollar in the ATM has probability $p$ of being withdrawn. Cash that is not withdrawn by the end of the day is left in the ATM. Find the stationary distribution of the amount of cash in the ATM.

I have already shown that the amount of cash in the ATM is a Markov chain. What I tried was writing the amount of cash in the ATM on day $n$ as the sum of a Binomial and a Poisson random variable, since the amount of cash on day $n$ equals the number of dollar bills not withdrawn the previous day plus the new cash deposited in the morning. So I had

$$X_n = \text{Binom}(X_{n-1}, 1-p) + \text{Poisson}(\lambda)$$ $$ X_{n} = Y_{n} + Z_{n} $$ $$ s^{X_{n}} = s^{Y_{n} + Z_{n}} = s^{Y_{n}}s^{Z_{n}} $$ $$ G_{X_n}(s)= G_{Y_{n}}(s)\,G_{Z_{n}}(s) $$ $$ G_{X_n}(s)= \mathbb{E}\big[((1-p)s + p)^{X_{n-1}}\big] \cdot e^{\lambda (s-1)} $$ $$ G_{X_n}(s)= G_{X_{n-1}}((1-p)s + p) \cdot e^{\lambda (s-1)} $$

Here $G_{X_n}(s) = E[s^{X_n}]$ is the probability generating function; the factorization uses the independence of $Y_n$ and $Z_n$, and $G_{Y_n}$ is computed by conditioning on $X_{n-1}$. I plugged the recurrence into itself to get

\begin{align*} G_{X_n}(s) &= G_{X_{n-1}}((1-p)s + p) \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-2}}\big((1-p)[(1-p)s + p] + p\big) \cdot e^{\lambda ((1-p)s + p - 1)} \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-2}}\big((1-p)^2 s + (1-p)p + p\big) \cdot e^{\lambda ((1-p)s + p - 1)} \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-3}}\big((1-p)^3 s + (1-p)^2 p + (1-p)p + p\big) \cdot e^{\lambda ((1-p)^2 s + (1-p)p + p - 1)} \cdot e^{\lambda ((1-p)s + p - 1)} \cdot e^{\lambda (s-1)} \\ &\;\;\vdots \\ &= G_{X_{n-k}}\left((1-p)^k s + p\sum_{i=0}^{k-1} (1-p)^i\right) \cdot \prod_{j=0}^{k-1} e^{\lambda \left((1-p)^j s + p\sum_{i=0}^{j-1}(1-p)^i - 1\right)} \end{align*}

If we take the limit as $k \to \infty$, the argument of $G_{X_{n-k}}$ tends to $1$, and every PGF satisfies $G(1) = 1$, so that factor drops out and we are left with

$$ G_{X_n}(s) = \prod_{k=0}^{\infty} e^{\lambda \left((1-p)^k s + p\sum_{i=0}^{k-1}(1-p)^i - 1\right)} $$

My friend told me he got $G_{X_n}(s) = e^{\frac{\lambda}{p}(s-1)}$, which seems crazy but matches the simulation when you graph it. Can anybody point out how to continue from here?
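For what it's worth, here is a minimal sketch of the kind of simulation used to check his formula (the parameter values are arbitrary choices):

```python
import numpy as np

# Monte Carlo sketch of the ATM chain; lam, p, days, paths are arbitrary.
rng = np.random.default_rng(0)
lam, p, days, paths = 3.0, 0.4, 200, 100_000

x = np.zeros(paths, dtype=np.int64)       # every ATM starts with zero dollars
for _ in range(days):
    x = x + rng.poisson(lam, size=paths)  # morning deposit (X_n is measured here)
    x = rng.binomial(x, 1 - p)            # each dollar survives the day w.p. 1-p
x = x + rng.poisson(lam, size=paths)      # measure after a final morning deposit

# If the limit is Poisson(lam/p), the sample mean and variance should both be near lam/p.
print(x.mean(), x.var(), lam / p)
```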

  • Just a quick question: Why are you adding the Poisson and Binomial random variables? If there is $X_n=20$ dollars in the ATM before the truck full of cash comes and the truck deposits $D_n=50$ dollars into the ATM, then the amount of money that is withdrawn at the end of the day is $W_n\sim \text{Binomial}(X_n+D_n,p)$ so the amount of money in the ATM the next morning is $$X_{n+1}=X_n+D_n-W_n$$ – Matthew Pilling Feb 06 '21 at 22:30
  • @MatthewPilling I think we're talking about different times. I measure the amount of money in the ATM after the truck has dropped off the cash. So the amount of money in the ATM before the truck has dropped off the cash is the number of dollar bills that were not withdrawn the previous day. The probability of a dollar bill not being withdrawn is (1-p), which means that the sum of bernoulli variables would be Binomial(X_n, 1-p). – Alex Peniz Feb 06 '21 at 22:35
  • Okay. You write in your post that $$X_n=\text{Binom}(X_{n-1},p)+\text{Poisson}(\lambda)$$ I think your $p$ should be a $1-p$. You also write in the beginning of your post that the amount withdrawn each day is geometric when I think you mean to write that it's binomial. – Matthew Pilling Feb 06 '21 at 22:38
  • Got it, just fixed it – Alex Peniz Feb 06 '21 at 22:39
  • The amount of cash the driver deposits is Poisson. – Alex Peniz Feb 06 '21 at 22:49

1 Answer


Given a random variable $X$ let $M_{X}(t)=\mathbb{E}\big(e^{tX}\big)$ and $G_{X}(t)=\mathbb{E}\big(t^{X}\big)$. Evidently we have $G_{X}(t)=M_{X}\big(\ln(t)\big)$.

You have $Y_{n}\mid X_{n-1}\sim \text{Binomial}(X_{n-1},1-p)$, so with this and the law of total expectation $$M_{Y_n}(t)=\mathbb{E}(e^{tY_n})=\mathbb{E}\Big(\mathbb{E}(e^{tY_n}\mid X_{n-1})\Big)=\mathbb{E}\Big(\big((1-p)e^t+p\big)^{X_{n-1}}\Big)$$ In other words, $$M_{Y_n}(t)=G_{X_{n-1}}\big((1-p)e^t+p\big)=M_{X_{n-1}}\Big(\ln\big((1-p)e^t+p\big)\Big)$$

Combining this result with the independence of $Y_n$ and $Z_n$ we get $$M_{X_n}(t)=M_{Y_n}(t)M_{Z_n}(t)=M_{X_{n-1}}\Big(\ln\big((1-p)e^t+p\big)\Big)\cdot e^{\lambda(e^t-1)}$$

It's relatively easy to show (substitute $\ln\big((1-p)^ke^t+1-(1-p)^k\big)$ for $t$ in the previous line) that $$M_{X_{n-k}}\Big(\ln\big((1-p)^ke^t+1-(1-p)^k\big)\Big)=M_{X_{n-k-1}}\Big(\ln\big((1-p)^{k+1}e^t+1-(1-p)^{k+1}\big)\Big)e^{\lambda (1-p)^k (e^t-1)}$$ Applying this for $k=0,1,\dots,n-1$ and summing the geometric series in the exponent yields $$M_{X_n}(t)=M_{X_0}\Big(\ln\big((1-p)^n e^t +1-(1-p)^n\big)\Big)e^{\frac{\lambda(e^t-1)(1-(1-p)^n)}{p}}$$

If $n$ is large then $(1-p)^n\approx 0$, so $$M_{X_n}(t)\approx M_{X_0}\big(\ln(1)\big)e^{\frac{\lambda(e^t-1)}{p}}=M_{X_0}(0)e^{\frac{\lambda(e^t-1)}{p}}=e^{\frac{\lambda(e^t-1)}{p}}$$ We now see the limiting distribution of $X_n$ is $\text{Poisson}\Big(\frac{\lambda}{p}\Big)$. Note that $M_{X_0}(0)=\mathbb{E}\big(e^{0\cdot X_0}\big)=1$ no matter how $X_0$ is distributed, so the limit does not depend on the initial amount of cash.
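As a quick numerical sanity check (not part of the proof; the parameters $\lambda=3$, $p=0.4$ and the truncation level $N$ are arbitrary choices), one can iterate the exact truncated pmf of the chain and compare it with $\text{Poisson}(\lambda/p)$:

```python
import numpy as np
from scipy.stats import binom, poisson

# Iterate the exact (truncated) pmf of X_n = Binomial(X_{n-1}, 1-p) + Poisson(lam)
# and compare with the claimed Poisson(lam/p) limit. Parameter values are arbitrary.
lam, p, N = 3.0, 0.4, 120
k = np.arange(N + 1)

thin = np.array([binom.pmf(k, n, 1 - p) for n in k])  # thin[n, j] = P(j bills survive | X = n)
deposit = poisson.pmf(k, lam)                         # pmf of the morning deposit

pmf = poisson.pmf(k, lam)                             # X_0 ~ Poisson(lam): first deposit into an empty ATM
for _ in range(60):                                   # (1-p)^60 ~ 1e-13, so 60 days is plenty
    pmf = np.convolve(pmf @ thin, deposit)[: N + 1]   # binomial thinning, then add the new deposit

print(np.abs(pmf - poisson.pmf(k, lam / p)).max())    # should be ~0 up to truncation error
```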