I'm a little stuck on how to solve this problem. Suppose I have an ATM that starts with zero dollars. Every morning, a truck deposits a random number of dollar bills into the ATM so people can withdraw them. The amount of cash deposited is Poisson with mean $\lambda$. The total amount withdrawn during the day is binomially distributed with parameter $p$; that is, every dollar in the ATM has probability $p$ of being withdrawn, independently. Cash that is not withdrawn by the end of the day is left in the ATM. Find the stationary distribution of the amount of cash in the ATM.
I have already shown that the amount of cash in the ATM is a Markov chain. What I tried was writing the amount of cash in the ATM on day $n$ as the sum of a Binomial and a Poisson random variable, since the amount of cash on day $n$ equals the number of dollar bills not withdrawn the previous day plus the new cash deposited in the morning. So I had
$$X_n = \operatorname{Binom}(X_{n-1},\, 1-p) + \operatorname{Poisson}(\lambda)$$ Writing $X_n = Y_n + Z_n$, where $Y_n \mid X_{n-1} \sim \operatorname{Binom}(X_{n-1},\, 1-p)$ and $Z_n \sim \operatorname{Poisson}(\lambda)$ are independent, $$ G_{X_n}(s) = E\!\left[s^{Y_n + Z_n}\right] = E\!\left[s^{Y_n}\right] E\!\left[s^{Z_n}\right] = G_{Y_n}(s)\, G_{Z_n}(s) $$ $$ E\!\left[s^{Y_n} \mid X_{n-1}\right] = ((1-p)s + p)^{X_{n-1}} \implies G_{Y_n}(s) = G_{X_{n-1}}((1-p)s + p) $$ $$ G_{X_n}(s) = G_{X_{n-1}}((1-p)s + p) \cdot e^{\lambda (s-1)} $$
where $G_{X}(s) = E[s^{X}]$ denotes the probability generating function. Iterating the recurrence gives
\begin{align*} G_{X_n}(s) &= G_{X_{n-1}}((1-p)s + p) \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-2}}\big((1-p)[(1-p)s + p] + p\big) \cdot e^{\lambda ([(1-p)s + p]-1)} \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-2}}\big((1-p)^2 s + (1-p)p + p\big) \cdot e^{\lambda ((1-p)s + p - 1)} \cdot e^{\lambda (s-1)} \\ &= G_{X_{n-3}}\big((1-p)^3 s + (1-p)^2 p + (1-p)p + p\big) \cdot e^{\lambda ((1-p)^2 s + (1-p)p + p - 1)} \cdot e^{\lambda ((1-p)s + p - 1)} \cdot e^{\lambda (s-1)} \\ &\;\;\vdots \\ &= G_{X_{n-k}}\Big((1-p)^k s + p \sum_{i=0}^{k-1} (1-p)^i\Big) \cdot \prod_{j=0}^{k-1} e^{\lambda \left((1-p)^j s + p\sum_{i=0}^{j-1}(1-p)^i - 1\right)} \end{align*}
If we take the limit as $k \to \infty$, the argument of $G_{X_{n-k}}$ tends to $1$, and since any PGF satisfies $G(1) = 1$, we are left with
$$ G_{X_n}(s) = \prod_{j=0}^{\infty} e^{\lambda \left((1-p)^j s + p\sum_{i=0}^{j-1}(1-p)^i - 1\right)} $$
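For what it's worth, each factor in the product seems to simplify, since $p\sum_{i=0}^{j-1}(1-p)^i = 1-(1-p)^j$, so the exponent of the $j$-th factor collapses to $\lambda(1-p)^j(s-1)$. A quick numeric check of this identity (the values of `lam`, `p`, `s` below are arbitrary choices of mine):

```python
import math

lam, p, s = 3.0, 0.4, 0.7   # arbitrary example values

factors, simplified = [], []
for j in range(5):
    geom = sum((1 - p) ** i for i in range(j))  # sum_{i=0}^{j-1} (1-p)^i
    # Factor exactly as it appears in the product above
    factors.append(math.exp(lam * ((1 - p) ** j * s + p * geom - 1)))
    # The same factor after simplifying the exponent
    simplified.append(math.exp(lam * (1 - p) ** j * (s - 1)))
    print(j, factors[-1], simplified[-1])
```

The two columns printed agree, which suggests the product reduces to a single exponential with a geometric series in the exponent.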
My friend told me he got $G_{X_n}(s) = e^{(\lambda/p)(s-1)}$, which seems crazy but works out when I compare it against a simulation. Can anybody point out how to continue from here?
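To make the simulation comparison concrete, this is the sort of sketch I mean — a minimal version with arbitrary parameter values of my choosing. It runs the chain $X_n = \operatorname{Binom}(X_{n-1},\, 1-p) + \operatorname{Poisson}(\lambda)$ and compares the long-run empirical mean against $\lambda/p$:

```python
import math
import random

random.seed(0)
lam, p = 3.0, 0.4              # arbitrary example parameters
burn_in, n_days = 1_000, 200_000

def poisson(mean):
    """Sample Poisson(mean) via inversion by multiplication (fine for small means)."""
    threshold, k, prod = math.exp(-mean), 0, random.random()
    while prod > threshold:
        prod *= random.random()
        k += 1
    return k

x = 0                          # cash in the ATM, X_0 = 0
total = 0
for day in range(burn_in + n_days):
    # Each of the x dollars survives the day's withdrawals w.p. 1-p: Binom(x, 1-p)
    survived = sum(random.random() < 1 - p for _ in range(x))
    # Morning deposit: Poisson(lam)
    x = survived + poisson(lam)
    if day >= burn_in:
        total += x

print(total / n_days, lam / p)  # empirical long-run mean vs lam/p
```

After the burn-in, the empirical mean settles near $\lambda/p$, consistent with the stationary distribution having that mean.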