
I am stuck on the following problem. I believe that my solution is right so far, but I do not know how to finish the problem. Ideally, I would like to do this problem without using moment generating functions or the idea of the convolution. Maybe that is not a realistic goal.
Thanks,
Bob
Problem:
Let $X$ and $Y$ be independent binomial r.v.'s with parameters $(n,p)$ and $(m,p)$, respectively. Let $Z = X + Y$. What is the distribution of $Z$?
Answer:
\begin{eqnarray*} P(Z = k) &=& \sum_{i = 0}^{k} P(X = i)P(Y = k-i) \\ &=& \sum_{i = 0}^{k} {n \choose i}p^i(1-p)^{n-i} {m \choose {k-i}} p^{k-i}(1- p)^{m -(k-i)} \\ &=& \sum_{i = 0}^{k} {n \choose i}{m \choose {k-i}} p^k(1-p)^{n-i}(1- p)^{m -k+i} \\ &=& p^k(1-p)^{n+m-k}\sum_{i = 0}^{k} {n \choose i}{m \choose {k-i}} \end{eqnarray*}
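As a numerical sanity check of the derivation (not a substitute for finishing the proof), the convolution sum above can be compared against the conjectured Binomial$(n+m,p)$ pmf for some arbitrarily chosen parameters:

```python
from math import comb

# Arbitrary example parameters for the check
n, m, p = 5, 7, 0.3

def binom_pmf(j, size, p):
    """Binomial(size, p) probability of exactly j successes."""
    return comb(size, j) * p**j * (1 - p)**(size - j)

for k in range(n + m + 1):
    # The convolution sum from the derivation above; the index range
    # is trimmed so that both binomial coefficients are well defined.
    conv = sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p)
               for i in range(max(0, k - m), min(n, k) + 1))
    # Candidate closed form: Binomial(n + m, p)
    closed = binom_pmf(k, n + m, p)
    assert abs(conv - closed) < 1e-12
```

If the assertion passes for every $k$, the convolution agrees term by term with the Binomial$(n+m,p)$ pmf for these parameters.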

Bob
    Reference: https://math.stackexchange.com/questions/1176385/sum-of-two-independent-binomial-variables – asdf Jul 02 '18 at 10:59

2 Answers


The identity $\sum_{i=0}^{k}\binom n i \binom m {k-i} = \binom {n+m} k$ is Vandermonde's identity. Apply it to the last line of your derivation to conclude that $Z \sim \text{Binomial}(n+m, p)$.
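If it helps to convince yourself of the identity before proving it, a quick brute-force check over small parameters (an illustration only, not a proof) can be written as:

```python
from math import comb

def vandermonde_holds(n, m, k):
    """Check sum_i C(n, i) * C(m, k - i) == C(n + m, k) exactly."""
    lhs = sum(comb(n, i) * comb(m, k - i)
              for i in range(max(0, k - m), min(n, k) + 1))
    return lhs == comb(n + m, k)

# Spot-check all small cases exhaustively
assert all(vandermonde_holds(n, m, k)
           for n in range(8) for m in range(8)
           for k in range(n + m + 1))
```

The combinatorial proof is the same idea in words: choosing $k$ objects from $n+m$ means choosing $i$ from the first $n$ and $k-i$ from the remaining $m$, summed over $i$.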


You can go on with:$$=p^k(1-p)^{n+m-k}\sum_{i=0}^k\binom{n}{i}\binom{m}{k-i}=p^k(1-p)^{n+m-k}\binom{n+m}k,$$ which is the pmf of a $\text{Binomial}(n+m,p)$ distribution.

Btw, you can also deduce more directly that the sum of two independent binomials with equal parameter $p$ is binomial again.

This is based on the fact that a binomial random variable is a sum of i.i.d. Bernoulli-distributed random variables.

See here for that.
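That Bernoulli-sum viewpoint can also be illustrated with a small simulation (a sketch with arbitrarily chosen parameters, not part of the argument): summing $n+m$ independent Bernoulli$(p)$ trials directly and comparing the empirical frequencies with the Binomial$(n+m,p)$ pmf.

```python
import random
from math import comb

random.seed(0)
n, m, p = 5, 7, 0.3
trials = 100_000

# X is a sum of n i.i.d. Bernoulli(p) trials and Y of m more independent
# ones, so Z = X + Y is simply a sum of n + m Bernoulli(p) trials.
counts = [0] * (n + m + 1)
for _ in range(trials):
    z = sum(1 for _ in range(n + m) if random.random() < p)
    counts[z] += 1

# Empirical frequencies should track the Binomial(n + m, p) pmf
for k in range(n + m + 1):
    empirical = counts[k] / trials
    exact = comb(n + m, k) * p**k * (1 - p)**(n + m - k)
    assert abs(empirical - exact) < 0.01  # Monte Carlo tolerance
```

Since the simulation never distinguishes which trials belong to $X$ and which to $Y$, it makes the identification $X + Y \sim \text{Binomial}(n+m, p)$ visually immediate.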

drhab