
The statement I have to prove is that $E(X) = np$ when $X$ is a binomial random variable with parameters $n$ and $p$.

The definition of a binomial random variable tells us that $$P(X=k) = {n \choose k}p^k(1-p)^{n-k}$$

And the definition of expectation for a random variable defined on a sample space $(S,P) $ is $$E(X) = \sum_{a\in \mathbb{R}}aP(X=a)$$

So using that definition I calculate $$0{n\choose 0}p^0(1-p)^{n} + 1{n\choose 1}p^1(1-p)^{n-1}+2{n\choose 2}p^2(1-p)^{n-2}+...+n{n\choose n}p^n(1-p)^{0} = \sum_{k=0}^nk{n\choose k}p^k(1-p)^{n-k}$$
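
As a quick sanity check (assuming I haven't slipped up), with $n=2$ this sum is $$0\cdot(1-p)^2 + 1\cdot 2p(1-p) + 2\cdot p^2 = 2p(1-p) + 2p^2 = 2p,$$ which does match $np$.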

So applying the binomial theorem (with $x=1-p$ and $y=p$) seems obvious, since the binomial theorem says that $$\sum_{k=0}^n{n\choose k}y^kx^{n-k} = (x+y)^n$$
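
Applied directly (ignoring the extra factor of $k$ in my sum), that substitution only tells me that the probabilities add up to one: $$\sum_{k=0}^n{n\choose k}p^k(1-p)^{n-k} = (p+(1-p))^n = 1.$$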

But I can't seem to reconcile this with the result I was trying to prove, which at this point would be proved if I could show that $$\sum_{k=0}^nk{n\choose k}p^k(1-p)^{n-k} = np$$

Does anyone have any hints or ideas as to where to go from here?

James Ronald
  • A google search would have given you what you were looking for. Also https://en.wikipedia.org/wiki/Binomial_distribution#Expectation. – StubbornAtom Oct 19 '19 at 16:49
  • The sum of independent Bernoulli random variables $X_{1},X_{2},\ldots, X_{n}$ with parameter $p$ is a Binomial random variable $X$ with parameters $n$ and $p$. Since $\textbf{E}(X_{k}) = p$ and $\textbf{Var}(X_{k}) = p(1-p)$, we conclude that $\textbf{E}(X) = np$ and $\textbf{Var}(X) = np(1-p)$. – user0102 Oct 19 '19 at 17:28

3 Answers


One strategy is to consider the function $G_X(t):=\sum_{k=0}^n{n\choose k}(pt)^k(1-p)^{n-k}$, which you may have noted is the expectation of $t^X$. (This is called the probability-generating function of $X$, since its $t^k$ coefficient is $P(X=k)$.) Then $G_X^\prime(1)=\left.E(Xt^{X-1})\right|_{t=1}=E(X)$. In this case,$$G_X(t)=(1-p+pt)^n\implies G_X^\prime(t)=np(1-p+pt)^{n-1}\implies G_X^\prime(1)=np.$$But while PGFs are worth knowing about, there's a much easier solution to the problem. Since $X$ is the sum of $n$ variables that are each $1$ with probability $p$ and $0$ otherwise, they each have mean $p$, and their sum has mean $np$.
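
The closed form for $G_X$ is exactly the binomial theorem from the question, applied with $y=pt$ and $x=1-p$: $$\sum_{k=0}^n{n\choose k}(pt)^k(1-p)^{n-k}=(pt+(1-p))^n.$$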

J.G.

Note that $$k\binom{n}{k}p^k(1-p)^{n-k} = np \binom{n-1}{k-1}p^{k-1}(1-p)^{(n-1)-(k-1)}$$
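
The coefficient identity behind this is a one-line factorial manipulation (valid for $k\ge 1$): $$k\binom{n}{k} = \frac{n!}{(k-1)!(n-k)!} = n\cdot\frac{(n-1)!}{(k-1)!\,\big((n-1)-(k-1)\big)!} = n\binom{n-1}{k-1},$$ so after factoring out $np$, the remaining sum is the total probability of a Binomial$(n-1,p)$ distribution, which is $1$.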

Michael Biro

You only need to realize the following:

  • $k\binom{n}{k} = n \binom{n-1}{k-1}$
  • Use this and factor out $np$
  • Then, apply the binomial theorem with $\color{blue}{n-1}$

\begin{eqnarray*}\sum_{k=0}^nk{n\choose k}p^k(1-p)^{n-k} &= & \sum_{k=\color{blue}{1}}^nn{n-1\choose k-1}p^k(1-p)^{n-k}\\ &= & np\sum_{k=\color{blue}{1}}^n{n-1\choose k-1}p^{k-1}(1-p)^{n-k}\\ &= & np\underbrace{\sum_{k=\color{blue}{0}}^{\color{blue}{n-1}}{n-1\choose k}p^{k}(1-p)^{n-1-k}}_{= (p+(1-p))^{\color{blue}{n-1}}=1}\\ &= & np \end{eqnarray*}

trancelocation