3

Given the following two exercises:

Let $X_1,..., X_n$ be a random sample from a $Pois(\mu)$ distribution. Consider the following estimator for $e^{-\mu}=P(X_i=0)$: $T=e^{-\overline{X_n}}$.

The independent random variables $X_1,\dots,X_n$ have a geometric distribution with parameter $p$. Consider the following estimator for $p$: $S=\frac{1}{\overline{X_n}}$.

Prove that the estimators are biased.

In my opinion both estimators are unbiased:
$E[T]=e^{-E[\overline{X_n}]}=e^{-\mu}$, which is unbiased for the parameter $e^{-\mu}$.
$E[S]=\frac{1}{E[\overline{X_n}]}=\frac{1}{1/p}=p$, which is unbiased for the parameter $p$.
Why am I wrong in both cases? Where are my mistakes? Thanks.
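
A quick Monte Carlo check (just a sketch; the values $\mu=2$, $p=0.4$, $n=10$ are arbitrary choices of mine) already suggests both estimators are biased:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, p, n, reps = 2.0, 0.4, 10, 200_000

# Poisson case: T = exp(-Xbar) as an estimator of exp(-mu)
T = np.exp(-rng.poisson(mu, size=(reps, n)).mean(axis=1))
print(T.mean(), np.exp(-mu))   # ~0.149 vs ~0.135: biased upward

# geometric case (support 1, 2, ...): S = 1/Xbar as an estimator of p
S = 1 / rng.geometric(p, size=(reps, n)).mean(axis=1)
print(S.mean(), p)             # ~0.42 vs 0.40: biased upward
```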

TFAE
  • 399
  • Sorry but if you think that $T=e^{-\bar X}$ implies $E(T)=e^{-E(\bar X)}$ and that $S=\frac1{\bar X}$ implies $E(S)=\frac1{E(\bar X)}$, then you have some serious revising to do... FYI, if $Z>0$ almost surely then $E\left(\frac1Z\right)=\frac1{E(Z)}$ never happens, except if $Z$ is constant. – Did Jan 13 '19 at 17:34
  • The mistake is concluding $E(g(X))=g(E(X))$ for an arbitrary function $g$. You might want to take a look at Jensen's inequality. – StubbornAtom Jan 13 '19 at 17:50
  • You are right, thanks! :) – TFAE Jan 13 '19 at 20:27
  • This is odd, it seems the accepted answer is addressing none of your two questions. Please explain. – Did Jan 13 '19 at 20:33

3 Answers

4

You can't write $$E\left\{e^{-\overline{X_n}}\right\}=e^{-E\left\{\overline{X_n}\right\}},$$ but we do have $$\begin{aligned}E\left\{e^{-\overline{X_n}}\right\}&=E\left\{e^{-X_1/n}\,e^{-X_2/n}\cdots e^{-X_n/n}\right\}\\&=\left(E\left\{e^{-X_1/n}\right\}\right)^n\\&=\left(\exp\left(\mu\left(e^{-1/n}-1\right)\right)\right)^n\\&=\exp\Big(\mu n\left(e^{-1/n}-1\right)\Big),\end{aligned}$$ using the moment generating function of the Poisson distribution, $E\left\{e^{tX_1}\right\}=\exp(\mu(e^t-1))$, at $t=-1/n$. Since this differs from $e^{-\mu}$, the estimator $T$ is biased.
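
For what it's worth, this closed form agrees with simulation; a minimal sketch (my own addition, with arbitrary $\mu$ and $n$):

```python
import numpy as np

mu, n, reps = 2.0, 5, 500_000
rng = np.random.default_rng(1)

closed_form = np.exp(mu * n * (np.exp(-1 / n) - 1))  # E{e^{-Xbar}} derived above
simulated = np.exp(-rng.poisson(mu, size=(reps, n)).mean(axis=1)).mean()

# first two agree (~0.163); both differ from e^{-mu} (~0.135)
print(closed_form, simulated, np.exp(-mu))
```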

As with the first one, we can argue similarly for the second: $$\begin{aligned}E\{S\}&=E\left\{\frac{n}{X_1+\cdots+X_n}\right\}\\&=nE\left\{\frac{1}{X_1+\cdots+X_n}\right\}\\&=nE\left\{\frac{1}{Y}\right\},\end{aligned}$$ where $Y$ has a negative binomial distribution with parameters $p$ and $n$. Further calculations are messy in this case, but you can refer to the Parameter estimation section of the Wikipedia article on the geometric distribution.
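
Even without a closed form, the bias is easy to gauge numerically via the negative binomial representation $Y=X_1+\cdots+X_n$; a rough sketch (arbitrary $p$ and $n$ of my choosing):

```python
import numpy as np

p, n, reps = 0.3, 8, 500_000
rng = np.random.default_rng(2)

# Y = X_1 + ... + X_n with each X_i geometric on {1, 2, ...}, so Y is negative binomial
Y = rng.geometric(p, size=(reps, n)).sum(axis=1)
print((n / Y).mean(), p)  # ~0.33 vs 0.30: E{S} = n E{1/Y} > p, as Jensen's inequality predicts
```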

Mostafa Ayaz
  • 33,056
  • Thanks for the answer. What about the second example: for $T=\frac{\#\{i: X_i=1\}}{n}$, where $\#A$ stands for the number of elements in $A$, why is it unbiased? Is it not $E[S]=\frac{1}{E[\overline{X_n}]}=\frac{1}{1/p}=p$? – TFAE Jan 13 '19 at 20:26
  • You're welcome. Sorry, I can't quite read what you've written, but if I understand you correctly, this is equivalent to saying $$E\left\{\frac1Y\right\}=\frac{1}{E\{Y\}},$$ or in integral form $$\int \frac{1}{y}f(y)\,dy=\frac{1}{\int yf(y)\,dy},$$ which doesn't hold in general. You can also refer to the Cauchy–Schwarz inequality for the cases where it does hold. – Mostafa Ayaz Jan 13 '19 at 20:30
  • Sorry, I fixed my question :) – TFAE Jan 13 '19 at 20:31
  • Another way of saying that is $$\int \frac{1}{y}f(y)\,dy\cdot\int yf(y)\,dy=1.$$ – Mostafa Ayaz Jan 13 '19 at 20:31
  • Nice question nonetheless (+1). – Mostafa Ayaz Jan 13 '19 at 20:32
  • Thanks a lot for the help, have a nice day! – TFAE Jan 13 '19 at 20:33
  • Thanks! You too. – Mostafa Ayaz Jan 13 '19 at 20:33
  • Sorry Mostafa, I promise this is the last one: how can I prove that $T=\frac{\#\{i: X_i=0\}}{n}$ is an unbiased estimator for $e^{-\mu}$ under a $Pois(\mu)$ model? Thanks again. – TFAE Jan 13 '19 at 20:58
  • Is it not $E[T]=\frac{n \cdot \mu}{n}=\mu$? – TFAE Jan 13 '19 at 20:59
  • The answer to that is negative: $E\{T\}\ne\mu$ here. Write $T=\frac1n\sum_{i=1}^n \mathbf{1}\{X_i=0\}$; each indicator is Bernoulli with $E\{\mathbf{1}\{X_i=0\}\}=P(X_i=0)=e^{-\mu}$, so by linearity $$E\{T\}=\frac1n\sum_{i=1}^n P(X_i=0)=e^{-\mu},$$ which shows this estimator is indeed unbiased for $e^{-\mu}$. Linearity is exactly what was missing in the exponential and reciprocal cases. – Mostafa Ayaz Jan 13 '19 at 21:04
3

The problem is that the exponential and the reciprocal aren't linear. You can't get an unbiased estimator for the standard deviation by taking the square root of an unbiased estimator for the variance, and we have the same problem here.
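
The square-root example is easy to check numerically; a minimal sketch (normal data; $\sigma$ and $n$ are arbitrary choices of mine):

```python
import numpy as np

sigma, n, reps = 2.0, 5, 400_000
rng = np.random.default_rng(3)

samples = rng.normal(0.0, sigma, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)   # the usual unbiased estimator of sigma^2
print(s2.mean(), sigma**2)         # ~4.00 vs 4.00: unbiased for the variance
print(np.sqrt(s2).mean(), sigma)   # ~1.88 vs 2.00: its square root underestimates sigma
```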

In order to see these examples more clearly, consider one-element samples. In the Poisson example, we get a probability of $e^{-\mu}$ of sample value $0$ and estimate $1$, a probability of $\mu e^{-\mu}$ of sample value $1$ and estimate $e^{-1}$, a probability of $\frac{\mu^2}{2}e^{-\mu}$ of sample value $2$ and estimate $e^{-2}$, and so on. Add them up, and the expected value of what we get is a value of the moment-generating function: $\sum_{n=0}^{\infty} \frac{\mu^n e^{-n}}{n!}e^{-\mu} = e^{\mu(e^{-1}-1)}$. That's not the $e^{-\mu}$ we wanted, and there's no simple way to correct it.

The second example, on the geometric distribution, has similar issues. The probability of getting $n$ is $p(1-p)^{n-1}$ for $n=1,2,\dots$, so our one-element estimate returns $\frac1n$ with that probability. Sum that, and we get $\sum_{n=1}^{\infty}\frac{p(1-p)^{n-1}}{n} = \frac{p}{1-p}\ln\frac1p$. That's definitely not what we want: it is strictly larger than $p$ for all $0<p<1$, exactly as Jensen's inequality predicts.
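
Both one-element expectations are easy to verify numerically (a sketch of mine; the values of $\mu$ and $p$ are arbitrary):

```python
import math

mu, p = 2.0, 0.4

# Poisson one-element estimator e^{-X}: expected value is the MGF at t = -1
exact_pois = math.exp(mu * (math.exp(-1) - 1))
series_pois = math.exp(-mu) * sum(math.exp(-k) * mu**k / math.factorial(k) for k in range(60))
print(exact_pois, series_pois, math.exp(-mu))  # first two agree (~0.282); target is ~0.135

# geometric one-element estimator 1/X on {1, 2, ...}
exact_geom = p * math.log(1 / p) / (1 - p)
series_geom = sum(p * (1 - p)**(k - 1) / k for k in range(1, 2000))
print(exact_geom, series_geom, p)              # first two agree (~0.611); target is 0.4
```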

jmerry
  • 19,943
  • Thanks for the answer. What about the second example: for $T=\frac{\#\{i: X_i=1\}}{n}$, where $\#A$ stands for the number of elements in $A$, why is it unbiased? Is it not $E[S]=\frac{1}{E[\overline{X_n}]}=\frac{1}{1/p}=p$? – TFAE Jan 13 '19 at 20:27
  • Comments on previous answers are not a good place to ask new questions. Asking about a new estimator not even hinted at in your original question doesn't belong here. – jmerry Jan 13 '19 at 20:37
3

Here $n < +\infty$. We compute $\Bbb E (\exp(-\bar X_n))$, where $\bar X_n$ is the mean of an i.i.d. sample $(X_1,\dots,X_n)$ with distribution $\text{Poisson}(\mu)$: \begin{aligned} \Bbb E (\exp(-\bar X_n)) &= \Bbb E \left(\prod_{i=1}^n \exp\left(-\frac{X_i}{n}\right)\right) \\ &= \prod_{i=1}^n \Bbb E \left( \exp\left(-\frac{X_i}{n}\right)\right) \\ &= \left(\Bbb E \exp\left(-\frac{X_1}{n}\right)\right)^n \\ &= \left(e^{-\mu}\sum_{k=0}^\infty \frac{(e^{-1/n} \mu)^k}{k!} \right)^n \\ &= \exp\left(n (e^{-1/n}-1)\mu\right) & \neq \exp (-\mu) \end{aligned} One can note that the bias vanishes as $n\to +\infty$, since $n(e^{-1/n}-1)\to -1$. For the second example, one can prove that the estimator is biased by using Jensen's inequality (see this post).
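
To see the vanishing bias concretely, one can tabulate the closed form for growing $n$; a quick sketch (the value of $\mu$ is an arbitrary choice of mine):

```python
import math

mu = 2.0
for n in (1, 2, 5, 10, 100, 1000):
    val = math.exp(n * (math.exp(-1 / n) - 1) * mu)  # E(exp(-Xbar_n)) from above
    print(n, val, val - math.exp(-mu))               # the gap shrinks toward 0 as n grows
```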

EditPiAf
  • 21,328