
I'm having a difficult time figuring out where to go here.

Question: Let $Y_1,\dots, Y_n$ be iid random variables with pdf
$f_Y(y) = \theta e^{-\theta y} \;,\; y >0\;,\;\theta >0.$
Show that the uniformly minimal variance unbiased estimator (UMVUE) of $\theta$ is given by $ \frac{n-1}{\sum_{i=1}^n Y_i}.$

We were given the following theorem in class:
Theorem: If $Y_1,\ldots,Y_n$ are iid random variables whose common distribution belongs to a one-parameter exponential family with parameter $\theta$, then, under some technical conditions, $U = \sum_{i=1}^n t(Y_i)$ is a complete and sufficient statistic for $\theta$.

My work: Our pdf belongs to an exponential family with $h(y) = 1$, $c(\theta) = \theta$, $w(\theta) = -\theta$, and $t(y) = y$. Therefore, $U = \sum_{i=1}^n Y_i$ is a complete and sufficient statistic for $\theta$. Now, to show that it is the UMVUE of $\theta$, I need to show that it is unbiased.

$B(\hat{\theta}) = E[\hat{\theta}] - \theta$.
We can rewrite our pdf to be $\frac{1}{\frac{1}{\theta}}e^{\frac{-y}{\frac{1}{\theta}}}$, which is the exponential with $\beta = \frac{1}{\theta}$, so $E[Y] = \frac{1}{\theta}$. Therefore

$\begin{align} E[\hat{\theta}] &= E\left[\sum_{i=1}^n Y_i\right] = \sum_{i=1}^n E[Y_i] = nE[Y] = \frac{n}{\theta}, \notag \\ B(\hat{\theta}) &= \frac{n}{\theta} - \theta, \notag \end{align}$ which will be unbiased if $\hat{\theta} = \frac{\theta^2}{n}E[nY] = Var(\bar{Y})E[nY]$. At this point I feel like I've sort of lost the plot and am no longer sure what to do.


1 Answer


The joint density of your sample is $$ f(\mathbf y) \propto \exp \left(-\theta \sum_{i=1}^n y_i + n\log(\theta) \right), $$ so, by the exponential-family result you quoted, $T = \sum_{i=1}^n Y_i$ is indeed a complete and sufficient statistic for $\theta$.

For UMVUEs, the standard tool is the Lehmann–Scheffé theorem, which states that if an unbiased estimator of $\theta$ is a function of a complete sufficient statistic, then it is automatically the UMVUE of $\theta$.

In our case, clearly $$\frac{n-1}{\sum_{i=1}^n Y_i} = \frac{n-1}{T}$$ is a function of $T$, so if it has expectation $\theta$, we are done.
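The only distributional fact needed for that expectation is that $T \sim \operatorname{Gamma}(n, \theta)$ (shape $n$, rate $\theta$); this is the fact used in the computation below. In case it helps, here is a quick sketch of the standard moment-generating-function argument for it (nothing specific to this problem): for a single observation, $$M_{Y_1}(s) = \mathbb E\left[e^{sY_1}\right] = \int_0^\infty e^{sy}\,\theta e^{-\theta y}\,dy = \frac{\theta}{\theta - s}, \qquad s < \theta,$$ and since the $Y_i$ are iid, $$M_T(s) = \prod_{i=1}^n M_{Y_i}(s) = \left(\frac{\theta}{\theta - s}\right)^n,$$ which is precisely the MGF of a $\operatorname{Gamma}(n, \theta)$ distribution.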

In particular, since $\sum_{i=1}^n Y_i \sim \operatorname{Gamma}(n, \theta)$ (a standard fact about sums of iid exponential random variables), we have \begin{align*} \mathbb E\left[\frac{n-1}{\sum_{i=1}^n Y_i}\right] &= \int_0^\infty \frac{n-1}{y} \frac{\theta^{n}}{\Gamma(n)}y^{n-1}e^{-\theta y}\,dy \\ &= \frac{\theta^n}{\Gamma(n-1)} \int_0^\infty y^{n-2}e^{-\theta y}\,dy \qquad \text{since } \Gamma(n) = (n-1)\Gamma(n-1) \\ &= \frac{\theta}{\Gamma(n-1)} \int_0^\infty t^{n-2}e^{-t}\,dt \qquad \text{after the substitution } t = \theta y \\ &= \frac{\theta}{\Gamma(n-1)}\Gamma(n-1) \qquad \text{by definition of the $\Gamma$ function} \\ &= \theta, \end{align*} so we conclude that $\frac{n-1}{\sum_{i=1}^n Y_i}$ is the UMVUE of $\theta$ by Lehmann–Scheffé.
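As a sanity check on the algebra (not part of the proof), a short Monte Carlo simulation is an easy way to see the unbiasedness numerically. The sketch below assumes NumPy is available; the values of $\theta$, $n$, and the number of replications are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo check that (n - 1) / sum(Y_i) has expectation theta.
# theta, n, and reps are illustrative choices, not dictated by the problem.
rng = np.random.default_rng(0)
theta = 2.5       # true rate parameter
n = 10            # sample size
reps = 200_000    # number of simulated samples

# Each row is one iid sample Y_1, ..., Y_n from Exponential(rate = theta);
# NumPy's exponential() takes the scale parameter 1/theta.
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))

umvue = (n - 1) / samples.sum(axis=1)   # the estimator from the problem
mle = n / samples.sum(axis=1)           # the MLE, for comparison

print("true theta:              ", theta)
print("mean of (n-1)/sum(Y_i):  ", umvue.mean())   # close to theta
print("mean of n/sum(Y_i) (MLE):", mle.mean())     # close to theta * n/(n-1)
```

The average of the $(n-1)/\sum_i Y_i$ values should settle near the chosen $\theta$, while the MLE $n/\sum_i Y_i$ overshoots by roughly the factor $n/(n-1)$, which is exactly the bias that the $n-1$ numerator corrects.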
