I'm having a difficult time figuring out where to go here.
Question: Let $Y_1,\dots, Y_n$ be iid random variables with pdf
$f_Y(y) = \theta e^{-\theta y} \;,\; y >0\;,\;\theta >0.$
Show that the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$ is given by $\frac{n-1}{\sum_{i=1}^n Y_i}$.
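(Before attempting the proof, I ran a quick Monte Carlo sanity check — this assumes `numpy`, and the values of $n$ and $\theta$ are arbitrary choices of mine. The mean of $\frac{n-1}{\sum Y_i}$ across replications does land near $\theta$, while $\frac{n}{\sum Y_i}$ overshoots, so the claim looks plausible.)

```python
import numpy as np

# Hypothetical sanity check: theta, n, and reps are arbitrary choices.
rng = np.random.default_rng(0)
theta, n, reps = 2.5, 10, 200_000

# Each row is one sample Y_1, ..., Y_n; Exponential(theta) has mean 1/theta.
samples = rng.exponential(scale=1 / theta, size=(reps, n))
u = samples.sum(axis=1)        # U = sum of the Y_i in each replication

print(np.mean((n - 1) / u))    # close to theta = 2.5 (the claimed UMVUE)
print(np.mean(n / u))          # overshoots: comes out near n*theta/(n-1)
```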
We were given the following theorem in class:
Theorem: If $Y_1,\ldots,Y_n$ are iid random variables whose common distribution belongs to a one-parameter exponential family with parameter $\theta$, then, under some technical conditions, $U = \sum_{i=1}^n t(Y_i)$ is a complete and sufficient statistic for $\theta$.
My work: Our pdf belongs to an exponential family with $h(y) = 1$, $c(\theta) = \theta$, $w(\theta) = -\theta$, and $t(y) = y$. Therefore, $U = \sum_{i=1}^n Y_i$ is a complete and sufficient statistic for $\theta$. Now, to get the UMVUE of $\theta$, I need an unbiased estimator based on $U$, so I started by checking whether $U$ itself is unbiased.
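(Writing out the factorization explicitly, assuming the standard form $h(y)\,c(\theta)\,e^{w(\theta)t(y)}$ we used in class: $f_Y(y) = \underbrace{1}_{h(y)} \cdot \underbrace{\theta}_{c(\theta)} \cdot \exp\big(\underbrace{(-\theta)}_{w(\theta)} \cdot \underbrace{y}_{t(y)}\big)$ for $y > 0$.)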
$B(\hat{\theta}) = E[\hat{\theta}] - \theta$.
We can rewrite the pdf as $\frac{1}{\beta}e^{-y/\beta}$ with $\beta = \frac{1}{\theta}$, which is the exponential distribution with mean $\beta$, so $E[Y] = \frac{1}{\theta}$. Therefore
$\begin{align} E[\hat{\theta}] &= E\left[\sum_{i=1}^n Y_i\right] = \sum_{i=1}^n E[Y_i] = nE[Y] = \frac{n}{\theta}, \notag \\ B(\hat{\theta}) &= \frac{n}{\theta} - \theta, \notag \end{align}$ so $\hat{\theta} = \sum_{i=1}^n Y_i$ is biased. Scaling by $\frac{\theta^2}{n}$ would fix the expectation, since $\frac{\theta^2}{n} \cdot \frac{n}{\theta} = \theta$, but that factor depends on the unknown $\theta$. At this point I feel like I've sort of lost the plot and am no longer sure what to do.
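(Writing out what I think the missing step might be: since $U = \sum_{i=1}^n Y_i$ is a sum of $n$ iid Exponential($\theta$) variables, $U \sim \text{Gamma}(n, \theta)$, so perhaps I should be computing $E\left[\frac{1}{U}\right]$ rather than $E[U]$: $E\left[\frac{1}{U}\right] = \int_0^\infty \frac{1}{u} \cdot \frac{\theta^n u^{n-1} e^{-\theta u}}{\Gamma(n)}\,du = \frac{\theta^n}{\Gamma(n)} \cdot \frac{\Gamma(n-1)}{\theta^{n-1}} = \frac{\theta}{n-1},$ which would give $E\left[\frac{n-1}{U}\right] = \theta$. Is this the right direction?)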