
My statistics text states this theorem as if it works for any function $g$:

Let $\tau = g(\theta)$ be a function of $\theta$. Let $\hat{\theta}_n$ be the MLE (Maximum Likelihood Estimator) of $\theta$. Then $\hat{\tau}_n = g(\hat{\theta}_n)$ is the MLE of $g(\theta)$.

And offers this proof that seems to assume $g$ has an inverse:

Proof. Let $h = g^{-1}$ denote the inverse of $g$. Then $\hat{\theta}_n = h(\hat{\tau}_n)$. For any $\tau$, $\mathcal{L}(\tau) = \prod_i f(x_i; h(\tau)) = \prod_i f(x_i;\theta) = \mathcal{L}(\theta)$ where $\theta = h(\tau)$. Hence, for any $\tau$, $\mathcal{L}_n(\tau) = \mathcal{L}(\theta) \leq \mathcal{L}(\hat{\theta}) = \mathcal{L}_n(\hat{\tau})$.

Is an inverse actually required? Maybe the author is assuming one to get a simpler proof? Also, I'm not sure where the inequality comes from.

I tried reading the Wikipedia article on equivariant maps (my statistics text is my first exposure to the term) but it uses too much material I haven't learned yet.
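For reference, here is a small numerical check of the invertible case that the quoted proof covers. The toy Bernoulli sample and the logit map $\tau = g(p) = \log\frac{p}{1-p}$ are chosen only for illustration; maximizing the likelihood directly in $\tau$ lands on $g(\hat{p})$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy Bernoulli(p) sample, made up purely for illustration.
x = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])

def log_lik_p(p):
    """Bernoulli log-likelihood in the original parameter p."""
    return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

# The MLE of p has the closed form p_hat = sample mean.
p_hat = x.mean()

# Reparameterize by the invertible map tau = g(p) = log(p / (1 - p)),
# whose inverse is p = h(tau) = 1 / (1 + exp(-tau)).
def log_lik_tau(tau):
    return log_lik_p(1.0 / (1.0 + np.exp(-tau)))

# Maximize the likelihood directly in tau.
res = minimize_scalar(lambda t: -log_lik_tau(t), bounds=(-10, 10), method="bounded")

print("g(p_hat)        =", np.log(p_hat / (1 - p_hat)))  # plug-in value
print("argmax over tau =", res.x)                        # agrees up to numerical error
```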

Joseph Garvin
  • I think it doesn't require the inverse, but my memory is hazy (so please verify); usually they define a new likelihood function in terms of the old one. – Someone May 31 '19 at 14:41
  • No, it does not require the function to be invertible; any function will do. Here is the original paper by Zehna: https://projecteuclid.org/euclid.aoms/1177699475. – StubbornAtom May 31 '19 at 14:43
  • @StubbornAtom But the Math Reviews article about that paper is scathing. – kimchi lover May 31 '19 at 15:01
  • @kimchilover This is among the first works on this topic, hence my mention of the article. I would ask the OP to see the proof in Casella-Berger's Statistical Inference, where I think a better insight is provided. – StubbornAtom May 31 '19 at 15:14
  • Could somebody explain the proof and why it holds? – Amin Jun 11 '20 at 07:11

1 Answer


No, it is not necessary that $g$ be invertible; see page 320 of Casella and Berger's Statistical Inference. When $g$ is not one-to-one, they define the MLE of $\tau = g(\theta)$ through the induced likelihood $L^*(\tau) = \sup_{\{\theta :\, g(\theta) = \tau\}} L(\theta)$ and show that it is maximized at $g(\hat{\theta})$. As for the inequality in your textbook's proof, it is just the definition of the MLE: $\hat{\theta}$ maximizes the likelihood, so $\mathcal{L}(\theta) \leq \mathcal{L}(\hat{\theta})$ for every $\theta$, and in particular for $\theta = h(\tau)$. The proof of the property is, however, much simpler when $g$ is invertible.
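To make the non-invertible case concrete, here is a minimal numerical sketch of the induced-likelihood idea. The toy Bernoulli data and the non-invertible map $g(p) = p(1-p)$ are chosen only for illustration, and the grid search is a crude stand-in for the supremum:

```python
import numpy as np

# Toy Bernoulli(p) sample, made up purely for illustration.
x = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])

def log_lik(p):
    """Bernoulli log-likelihood in the original parameter p."""
    return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

p_hat = x.mean()                      # MLE of p (sample mean)
tau_hat_plugin = p_hat * (1 - p_hat)  # g(p_hat) for the non-invertible g(p) = p(1 - p)

# Induced likelihood in the sense of Casella-Berger:
#   L*(tau) = sup over {p : p(1 - p) = tau} of L(p).
# For g(p) = p(1 - p) and tau in (0, 1/4], the preimage is the pair of roots
# p = (1 +/- sqrt(1 - 4*tau)) / 2, so the supremum is a max over two points.
def induced_log_lik(tau):
    root = np.sqrt(1.0 - 4.0 * tau)
    p_lo, p_hi = (1.0 - root) / 2.0, (1.0 + root) / 2.0
    return max(log_lik(p_lo), log_lik(p_hi))

# Maximize L* over a grid of tau values (crude stand-in for the supremum).
taus = np.linspace(1e-6, 0.25, 25001)
tau_hat_induced = taus[np.argmax([induced_log_lik(t) for t in taus])]

print("g(p_hat)                     =", tau_hat_plugin)
print("argmax of induced likelihood =", tau_hat_induced)  # agrees up to grid resolution
```

Up to the grid resolution, the maximizer of the induced likelihood coincides with the plug-in value $g(\hat{p})$, which is exactly the invariance statement with no inverse in sight.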