
This is not the first time this problem has appeared on StackExchange, and I have read an answer to it (the original solution is copied at the bottom; it is also available in Show a statistic is complete but not suffcient, and the idea is to check completeness by definition).

But it seems that the answerer wrongly uses the inversion property of the two-sided Laplace transform, because here the transform equality does not hold for every real value (in particular, not for $0$).
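To make the gap concrete (using the notation of the copied solution, where $\eta = n\theta/2$): the completeness hypothesis only yields

$$ (\mathcal{L} h)(\eta)=\int_{\mathbb{R}} h(u) e^{\eta u}\, d u=0 \quad \text{for every } \eta \in \mathbb{R} \setminus\{0\}, $$

because $\eta = n\theta/2$ ranges over $\mathbb{R}\setminus\{0\}$ as $\theta$ does, while at $\theta=0$ the variance switches from $1$ to $2$, so that case produces an equation with a different kernel rather than $(\mathcal{L}h)(0)=0$.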

So what should I do now? The Laplace transform is a direct and powerful tool for this kind of problem (e.g., proving completeness in a general exponential family), but it does not seem to work here. Are there any related properties that can fix the argument, or other ideas?

Let $X_{1}, \ldots, X_{n}$ ($n \geq 2$) be i.i.d. random variables having the normal distribution $N(\theta, 2)$ when $\theta=0$ and the normal distribution $N(\theta, 1)$ when $\theta \in \mathbb{R}\setminus\{0\}$. Show that the sample mean $\bar{X}$ is a complete statistic (sorry for omitting this part before; it is the main concern of the question) but not a sufficient statistic for $\theta$.
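(For reference, under this model $\bar{X} \sim N(0, 2/n)$ when $\theta = 0$ and $\bar{X} \sim N(\theta, 1/n)$ when $\theta \neq 0$, so the density of $\bar{X}$ is

$$ f_{\bar{X}}(u) = \sqrt{\tfrac{n}{4\pi}}\, e^{-n u^{2}/4} \;\; (\theta = 0), \qquad f_{\bar{X}}(u) = \sqrt{\tfrac{n}{2\pi}}\, e^{-n(u-\theta)^{2}/2} \;\; (\theta \neq 0). $$)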

$$ \begin{aligned} \mathrm{E}(g(\bar{X})) &=\int_{\mathbb{R}} g(u) f_{\bar{X}}(u)\, d u=\int_{\mathbb{R}} g(u) \frac{1}{\sqrt{2 \pi}} \exp \left(\frac{-1}{2} \cdot \frac{(u-\theta)^{2}}{2 / n}\right) d u \\ &=\frac{1}{\sqrt{2 \pi}} \int_{\mathbb{R}} g(u) \exp \left(\frac{-1}{2} \cdot \frac{u^{2}-2 u \theta+\theta^{2}}{2 / n}\right) d u \\ &=\frac{1}{\sqrt{2 \pi}} \cdot \exp \left(\frac{-n \theta^{2}}{4}\right) \int_{\mathbb{R}} \overbrace{\left(g(u) \exp \left(\frac{-n u^{2}}{4}\right)\right)}^{\text {call this } h(u)} \exp \left(\left(\frac{n \theta}{2}\right) u\right) d u \end{aligned} $$ (The factor that does not depend on $u$ has been pulled out.) Writing $\eta = n\theta/2$, this equals $0$ only if $$ \int_{\mathbb{R}} h(u) \exp (\eta u)\, d u=0, $$ i.e. $(\mathcal{L} h)(\eta)=0$ for every value of $\eta$, where $\mathcal{L}$ denotes the two-sided Laplace transform. Recall the inversion property: if $(\mathcal{L} f)(s)=(\mathcal{L} g)(s)$ for every value of $s$ in $\mathbb{R}$, then $f = g$ almost everywhere. Hence $h = 0$ a.e., and therefore $g = 0$ a.e., which completes the proof of completeness.
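(One property that might be relevant, though I am not sure it is the intended fix: $\int_{\mathbb{R}}|h(u)| e^{\eta u}\, d u$ is finite for every real $\eta$ whenever $\int_{\mathbb{R}}|g(u)| e^{-n(u-\theta)^{2}/4}\, d u<\infty$ for all $\theta$, which is what $\mathrm{E}_\theta|g(\bar{X})|<\infty$ amounts to with the density used above; dominated convergence then makes $\eta \mapsto (\mathcal{L} h)(\eta)$ continuous on $\mathbb{R}$. If that reasoning is right, vanishing on $\mathbb{R}\setminus\{0\}$ would force $(\mathcal{L} h)(0)=0$ as well.)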

  • You can simply copy-paste the link of the answer used in your post, saying this is where the solution is from. For more details, see https://math.stackexchange.com/editing-help. – StubbornAtom Oct 31 '21 at 10:11
  • Thanks for your reply, I'm very happy to read it. – Small_shizi Oct 31 '21 at 11:02

1 Answer


It is not very clear from your posting what argument is being used.

One way to solve this problem is to show that, for $T_n(x_1,\ldots,x_n):=\frac1n\sum^n_{k=1}x_k$, there is a bounded measurable function $g(x_1,\ldots,x_n)$ such that $E_\theta[g(X_1,\ldots,X_n)\mid\sigma(T_n)]$ depends on the parameter $\theta$; for if $T_n$ were a sufficient statistic, then $E_\theta[g(X_1,\ldots, X_n)\mid\sigma(T_n)]$ would not depend on $\theta$.
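Not part of the original answer, but a quick Monte Carlo sketch of this strategy; the choice $g(\mathbf{x}) = (x_1-\bar x)^2$ and the simulation sizes are my own, purely illustrative (and $g$ is unbounded here; one could truncate it to meet the boundedness requirement). For i.i.d. normals, $X_1-\bar X$ is independent of $\bar X$, so $E_\theta[(X_1-\bar X)^2\mid \sigma(T_n)]$ is the constant $\sigma(\theta)(n-1)/n$, which takes different values at $\theta=0$ and $\theta\neq 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000  # sample size per draw, number of Monte Carlo draws

def cond_exp_estimate(theta):
    """Estimate E_theta[(X1 - Xbar)^2 | sigma(T_n)] by simulation.

    Since X1 - Xbar is independent of Xbar for i.i.d. normals, this
    conditional expectation is constant, so a plain average estimates it.
    """
    var = 2.0 if theta == 0 else 1.0  # the model: N(theta, 2) at 0, N(theta, 1) otherwise
    x = rng.normal(theta, np.sqrt(var), size=(reps, n))
    xbar = x.mean(axis=1)
    return np.mean((x[:, 0] - xbar) ** 2)

# If T_n were sufficient, these two numbers would agree; they do not:
print(cond_exp_estimate(0.0))  # approx. 2 * (n-1)/n = 1.6
print(cond_exp_estimate(1.0))  # approx. 1 * (n-1)/n = 0.8
```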

Suggestion: One may start by defining $\sigma(\theta)=1+\mathbb{1}(\theta=0)$ (the variance of each $X_k$), and considering the change of variables \begin{align} \Phi(x_1,\ldots,x_n)=\begin{pmatrix} x_1\\ \vdots\\x_{n-1}\\x_1+\ldots +x_n\end{pmatrix} \end{align} The Jacobian determinant of $\Phi$ is $J_\Phi(x_1,\ldots,x_n)=1$. Hence, for any bounded measurable function $\phi:\mathbb{R}\rightarrow\mathbb{R}$, $$\begin{aligned} E_\theta[g(\mathbf{X})\phi(T_n(\mathbf{X}))] &=(2\pi\sigma(\theta))^{-n/2} \int_{\mathbb{R}^n}g(\mathbf{x})\phi(T_n(\mathbf{x}))e^{-\sum^n_{k=1}\tfrac{(x_k - \theta)^2}{2\sigma(\theta)}}\,d\mathbf{x}\\ &= (2\pi\sigma(\theta))^{-n/2}e^{-\tfrac{n\theta^2}{2\sigma(\theta)}}\int_{\mathbb{R}^n}g(\Phi^{-1}(\mathbf{y}))\phi(\tfrac{y_n}{n})e^{\tfrac{\theta y_n}{\sigma(\theta)}}e^{-\tfrac{y^2_1+\ldots +y^2_{n-1}+(y_n-y_1-\ldots-y_{n-1})^2}{2\sigma(\theta)}}\,d\mathbf{y}\\ &=(2\pi\sigma(\theta))^{-n/2}e^{-\tfrac{n\theta^2}{2\sigma(\theta)}}\int_{\mathbb{R}} e^{\tfrac{\theta y_n}{\sigma(\theta)}} \phi(\tfrac{y_n}{n}) e^{-\tfrac{y^2_n}{2\sigma(\theta)}} H(y_n)\,dy_n \end{aligned} $$ where $$ \begin{align} H(y_n)= \int_{\mathbb{R}^{n-1}}g\big(y_1,\ldots,y_{n-1},\,y_n-(n-1)T_{n-1}(y_1,\ldots,y_{n-1})\big)e^{S(y_1,\ldots, y_{n-1},y_n)}\,dy_1\cdots dy_{n-1} \end{align} $$ and $$\begin{align} S(y_1,\ldots,y_n)=-\frac{y^2_1+\ldots +y^2_{n-1}-2(n-1)y_nT_{n-1}(y_1,\ldots,y_{n-1}) +(n-1)^2T^2_{n-1}(y_1,\ldots,y_{n-1})}{2\sigma(\theta)} \end{align} $$ (note that $H$ and $S$ depend on $\theta$ through $\sigma(\theta)$).

From this, one should be able to find a form for $E_\theta[g(X_1,\ldots,X_n)|\sigma(T_n)]$.
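For what it's worth, if I read the displayed identity correctly, comparing it with the same computation for $g \equiv 1$ (which recovers the law of $T_n$) suggests

$$ E_\theta[g(X_1,\ldots,X_n)\mid T_n = t] \;=\; \frac{H_g(nt)}{H_1(nt)}, $$

where $H_g$ denotes the function $H$ above and $H_1$ is the same integral with $g \equiv 1$; both depend on $\theta$ through $\sigma(\theta)$, and exhibiting a $g$ for which this ratio genuinely differs between $\sigma(\theta)=1$ and $\sigma(\theta)=2$ is what rules out sufficiency.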

Hope this helps.

Mittens
  • Sorry, I posted the wrong version of this exercise: it also asks for a proof of the completeness of $\overline{X}$, which is actually the real concern here. Thank you very much for your answer about sufficiency, and I sincerely apologize for my mistake; I was in too much of a hurry and introduced some errors into the text, and even into the problem statement. The argument in my post tries to check completeness by its definition. – Small_shizi Oct 31 '21 at 03:50