In *Mathematical Statistics* by Jun Shao (2003), Exercise 3.22 states:
Exercise 3.22. Let $\left(X_{1}, \ldots, X_{n}\right)$ be a random sample from $P \in \mathcal{P}$ containing all symmetric distributions with finite means and with Lebesgue densities on $\mathcal{R}$.
- (i) When $n=1$, show that $X_{1}$ is the UMVUE of $\mu=E X_{1}$.
- (ii) When $n>1$, show that there is no UMVUE of $\mu=E X_{1}$.
However, Example 3.8 seems to give a counterexample to Exercise 3.22(i) when $g(\theta)=\theta$. In the solution to Exercise 3.22, the uniform family $U(\theta_1-\theta_2,\theta_1+\theta_2)$, $\theta_1\in \mathbb{R}$, $\theta_2>0$, is treated as a subfamily of $\mathcal{P}$. By the same reasoning, $U\left(\theta-\frac{1}{2}, \theta+\frac{1}{2}\right)$ also belongs to $\mathcal{P}$, which appears to contradict Exercise 3.22(i). Can anyone explain why this happens?
Example 3.8. Let $X$ be a sample (of size 1) from the uniform distribution $U\left(\theta-\frac{1}{2}, \theta+\frac{1}{2}\right)$, $\theta \in \mathcal{R}$. We now apply Theorem 3.2 to show that there is no UMVUE of $\vartheta=g(\theta)$ for any nonconstant function $g$. Note that an unbiased estimator $U(X)$ of 0 must satisfy
$$ \int_{\theta-\frac{1}{2}}^{\theta+\frac{1}{2}} U(x)\, dx=0 \quad \text{for all } \theta \in \mathcal{R}. $$
Differentiating both sides of the previous equation and applying the result of differentiation of an integral lead to $U(x)=U(x+1)$ a.e. $m$, where $m$ is the Lebesgue measure on $\mathcal{R}$. If $T$ is a UMVUE of $g(\theta)$, then $T(X) U(X)$ is unbiased for 0 and, hence, $T(x) U(x)=T(x+1) U(x+1)$ a.e. $m$, where $U(X)$ is any unbiased estimator of 0. Since this is true for all $U$, $T(x)=T(x+1)$ a.e. $m$. Since $T$ is unbiased for $g(\theta)$,
$$ g(\theta)=\int_{\theta-\frac{1}{2}}^{\theta+\frac{1}{2}} T(x)\, dx \quad \text{for all } \theta \in \mathcal{R}. $$
Differentiating both sides of the previous equation and applying the result of differentiation of an integral, we obtain
$$ g^{\prime}(\theta)=T\left(\theta+\frac{1}{2}\right)-T\left(\theta-\frac{1}{2}\right)=0 \quad \text{a.e. } m, $$
so $g$ is constant a.e., contradicting the assumption that $g$ is nonconstant.
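For a concrete instance of the condition $U(x)=U(x+1)$ a.e. $m$ in Example 3.8, the function $U(x)=\sin(2\pi x)$ (my own choice, not from the book) is 1-periodic and integrates to 0 over any interval of length 1, so it is a nontrivial unbiased estimator of 0 in this family. A quick Monte Carlo sketch of this fact:

```python
import numpy as np

# Hypothetical check (not from the book): U(x) = sin(2*pi*x) satisfies
# U(x) = U(x+1) and integrates to 0 over every unit-length interval,
# so E[U(X)] = 0 when X ~ U(theta - 1/2, theta + 1/2), for every theta.
rng = np.random.default_rng(0)

def mean_of_U(theta, n=1_000_000):
    """Monte Carlo estimate of E[sin(2*pi*X)] under U(theta-1/2, theta+1/2)."""
    x = rng.uniform(theta - 0.5, theta + 0.5, size=n)
    return np.sin(2 * np.pi * x).mean()

for theta in [-1.3, 0.0, 2.7]:
    print(f"theta = {theta:+.1f}: E[sin(2*pi*X)] ~ {mean_of_U(theta):.4f}")
```

Each printed estimate is close to 0 regardless of $\theta$, illustrating why Example 3.8's family admits many unbiased estimators of 0.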