3

I am a bit confused with this exercise, since I never worked with samples of this type. I would appreciate if you can help me. The exercise is as follows:

Let $X_i \sim N(i\theta, 1)$ for $i = 1, \dots, n$ be an independent, but not identically distributed, sample. Check that $T = \sum_i X_i$ is a sufficient statistic for $\theta$.

What I need is to verify that the statistic $T$ is sufficient for the parameter $\theta$. I know how to do this with the Fisher–Neyman factorization theorem, but only for an identically distributed sample of random variables. In this case, the sample is not identically distributed, so I don't know how to verify it.

Arctic Char
  • 16,972
John
  • 31
  • 1
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Sep 07 '21 at 15:06
  • What I need is to verify that the statistic $T$ is sufficient for the parameter $\theta$. I know how to do it with the Fisher–Neyman factorization theorem, but only for an identically distributed sample of random variables. In this case, the sample is not identically distributed, so I don't know how to verify it. That's the problem. Thanks! – John Sep 07 '21 at 15:23
  • This time, I agree with Community's comment. – Peter Sep 07 '21 at 15:31
  • Essentially you want to use the factorisation theorem by finding the density or likelihood associated with a set of observations and then tidy it up with the aim of factorising. Without doing the calculations, I would guess $\sum_i X_i$ may not be sufficient but $\sum_i iX_i$ might be, though this may be completely wrong – Henry Sep 07 '21 at 16:55
  • Independence is enough to find the joint density of $X_1,X_2,\ldots,X_n$; you don't need the sample to be identically distributed. – StubbornAtom Sep 07 '21 at 18:37
  • The explanation of the problem is crystal clear. The question is very simple: I need to verify that a statistic is sufficient for a parameter. There is no great mystery here, and I don't know what else I can explain. If you don't know the subject, no problem, but do not tell me the explanation lacks details when it is very clear. Thank you for your answers. I consider the issue resolved. – John Sep 07 '21 at 23:17

1 Answer

2

We have that the probability density function of $X_i$ is $$f_{X_i}(x_i) = \dfrac{1}{\sqrt{2\pi}}\exp\left[ -\dfrac{1}{2}\left(x_i-i\theta \right)^2\right]\text{.}$$ By independence, the joint density is given by $$f_{X_1, \dots, X_n}(x_1, \dots, x_n) = \prod_{i=1}^{n}f_{X_i}(x_i) = \dfrac{1}{(2\pi)^{n/2}}\exp\left[-\dfrac{1}{2}\sum_{i=1}^{n}(x_i - i\theta)^2 \right]\text{.}$$ Now we expand the sum: $$\sum_{i=1}^{n}(x_i - i\theta)^2 = \sum_{i=1}^{n}x_i^2 - 2\theta\sum_{i=1}^{n}ix_i + \theta^2\sum_{i=1}^{n}i^2\text{.}$$ Recalling the sum $$\sum_{i=1}^{n}i^2 = \dfrac{n(n+1)(2n+1)}{6}$$ we obtain $$\sum_{i=1}^{n}(x_i - i\theta)^2 = \sum_{i=1}^{n}x_i^2 - 2\theta\sum_{i=1}^{n}ix_i + \theta^2\,\dfrac{n(n+1)(2n+1)}{6}\text{.}$$ The joint density may thus be written as $$\underbrace{\dfrac{1}{(2\pi)^{n/2}}\exp\left[-\dfrac{1}{2} \sum_{i=1}^{n}x_i^2\right]}_{h(\mathbf{x})}\underbrace{\exp\left[ \theta\sum_{i=1}^{n}ix_i - \dfrac{\theta^2}{2}\cdot\dfrac{n(n+1)(2n+1)}{6}\right]}_{\varphi\left(\sum_{i=1}^{n}ix_i,\ \theta\right)}\text{.}$$ By the factorization criterion, $\sum_{i=1}^{n}iX_i$ is sufficient for $\theta$.
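As a quick numerical sanity check of this factorization (a sketch using only the standard library; the three datasets below are made up for illustration): if $\sum_i iX_i$ is sufficient, then two samples with the same value of $\sum_i ix_i$ must have log-likelihoods that differ only by a constant in $\theta$, while matching $\sum_i x_i$ alone should not guarantee this.

```python
import math

def loglik(theta, xs):
    # Log-likelihood of independent X_i ~ N(i*theta, 1), i = 1..n.
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (x - i * theta) ** 2
               for i, x in enumerate(xs, start=1))

# Hypothetical datasets chosen so that:
x = [1.0, 2.0, 3.0]   # sum(i*x_i) = 14, sum(x_i) = 6
y = [2.0, 3.0, 2.0]   # sum(i*y_i) = 14, sum(y_i) = 7  (same sum(i*x_i) as x)
z = [3.0, 2.0, 1.0]   # sum(i*z_i) = 10, sum(z_i) = 6  (same sum(x_i) as x)

thetas = [-1.0, 0.0, 0.5, 2.0]
# Same sum(i*x_i): the difference is constant (1.5) for every theta.
diff_xy = [loglik(t, x) - loglik(t, y) for t in thetas]
# Same sum(x_i) only: the difference is 4*theta, i.e. it varies with theta.
diff_xz = [loglik(t, x) - loglik(t, z) for t in thetas]
```

Here `diff_xy` is constant across all values of $\theta$, consistent with $\sum_i iX_i$ being sufficient, whereas `diff_xz` is not constant, consistent with $\sum_i X_i$ failing to be sufficient.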

Unless there's something I'm missing here, I don't think $\sum_{i=1}^{n}X_i$ is sufficient for $\theta$.

Clarinetist
  • 20,278
  • 10
  • 72
  • 137