
Hey, I have some problems proving this statement:

Let $(X_n)_{n\in\mathbb N}$ be a sequence of normally distributed random variables on $(\mathbb{R}, B(\mathbb{R}))$. Let $X$ be another random variable on $(\mathbb{R}, B(\mathbb{R}))$ with $E[X^2] < \infty$.

Then it follows from $E[(X_n − X)^2] \rightarrow 0$ that $X$ is also normally distributed.

What I have thought so far is that:

if $E[(X_n − X)^2] \rightarrow 0$ then $X_n −X\overset{P}{\rightarrow} 0$

So maybe I can do something with Markov's inequality, but I don't know what. I am stuck.

Can someone help me?

ADDED: I don't know if it is stupid, but: $P[|X_n-X|\geq z]\leq \frac{E[(X_n − X)^2]}{z^2}=\frac{E[X_n^2]-2E[X_nX]+E[X^2]}{z^2}$; since $z$ is a constant, $E[X_n^2]-2E[X_nX]+E[X^2]$ must converge to zero.

MarcoDJ01
  • In the exercise there is no other information. I also thought something was missing. – MarcoDJ01 Oct 11 '22 at 15:09
  • Maybe use: (convergence in quadratic mean) implies (convergence in probability) implies (convergence in distribution) implies (convergence in characteristic function). See: https://math.stackexchange.com/questions/2049232/does-convergence-in-distribution-implies-convergence-of-characteristic-functions – yurnero Oct 11 '22 at 15:12
  • If $X_n \to X$ in probability, it also converges in distribution, but if all the $X_n$ have the same distribution, so does $X$ (it is a constant sequence). Also, this does not require independence, which seems like a noisy assumption. – William M. Oct 11 '22 at 15:16
  • @Vincent nowhere in the problem is it stated that the random variables are independent; also, the $X_n$ are not "abstract random variables" (whatever this means) but normally distributed random variables. – William M. Oct 11 '22 at 15:20
  • Is there any assumption on the means of the $X_{n}$'s? – Mr. Gandalf Sauron Oct 11 '22 at 15:38

1 Answer


I hope you have realized what the problem is: you only know that $X_{n}$ converges in distribution, but you do not know to what distribution it converges. The idea is as follows:

Maybe that can be done using the parameters which characterize a normal distribution, and that is precisely the idea. We already know that the characteristic functions converge to that of $X$. But we need to show that this pointwise limit is indeed the characteristic function of some normal distribution. To do that, we need to show that the mean and variance of the $X_{n}$ converge to the mean and variance of $X$.

Firstly, given that $X_{n}\to X$ in probability, $E(|X_{n}-X|^{p})\to 0 \iff E(|X_{n}|^{p})\to E(|X|^{p})$. This is a standard result about $L^{p}$ spaces; the direction we need here ($\Rightarrow$) is immediate from Minkowski's inequality, while the converse uses the convexity of $x\mapsto |x|^{p}$ and Fatou's Lemma.

The complete argument is as follows:

Taking $p=2$, this means that $E(X_{n}^{2})\to E(X^{2})$.
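
Concretely, the direction used here is just the reverse triangle inequality for the $L^{2}$ norm: writing $\|Y\|_{2}:=\big(E(Y^{2})\big)^{\frac{1}{2}}$,
$$\big|\,\|X_{n}\|_{2}-\|X\|_{2}\,\big|\leq \|X_{n}-X\|_{2}=\big(E[(X_{n}-X)^{2}]\big)^{\frac{1}{2}}\to 0,$$
and squaring gives $E(X_{n}^{2})\to E(X^{2})$.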

Now the $L^{2}$ norm is stronger than the $L^{1}$ norm on finite measure spaces (ignore this statement if it does not make sense to you yet; just know that it is true for probability spaces).

That is, $E(|Y|)=E(|Y|\cdot 1)\leq \bigg(E(Y^{2})\bigg)^{\frac{1}{2}}\bigg(E(1)\bigg)^{\frac{1}{2}}=\bigg(E(Y^{2})\bigg)^{\frac{1}{2}}$ for any random variable $Y$, by the Cauchy–Schwarz inequality.

So, applying this with $Y=X_{n}-X$, we get $E|X_{n}-X|^{2}\to 0 \implies E|X_{n}-X|\to 0$.

Also, $|E(X_{n})-E(X)|=|E(X_{n}-X)|\leq E|X_{n}-X|$, so $\lim_{n\to\infty}|E(X_{n})-E(X)|=0$, i.e. $E(X_{n})\to E(X)$.

This means that $\sigma_{n}^{2}=E(X_{n}^{2})-(E(X_{n}))^{2}\to E(X^{2})-(E(X))^{2}=\sigma^{2}$, where $\sigma_{n}^{2}$ and $\sigma^{2}$ denote the variances of $X_{n}$ and $X$.

Thus, since $X_{n}$ is normal with mean $E(X_{n})$ and variance $\sigma_{n}^{2}$, its characteristic function satisfies $\psi_{n}(t)=\exp\bigg(iE(X_{n})t-\frac{\sigma_{n}^{2}t^{2}}{2}\bigg)\to \exp\bigg(iE(X)t-\frac{\sigma^{2}t^{2}}{2}\bigg)$ for every $t$.

But as $X_{n}\to X$ in second moment, you correctly point out that $X_{n}\xrightarrow{P} X$, so $X_{n}\xrightarrow{d} X$. That means that the characteristic functions of $X_{n}$ must converge pointwise to the characteristic function of $X$.
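
Indeed, for each fixed $t$ the functions $x\mapsto\cos(tx)$ and $x\mapsto\sin(tx)$ are bounded and continuous, so convergence in distribution gives
$$\psi_{n}(t)=E\big[e^{itX_{n}}\big]=E[\cos(tX_{n})]+iE[\sin(tX_{n})]\longrightarrow E[\cos(tX)]+iE[\sin(tX)]=E\big[e^{itX}\big]=\psi_{X}(t).$$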

But this means that $\psi_{X}(t)=\exp\bigg(iE(X)t-\frac{\sigma^{2}t^{2}}{2}\bigg)$, which is the characteristic function of the $N(E(X),\sigma^{2})$ distribution. Since the characteristic function determines the distribution, $X$ is normally distributed (a degenerate normal, i.e. almost surely constant, if $\sigma^{2}=0$).