I hope you have realized what the problem is: you only know that $X_{n}$ converges in distribution, but you do not know to which distribution it converges. The idea is as follows:
Maybe that can be done using the parameters which characterize a normal distribution; that is precisely the idea. We already know that the characteristic functions converge to that of $X$. But we need to show that the pointwise limit of the characteristic functions is itself the characteristic function of some normal distribution. To do that, we need to show that the means and variances of the $X_{n}$'s converge to the mean and variance of $X$.
Firstly, if $X_{n}\to X$ in probability (or almost surely), then $E(|X_{n}-X|^{p})\to 0 \iff E(|X_{n}|^{p})\to E(|X|^{p})$. This is a standard result about $L^{p}$ spaces: the forward direction follows from the reverse triangle inequality $\big|\,\|X_{n}\|_{p}-\|X\|_{p}\big|\leq \|X_{n}-X\|_{p}$, and the converse (Riesz's theorem) follows by applying Fatou's Lemma to $2^{p}\big(|X_{n}|^{p}+|X|^{p}\big)-|X_{n}-X|^{p}$, using the convexity of $x\mapsto x^{p}$.
The complete argument is given below.
Applying the result above with $p=2$ gives $E(X_{n}^{2})\to E(X^{2})$.
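As a purely numerical sanity check (not part of the proof), here is a hypothetical toy construction: take $X\sim N(0,1)$ and $X_{n}=X+Z/n$ with $Z\sim N(0,1)$ independent, so that $E|X_{n}-X|^{2}=1/n^{2}\to 0$. The empirical second moments of the $X_{n}$'s then settle at $E(X^{2})=1$:

```python
import random

random.seed(0)
N = 200_000

# Hypothetical construction for illustration only:
# X ~ N(0,1) and X_n = X + Z/n with Z ~ N(0,1) independent,
# so E|X_n - X|^2 = 1/n^2 -> 0, i.e. X_n -> X in L^2.
x = [random.gauss(0, 1) for _ in range(N)]
z = [random.gauss(0, 1) for _ in range(N)]

for n in (1, 10, 100):
    xn = [xi + zi / n for xi, zi in zip(x, z)]
    # empirical E(X_n^2); the true value is 1 + 1/n^2
    second_moment = sum(v * v for v in xn) / N
    print(n, round(second_moment, 4))
```

For $n=1$ the true second moment is $2$, and it decreases towards $E(X^{2})=1$ as $n$ grows.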
Now, the $L^{2}$ norm dominates the $L^{1}$ norm on finite measure spaces (ignore this statement if it does not make sense to you yet; just take for granted that it holds on probability spaces).
That is, $E(|Y|)=E(|Y|\cdot 1)\leq \bigg(E(Y^{2})\bigg)^{\frac{1}{2}}$ for any random variable $Y$, by the Cauchy–Schwarz inequality.
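A quick Monte Carlo illustration of this inequality, with an arbitrary (hypothetical) choice of $Y$, here a shifted exponential:

```python
import random

random.seed(1)
N = 100_000

# Y is an arbitrary choice made only for illustration:
# an exponential random variable shifted by 0.5.
y = [random.expovariate(1.0) - 0.5 for _ in range(N)]

lhs = sum(abs(v) for v in y) / N           # empirical E|Y|
rhs = (sum(v * v for v in y) / N) ** 0.5   # empirical sqrt(E(Y^2))
print(lhs <= rhs, round(lhs, 3), round(rhs, 3))
```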
So $E|X_{n}-X|^{2}\to 0 \implies E|X_{n}-X|\to 0 $
Also, $|E(X_{n})-E(X)|=|E(X_{n}-X)|\leq E|X_{n}-X|$, which means that $\lim_{n\to\infty}|E(X_{n})-E(X)|=0$, i.e. $E(X_{n})\to E(X)$.
This means that $\sigma_{n}^{2}=E(X_{n}^{2})-(E(X_{n}))^{2}\to E(X^{2})-(E(X))^{2}=\sigma^{2}$ .
Thus, since each $X_{n}$ is normal with mean $E(X_{n})$ and variance $\sigma_{n}^{2}$, its characteristic function satisfies $\psi_{n}(t)=\exp\bigg(iE(X_{n})t-\frac{\sigma_{n}^{2}t^{2}}{2}\bigg)\to \exp\bigg(iE(X)t-\frac{\sigma^{2}t^{2}}{2}\bigg)$ pointwise.
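The convergence here is just continuity of $(\mu,\sigma^{2})\mapsto\exp\big(i\mu t-\frac{\sigma^{2}t^{2}}{2}\big)$ in the two parameters. A small sketch with hypothetical values for the limiting mean and variance:

```python
import cmath

def normal_cf(t, mu, sigma2):
    # characteristic function of N(mu, sigma2): exp(i*mu*t - sigma2*t^2/2)
    return cmath.exp(1j * mu * t - sigma2 * t * t / 2)

mu, sigma2 = 1.0, 4.0   # hypothetical limiting mean and variance
t = 0.7                 # an arbitrary evaluation point

for n in (1, 10, 1000):
    # a made-up sequence with E(X_n) -> mu and sigma_n^2 -> sigma2
    mu_n, sigma2_n = mu + 1.0 / n, sigma2 + 2.0 / n
    gap = abs(normal_cf(t, mu_n, sigma2_n) - normal_cf(t, mu, sigma2))
    print(n, gap)
```

The printed gap shrinks to $0$ as $(\mu_{n},\sigma_{n}^{2})\to(\mu,\sigma^{2})$.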
But as $X_{n}\to X$ in $p$-th moment, you correctly point out that $X_{n}\xrightarrow{P} X$, and hence $X_{n}\xrightarrow{d} X$. By Lévy's continuity theorem, the characteristic functions of the $X_{n}$'s must then converge pointwise to the characteristic function of $X$.
But this means that $\psi_{X}(t)=\exp\bigg(iE(X)t-\frac{\sigma^{2}t^{2}}{2}\bigg)$, which is exactly the characteristic function of a $N(E(X),\sigma^{2})$ random variable. Since the characteristic function determines the distribution, $X$ is normally distributed.