I am studying statistics and I need some guidance as to where the formula $E[X^2] = \sigma^2 + \mu^2$ came from. All I know is that $\displaystyle E[X^2] = \sum_{i=0}^n x_i^2\, p(x_i)$
-
The definition of $\sigma^2$ (namely $\sigma^2 = E[(X - \mu)^2]$) boils down to $\sigma^2 = E[X^2] - \mu^2$. – Daniel Fischer Aug 06 '13 at 18:32
-
Thanks Daniel, I see it now. I should really study those identities. – Person Aug 06 '13 at 18:35
-
There is a different way to look at the proof: https://math.stackexchange.com/a/3618238/1108681 – shaked Dec 06 '22 at 00:11
2 Answers
Let $X$ be a real valued random variable, $\mu = E[X]$ its mean, and $\sigma^2 = E[(X-E[X])^2]$ its variance. Then, \begin{align}\sigma^2 & = E[(X-E[X])^2] \\ & = E[X^2 - 2XE[X]+(E[X])^2] \\ & = E[X^2] - E[2XE[X]]+E[(E[X])^2] \\ & = E[X^2] - 2(E[X])^2+(E[X])^2 \\ & = E[X^2] - (E[X])^2 \\ & = E[X^2] - \mu^2,\end{align} where we used linearity of expectation and the fact that $E[X]$ is a constant. This gives us $E[X^2] = \sigma^2+\mu^2$, which is exactly the identity you asked about.
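As a quick sanity check (this numerical sketch is my own addition; the fair six-sided die is just an illustrative discrete distribution), you can compute $\mu$, $\sigma^2$, and $E[X^2]$ directly from the probabilities and confirm that $E[X^2] = \sigma^2 + \mu^2$:

```python
# Sanity check of E[X^2] = sigma^2 + mu^2 for a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mu = sum(x * p for x, p in zip(values, probs))               # E[X] = 3.5
e_x2 = sum(x ** 2 * p for x, p in zip(values, probs))        # E[X^2] = 91/6
var = sum((x - mu) ** 2 * p for x, p in zip(values, probs))  # E[(X - mu)^2]

assert abs(e_x2 - (var + mu ** 2)) < 1e-12  # E[X^2] == sigma^2 + mu^2
print(mu, var, e_x2)  # 3.5  2.9166...  15.1666...
```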
Edit: Thanks to Did's comment, here is an alternative approach.
You can use the following definition:
If $X$ is any random variable with distribution $F_{X}(x)$, then
$$\mu_{X}=\int \limits_{0}^{+\infty}\left({1-F_{X}(x)}\right)dx-\int\limits_{-\infty}^{0}F_{X}(x)dx,$$
$$\sigma_{X}^{2}=\int \limits_{0}^{+\infty} 2x \left( {1-F_{X}(x)+F_{X}(-x) } \right)dx-\mu_{X}^2$$
and then show that
$$E[X^2]=\int \limits_{0}^{+\infty} 2x \left( {1-F_{X}(x)+F_{X}(-x) } \right)dx$$
to conclude your equality.
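If it helps to see this CDF route numerically, here is a small sketch (my own addition, assuming `scipy` is available; the $\mathrm{Exp}(1)$ distribution is just a convenient test case, with $\mu = 1$ and $\sigma^2 = 1$) that evaluates the integral above and compares it with $\sigma_X^2 + \mu_X^2$:

```python
# Check E[X^2] = integral_0^inf 2x (1 - F(x) + F(-x)) dx for X ~ Exp(1),
# where E[X^2] should equal sigma^2 + mu^2 = 1 + 1 = 2.
from scipy import stats
from scipy.integrate import quad

dist = stats.expon()  # F(x) = 1 - exp(-x) for x >= 0, F(x) = 0 for x < 0

e_x2, _ = quad(lambda x: 2 * x * (1 - dist.cdf(x) + dist.cdf(-x)),
               0, float("inf"))

mu, var = dist.mean(), dist.var()
print(e_x2, var + mu ** 2)  # both 2.0 (up to quadrature error)
assert abs(e_x2 - (var + mu ** 2)) < 1e-8
```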
-
The question seems to be about a discrete random variable; still, I tried to improve my answer. – ILikeMath Aug 06 '13 at 20:45