
I am studying statistics and I need some guidance as to where this formula came from. All I know is that, for a discrete random variable, $\displaystyle E[X^2] = \sum_{i=0}^n x_i^2\, p_i$.

Person

2 Answers


Let $X$ be a real-valued random variable, $\mu = E[X]$ its mean, and $\sigma^2 = E[(X-E[X])^2]$ its variance. Then, \begin{align}\sigma^2 & = E[(X-E[X])^2] \\ & = E[X^2 - 2XE[X]+(E[X])^2] \\ & = E[X^2] - E[2XE[X]]+E[(E[X])^2] \\ & = E[X^2] - 2(E[X])^2+(E[X])^2 \\ & = E[X^2] - (E[X])^2 \\ & = E[X^2] - \mu^2,\end{align} where the third line uses linearity of expectation and the fourth uses the fact that $E[X]$ is a constant. This gives us $E[X^2] = \sigma^2+\mu^2$, which is the identity you asked about.
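A quick numerical sanity check of this identity, using a small discrete distribution (the values and probabilities below are my own illustrative choice, not from the post):

```python
# A small discrete random variable: P(X = x_i) = p_i.
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

mu = sum(x * p for x, p in zip(values, probs))              # E[X]
ex2 = sum(x**2 * p for x, p in zip(values, probs))          # E[X^2]
var = sum((x - mu)**2 * p for x, p in zip(values, probs))   # E[(X - E[X])^2]

# The identity derived above: E[X^2] = sigma^2 + mu^2
assert abs(ex2 - (var + mu**2)) < 1e-12
```

Here $\mu = 2.1$, $E[X^2] = 4.9$, and $\sigma^2 = 0.49$, so indeed $0.49 + 2.1^2 = 4.9$.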

shaked
Lord Soth

Edit: Thanks to Did's comment, here is an alternative answer.

You can use the following definition:

If $X$ is any random variable with distribution $F_{X}(x)$, then

$$\mu_{X}=\int \limits_{0}^{+\infty}\left({1-F_{X}(x)}\right)dx-\int\limits_{-\infty}^{0}F_{X}(x)dx,$$

$$\sigma_{X}^{2}=\int \limits_{0}^{+\infty} 2x \left( {1-F_{X}(x)+F_{X}(-x) } \right)dx-\mu_{X}^2$$

and then show that

$$E[X^2]=\int \limits_{0}^{+\infty} 2x \left( {1-F_{X}(x)+F_{X}(-x) } \right)dx$$

to conclude your equality.
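A numerical check of the tail-integral formula for $E[X^2]$, using an Exponential(1) variable as the test case (my own choice of distribution, not from the post). For it, $F_X(x) = 1 - e^{-x}$ for $x \ge 0$ and $F_X(x) = 0$ for $x < 0$, so the known answer is $E[X^2] = \sigma^2 + \mu^2 = 1 + 1 = 2$:

```python
import math

def F(x):
    """CDF of an Exponential(1) random variable."""
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

# Midpoint Riemann sum for ∫_0^∞ 2x (1 - F(x) + F(-x)) dx, truncated at
# x = 50; the integrand decays like x e^{-x}, so the tail is negligible.
n, upper = 200_000, 50.0
h = upper / n
integral = sum(
    2 * ((k + 0.5) * h) * (1 - F((k + 0.5) * h) + F(-(k + 0.5) * h)) * h
    for k in range(n)
)

assert abs(integral - 2.0) < 1e-3  # matches E[X^2] = 2
```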

ILikeMath