
As I understand it, the variance of a random variable is defined as follows: \begin{equation} \text{Var}(X) = \text{E}[(X-\mu)^2] \end{equation} Here $X-\mu$ is the difference between the value the random variable takes and its expected value.

Why don't we define the variance as \begin{equation} \text{Var}(X) = \text{E}[X-\mu] \end{equation} instead?

To me, this would make more sense, because it directly expresses the difference between the value the variable takes and its expected value, which sounds like what "variation" should mean.

Why do we use $(X-\mu)^2$ instead?
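
For concreteness, here is a minimal numerical sketch of what the two expressions evaluate to on a sample. It uses Python with numpy and an arbitrarily chosen normal distribution, none of which are part of the question itself; it simply computes both quantities side by side.

```python
import numpy as np

# Draw a sample from an arbitrary distribution (a normal with standard
# deviation 2 here, purely as an example) and compare the two expressions
# from the question.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=2.0, size=100_000)

mu = x.mean()                          # sample estimate of E[X]
var_squared = np.mean((x - mu) ** 2)   # E[(X - mu)^2], the usual variance
var_unsquared = np.mean(x - mu)        # E[X - mu], the proposed definition

print(f"E[(X - mu)^2] ~= {var_squared:.4f}")
print(f"E[X - mu]     ~= {var_unsquared:.4e}")
```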

