Why do we define the variance of a random variable $X$ as $\text{var}[X]=\text{E}[(X-\mu)^2]$ rather than as $\text{var}[X]=\text{E}[\left|X-\mu\right|]$?
Normally we interpret the standard deviation $\sigma=\sqrt{\text{var}[X]}$ as a measure of the average distance of a sample from the mean $\mu$. If that is the case, isn't it more natural to use the absolute value rather than the square as the measure of distance? That way we wouldn't have to take the square root afterwards to obtain $\sigma$.
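To make the comparison concrete, here is a minimal sketch (assuming NumPy and an illustrative standard normal sample, not anything from the question itself) that computes both candidate dispersion measures, the root-mean-square deviation $\sigma$ and the mean absolute deviation $\text{E}[\left|X-\mu\right|]$:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative sample from a standard normal, so the true sigma is 1
x = rng.normal(loc=0.0, scale=1.0, size=100_000)

mu = x.mean()
variance = np.mean((x - mu) ** 2)   # var[X] = E[(X - mu)^2]
sigma = np.sqrt(variance)           # standard deviation, sqrt of the variance
mad = np.mean(np.abs(x - mu))       # mean absolute deviation, E[|X - mu|]

print(f"sigma (sqrt of variance): {sigma:.4f}")  # close to 1.0
print(f"mean absolute deviation:  {mad:.4f}")    # close to sqrt(2/pi) ~ 0.7979
```

Both quantities measure spread, but they are not equal: for a normal distribution the mean absolute deviation is $\sigma\sqrt{2/\pi}\approx 0.7979\,\sigma$, so the choice between them is a genuine modelling decision rather than two routes to the same number.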