The most statistics I ever took was a few lessons on it back in high school. What always bothered me is how arbitrary the definitions seemed. For instance, I remember having trouble with the definition of standard deviation.
The standard deviation of a set of values $X$ is the square root of the average of the squared differences between each value in $X$ and the average of $X$.
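In symbols (writing the dataset as $X = \{x_1, \dots, x_n\}$ and its average as $\bar{x}$, which is one common notation, not the one my class used), that definition reads:

$$\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}$$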
At school, standard deviation was never given any more precise definition than "a number that gives you a rough idea how 'diffuse' the dataset is". While I can see in a very approximate way that this is basically correct, it's a long way from how definitions of concepts in math are usually explained. Usually there's a precise notion that we're trying to capture, and a clear explanation as to how our definition captures that notion.
But here, when I asked for further information, I was told things like "you square the differences to make them positive", when what I was hoping for was something like:
- A specific real-world concept that the definition captures.
- A class of problems in which the definition arises naturally.
- A specific mathematical property that we would like to have, which leads necessarily to this particular definition.
Is there any rigorous basis for the definitions in statistics, or do we genuinely just make up formulae that kinda sorta get us something like what we're trying to calculate? Have I just never seen statistics as it really is, or is it actually very different from every other field of mathematics? If the former, can you recommend a book that explains statistics in the way I'd like?