If we have a sample of $n$ values from a given population and $X$ is the variable observed in the sample, then the mean of $X$ is just $\dfrac{ \sum x }{n}$.
Now, suppose $X$ is a random variable. For concreteness, let us take $X$ to be the number of heads in a single toss of a fair coin, so $X$ can be 0 or 1. The mean in this case is $\dfrac{0+1}{2} = \dfrac{1}{2}$, which coincides with the sample formula above.
In general, for a discrete random variable $X$, the mean is defined as $\sum x \, P(X=x)$.
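As a small worked instance of this general formula (just for illustration), take a coin with $P(X=1)=p$ and $P(X=0)=1-p$. Then $$E[X] = 0\cdot(1-p) + 1\cdot p = p,$$ which agrees with the equally weighted average $\dfrac{0+1}{2}$ only when $p = \dfrac{1}{2}$.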
But how is this different from the mean for samples? What is the motivation for this definition?