I'm trying to compute the density of the range $R_n$ of $n$ i.i.d. samples of a random variable $X$ uniformly distributed on the interval $(a,b)$. We define the range as
$$ R_n = X_{(n)} - X_{(1)}, $$
where $X_{(i)}$ is the $i$-th smallest of the $n$ samples. The theoretical density (which I got from Allan Gut's An Intermediate Course in Probability) is
$$ f_{R_n}(r) = n (n - 1) \int_{-\infty}^\infty \left(F_X(u + r) - F_X(u)\right)^{n-2} f_X(u + r) f_X(u) \;\text{d}u $$
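Here $F_X$ and $f_X$ are the CDF and density of the uniform distribution on $(a, b)$:
$$ F_X(x) = \frac{x - a}{b - a}, \qquad f_X(x) = \frac{1}{b - a}, \qquad a < x < b, $$
so $F_X(u + r) - F_X(u) = \frac{r}{b - a}$ whenever both $u$ and $u + r$ lie in $(a, b)$.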
Substituting these yields
$$ = n (n - 1) \int_a^{b - r} \left(\frac{r}{b - a}\right)^{n-2}\left(\frac{1}{b - a}\right)^2 \;\text{d}u $$
which finally simplifies to
$$ = n (n - 1) \left(\frac{r}{b - a}\right)^{n - 2}\left(\frac{1}{b - a}\right)^2 (b - a - r), \qquad 0 < r < b - a. $$
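As a sanity check, taking $a = 0$ and $b = 1$ recovers the familiar density of the range of $n$ standard uniforms,
$$ f_{R_n}(r) = n (n - 1)\, r^{n - 2} (1 - r), \qquad 0 < r < 1. $$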
Now, this seems reasonable, but I sort of guessed at the limits of integration: my thinking was that the integrand vanishes unless both $f_X(u)$ and $f_X(u + r)$ are positive, which forces $a < u$ and $u + r < b$, i.e. $a < u < b - r$. Does this look right?
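To check this numerically, here is a minimal Monte Carlo sketch in Python/NumPy (the endpoints $a = 2$, $b = 5$, the sample size $n = 6$, and the helper name `f_range` are my own arbitrary choices) comparing a normalized histogram of simulated ranges with the candidate density:

```python
import numpy as np

# Candidate density of the range R_n = X_(n) - X_(1) derived above,
# for n i.i.d. uniforms on (a, b); supported on 0 < r < b - a.
def f_range(r, a, b, n):
    return n * (n - 1) * (r / (b - a)) ** (n - 2) * (b - a - r) / (b - a) ** 2

a, b, n = 2.0, 5.0, 6       # arbitrary interval and sample size
trials = 200_000

rng = np.random.default_rng(0)
samples = rng.uniform(a, b, size=(trials, n))
ranges = samples.max(axis=1) - samples.min(axis=1)

# Empirical density via a normalized histogram, evaluated against the
# candidate density at the bin centres.
hist, edges = np.histogram(ranges, bins=50, range=(0, b - a), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - f_range(centres, a, b, n))))  # should be small
```

If the density (or the limits of integration behind it) were wrong, the histogram would visibly disagree with `f_range`.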
Edit in response to the question being closed as a duplicate:
As far as I can tell, this question differs from the proposed duplicate because it is more general: it asks about uniform distributions on arbitrary intervals $(a, b)$. It is also more specific, since what confuses me is the domain of integration.