
I'm trying to compute the density of the range $R_n$ of $n$ samples of a random variable $X$ that is uniformly distributed on the interval $(a,b)$. We define the range as

$$ R_n = X_{(n)} - X_{(1)}, $$

where $X_{(i)}$ is the $i$-th smallest sample of $X$. The theoretical density (which I got from Allan Gut's Intermediate Probability) is

$$ f_{R_n}(r) = n (n - 1) \int_{-\infty}^\infty \bigl(F_X(u + r) - F_X(u)\bigr)^{n-2} f_X(u + r)\, f_X(u) \;\text{d}u $$

and substituting $F$'s and $f$'s yields

$$ = n (n - 1) \int_a^{b - r} \left(\frac{r}{b - a}\right)^{n-2}\left(\frac{1}{b - a}\right)^2 \;\text{d}u $$

which finally simplifies to

$$ = n (n - 1) \left(\frac{r}{b - a}\right)^{n - 2}\left(\frac{1}{b - a}\right)^2 (b - a - r), \qquad 0 < r < b - a. $$

Now, this seems reasonable, but I sort of guessed at the limits of integration. Does this look right?
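
As a quick sanity check, a simulation along the following lines compares a histogram of simulated ranges with the candidate density. This is only a rough sketch using NumPy; the values $a = 2$, $b = 5$, $n = 5$ and the replication count are arbitrary choices for illustration, not anything taken from Gut's book.

```python
import numpy as np

# Monte Carlo sanity check of the candidate range density for Uniform(a, b).
# The parameters below (a = 2, b = 5, n = 5, 200000 replications) are
# arbitrary choices for illustration only.
a, b, n = 2.0, 5.0, 5
rng = np.random.default_rng(0)

# Draw many samples of size n and compute the range of each one.
samples = rng.uniform(a, b, size=(200_000, n))
ranges = samples.max(axis=1) - samples.min(axis=1)

def f_R(r):
    """Candidate density: n(n-1) (r/(b-a))^(n-2) (b-a-r) / (b-a)^2 on (0, b-a)."""
    return n * (n - 1) * (r / (b - a)) ** (n - 2) * (b - a - r) / (b - a) ** 2

# Compare a histogram estimate of the density with the formula at bin midpoints.
edges = np.linspace(0.0, b - a, 16)
hist, _ = np.histogram(ranges, bins=edges, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
for r, emp in zip(mids, hist):
    print(f"r = {r:4.2f}   empirical = {emp:6.3f}   formula = {f_R(r):6.3f}")
```

The empirical and theoretical values agree closely, but that still doesn't tell me whether the limits of integration were the right way to get there.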


Edit in response to closing as duplicate:

As far as I can tell, this question is different from the one it was marked as a duplicate of because it is more general: I am asking about uniform distributions on arbitrary intervals $(a,b)$. It is also more specific, since my confusion is about the domain over which I am integrating.

nomen

1 Answer


I think it is OK. The density of the Uniform$(a,b)$ distribution is given by $$f_X(u) = \frac{1}{b-a} 1_{(a,b)}(u)$$ so the product $$f_X(u+r)f_X(u) = \frac{1}{(b-a)^2} 1_{(a,b)}(u+r) 1_{(a,b)}(u)$$ is equivalent to $$ \frac{1}{(b-a)^2} 1_{(a-r,b-r)}(u) 1_{(a,b)}(u) = \frac{1}{(b-a)^2} 1_{(a,b-r)}(u)$$ provided $0 < r < b-a$, which gives exactly the limits $a$ and $b-r$ that you used.
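
If you want to double-check this numerically, a short SymPy sketch along the following lines (the concrete $n = 5$ is an arbitrary choice, and SymPy is just one convenient way to do it) confirms that integrating the constant-in-$u$ integrand over $u \in (a, b - r)$ reproduces the closed form, and that the resulting density integrates to $1$ over $0 < r < b - a$:

```python
import sympy as sp

# Symbolic check of the integration limits: the indicator product restricts u
# to (a, b - r), and integrating the u-free integrand over that interval
# should give the closed-form density, which should in turn integrate to 1
# over 0 < r < b - a.  The concrete n = 5 is an arbitrary choice.
a, b, r, u = sp.symbols('a b r u', positive=True)
n = 5

integrand = n * (n - 1) * (r / (b - a)) ** (n - 2) / (b - a) ** 2
f_R = sp.simplify(sp.integrate(integrand, (u, a, b - r)))
print(f_R)    # expect n*(n-1)*r**(n-2)*(b - a - r)/(b - a)**n

total = sp.simplify(sp.integrate(f_R, (r, 0, b - a)))
print(total)  # expect 1
```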

Bunder