I'm trying to prove convergence in expectation for $r\ge 1$ but I don't know how to do it without relying on the Beta function to solve an integral. (I didn't learn about the Beta function's properties, so I'm not sure if it's the intended way).
My attempt
We need to show $\newcommand{\E}{\mathbb{E}} \newcommand{\P}{\mathbb{P}}Y_n \xrightarrow{L_r}0$ for $r\ge 1$, where $Y_n = \min\{X_1,\dots, X_n\}$ and the $X_i \sim \mathcal{U}[0,1]$ are i.i.d.
So what we really need to show is:
$$\lim_{n \to \infty} \E(|Y_n-0|^r) = 0$$
We'll note that, using the definition of $Y_n$ and the Law of the Unconscious Statistician: $$\E(|Y_n-0|^r)= \E(|\min\{X_1, \dots, X_n\}|^r) = \E\left[(\min\{X_1, \dots, X_n\})^r\right] = \E(Y_n^r) = \int_0^1 y^r f_{Y_n}(y)\,dy.$$Let's find the PDF: $$\begin{aligned}F_{Y_n}(y) &= \P(Y_n \le y) = 1 - \P(Y_n > y) = 1-\P(\min\{X_1,\dots,X_n\}> y) = 1-\prod_{i=1}^n \P(X_i>y) \\ F_{Y_n}(y)&= 1- (1-y)^n \\ \frac{dF_{Y_n}}{dy} = f_{Y_n}(y)&=-n \cdot (1-y)^{n-1} \cdot (-1) = n(1-y)^{n-1}\end{aligned}$$
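As a quick sanity check on the derived CDF (a simulation I added, not part of the derivation itself): the empirical probability $\P(Y_n \le y)$ should match $1-(1-y)^n$.

```python
import random

# Sanity check: the empirical CDF of Y_n = min(X_1, ..., X_n) at a point y
# should match F(y) = 1 - (1-y)^n. Values of n, y here are arbitrary choices.
rng = random.Random(42)
n, trials, y = 5, 100_000, 0.3

hits = sum(min(rng.random() for _ in range(n)) <= y for _ in range(trials))
print(hits / trials)       # empirical P(Y_5 <= 0.3)
print(1 - (1 - y) ** n)    # theoretical value, approx. 0.83193
```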
Substituting it back in:
$$\E(Y_n^r) = \int_0^1 y^r \cdot n(1-y)^{n-1}\,dy = n\int_0^1y^r(1-y)^{n-1}\,dy$$
From googling, this integral is the Beta function: it equals $n\,\mathrm{B}(r+1,n)$.
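A hedged numerical check of that identity (my addition): compute $n\,\mathrm{B}(r+1,n)$ via $\mathrm{B}(a,b)=\Gamma(a)\Gamma(b)/\Gamma(a+b)$ and compare it against a Monte Carlo estimate of $\E[(\min_i X_i)^r]$.

```python
import math
import random

# B(a,b) = Gamma(a) * Gamma(b) / Gamma(a+b)
def beta_fn(a, b):
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# Candidate closed form for E[Y_n^r]: n * B(r+1, n)
def exact_moment(r, n):
    return n * beta_fn(r + 1, n)

# Monte Carlo estimate of E[(min X_i)^r] with X_i ~ U[0,1] i.i.d.
def mc_moment(r, n, trials=200_000, seed=0):
    rng = random.Random(seed)
    return sum(min(rng.random() for _ in range(n)) ** r
               for _ in range(trials)) / trials

r, n = 2, 10
print(exact_moment(r, n))  # 2/((n+1)(n+2)) = 2/132, approx. 0.01515
print(mc_moment(r, n))     # should agree up to sampling noise
```

For integer $r$ this reduces to $n!\,r!/(n+r)!$, which visibly tends to $0$ as $n\to\infty$.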
Another approach I tried was switching the order of the limit and the integral, but since this is a statistical inference course and doing so requires a theorem I haven't seen in any class either, it seems unintended. Is there a way to do this with statistical and probability-based tools, or to circumvent this specific integral entirely?
- Have you tried using characteristic functions already? – julio_es_sui_glace Dec 20 '24 at 15:15
- This course didn't really focus on them; we did discuss moment generating functions in a previous one, though. That's an avenue I haven't explored, I'll look into it! Thanks! – kal_elk122 Dec 20 '24 at 15:25
3 Answers
You don't need the Beta function; it suffices to use the Monotone Convergence Theorem for non-negative decreasing sequences of functions, knowing that the functions $f_n(y):= ny^r(1-y)^{n-1}$ are (eventually) decreasing pointwise and non-negative. Then
$$\lim_{n\to\infty}\mathbb{E}(Y^r_n) = \lim_{n\to\infty}\int_0^1f_n(y)\,dy=\int_0^1\underbrace{\lim_{n\to\infty}f_n(y)}_{=\,0}\,dy =0$$
- Don't we need the functions to also decrease monotonically? And doesn't the $n$ factor in $f_n(y)$ ruin it for us? – kal_elk122 Dec 20 '24 at 16:21
- @kal_elk122 We need the functions to decrease and converge pointwise. That means for every $y$, the sequence $(f_n(y))_n$ decreases beyond a certain $n(y)$ and converges. – NN2 Dec 20 '24 at 17:19
$\newcommand{\E}{\mathbb{E}} \newcommand{\P}{\mathbb{P}}$We want to compute the integral $$I(r,n):=\E(Y_n^r) = n\int_0^1y^r(1-y)^{n-1}\,dy= n\int_0^1(1-y)^ry^{n-1}\,dy,$$ where the last equality is the substitution $y \mapsto 1-y$. Integrating by parts yields \begin{align}I(r,n)&=-n\left.\frac{(1-y)^{r+1}}{r+1}y^{n-1}\right|_0^1+n(n-1)\int_0^1\frac{(1-y)^{r+1}}{r+1}y^{n-2}\,dy\\&=\frac{n}{r+1}I(r+1,n-1),\end{align} since the boundary term vanishes. Using this recursion it is easy to get your result.
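The recursion can be checked numerically (a sketch I added, using a simple midpoint-rule quadrature rather than any special function):

```python
# Check the recursion I(r,n) = n/(r+1) * I(r+1, n-1), where
# I(r,n) = n * integral_0^1 y^r (1-y)^(n-1) dy, via the midpoint rule.
def I(r, n, steps=100_000):
    h = 1.0 / steps
    return n * h * sum(((k + 0.5) * h) ** r * (1 - (k + 0.5) * h) ** (n - 1)
                       for k in range(steps))

lhs = I(2, 6)
rhs = 6 / (2 + 1) * I(3, 5)
print(lhs, rhs)  # both approx. 1/28 (= 6! * 2! / 8! for integer r)
```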
$\Bbb E[Y_n]={1\over n+1}$, so $\|Y_n\|_1\to 0$ as $n\to\infty$. In particular, $Y_n\to 0$ in probability, so by dominated convergence ($0\le Y_n^r\le 1$) you also have $\Bbb E[Y_n^r]\to 0$ as $n\to\infty$, for each $r>0$.
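A Monte Carlo sanity check of this answer's starting point (my addition): $\E[Y_n] = \frac{1}{n+1}$, and since $0 \le Y_n \le 1$, every higher moment is smaller still.

```python
import random

# Check E[Y_n] = 1/(n+1); since 0 <= Y_n <= 1, Y_n^r <= Y_n for r >= 1,
# so the r-th moment is squeezed to 0 along with the first.
rng = random.Random(1)
n, trials = 20, 100_000
mins = [min(rng.random() for _ in range(n)) for _ in range(trials)]

m1 = sum(mins) / trials
m3 = sum(y ** 3 for y in mins) / trials
print(m1)  # approx. 1/21
print(m3)  # smaller still
```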