
I'm having issues understanding how to approach this question. Let $X_1, X_2, \ldots$ be random variables taking values in $(0,1)$, with no assumptions on their distributions.

Prove that the following are equivalent.

  1. $\forall \epsilon, \delta \in (0,1), \exists n_0 \in \mathbb{N}$ such that $\forall n\ge n_0, P[X_n > \epsilon] < \delta$

  2. $\lim_{n\to\infty} E(X_n) = 0$

How can I even relate $P[X_n > \epsilon]$ to $E(X_n)$ if I can't assume anything about how the $X_n$ are distributed? Any suggestions would be appreciated.

2 Answers


You can say that $$ E[X_{n}] = E[X_{n} \cdot I_{\{X_{n}> \varepsilon\}}] + E[X_{n} \cdot I_{\{X_{n} \leq \varepsilon\}}] $$ where $I_{\{X_{n}> \varepsilon\}}$ and $I_{\{X_{n} \leq \varepsilon\}}$ are indicator functions.

Note then that $$ E[X_{n}] \leq E[1 \cdot I_{\{X_{n} > \varepsilon\}}] + E[\varepsilon \cdot I_{\{X_{n} \leq \varepsilon\}}] $$ and that the expected value of an indicator is a probability.
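Writing the resulting bound out, together with a Markov-type bound for the converse direction (the second inequality is my own addition, not part of the answer above):

$$ E[X_{n}] \leq P[X_{n} > \varepsilon] + \varepsilon \qquad \text{and} \qquad P[X_{n} > \varepsilon] \leq \frac{E[X_{n}]}{\varepsilon}. $$

The first inequality gives $(1) \Rightarrow (2)$: given $\eta > 0$, take $\varepsilon = \delta = \eta/2$, so that $E[X_n] < \eta$ for all $n \ge n_0$. The second (Markov's inequality, valid since $X_n \geq 0$) gives $(2) \Rightarrow (1)$: for fixed $\varepsilon$, $P[X_n > \varepsilon] \leq E[X_n]/\varepsilon \to 0$.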

Rubarb

Basically this equivalence holds because, in this case, convergence in probability (statement 1) is induced by a metric, as previously shown here.

In particular this shows that (since in your case $0 \leq X_i \leq 1$):

$$ X_i \overset{p}{\to} 0 \Leftrightarrow E\left[\frac{X_i}{1+X_i}\right] \to 0$$

Now the left-hand side is equivalent to your statement (1) by definition of convergence in probability, while the right-hand side is equivalent to statement (2), since $0 \leq \frac{X_i}{1+X_i} \leq X_i \leq 2\frac{X_i}{1+X_i}$ (because $2 \geq 1+X_i$).
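To spell out that last step (my own expansion, not in the original answer): taking expectations across the chain of inequalities gives

$$ E\left[\frac{X_i}{1+X_i}\right] \leq E[X_i] \leq 2\,E\left[\frac{X_i}{1+X_i}\right], $$

so $E[X_i] \to 0$ if and only if $E\left[\frac{X_i}{1+X_i}\right] \to 0$.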

air