11

The question is:

Let $Y_1 , Y_2, \dots$ be nonnegative i.i.d. random variables with $\mathbb{E}Y_m = 1$ and $\mathbb{P} (Y_m = 1) < 1$. (i) Show that $X_n = \prod_{m \le n} Y_m$ defines a martingale. (ii) Use an argument by contradiction to show $X_n \to 0$ a.s.

(i) is easy to check. For (ii), by the Martingale Convergence Theorem, we can show that $X_n$ converges almost surely to some $X$ with $\mathbb{E}X \le \mathbb{E}X_0 = 1.$ ($X_0$ is not explicitly defined in the question, but to make $X_n$ a martingale we need $X_0 = 1$.)
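
For completeness, the check for (i), with the natural filtration $\mathcal F_n = \sigma(Y_1, \dots, Y_n)$:

$$\mathbb E[X_{n+1} \mid \mathcal F_n] = \mathbb E[X_n Y_{n+1} \mid \mathcal F_n] = X_n\,\mathbb E[Y_{n+1}] = X_n,$$

since $X_n$ is $\mathcal F_n$-measurable and $Y_{n+1}$ is independent of $\mathcal F_n$; integrability follows from $\mathbb E X_n = \prod_{m \le n} \mathbb E Y_m = 1$.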

My guess is that $X = 0$ almost surely must come from the fact that $\mathbb{P} (Y_m = 1) < 1$, but I can't see how to continue from here.

LeafGlowPath
  • Note that if $X_n$ converges to a positive value then $X_n/X_{n-1}=Y_n$ converges to $1$. – Colin McQuillan Feb 20 '13 at 17:42
  • @ColinMcQuillan You should post this as an answer. –  Feb 20 '13 at 18:19
  • See also Theorem 13.2.3 here: www.statslab.cam.ac.uk/~james/Lectures/ap.pdf –  Feb 20 '13 at 18:41
  • @ColinMcQuillan But couldn't there exist a set $A = \{\omega : Y_n(\omega) \to 1\}$ such that $\mathbb P(A) > 0$? – LeafGlowPath Feb 20 '13 at 19:20
  • @ablmf: no: $\mathbb{P}[Y_n\to 1] = 0$. Try showing that there exists $\epsilon>0$ such that for all $N$ we have $\mathbb{P}[|Y_n-1| < \epsilon\text{ for all $n>N$}] = 0$. – Colin McQuillan Feb 20 '13 at 20:31 (sketched after these comments)
  • @ColinMcQuillan I see! Because $Y_1, Y_2, \dots$ are i.i.d., we can find an $\epsilon > 0$ such that $\mathbb P[|Y_n - 1| < \epsilon] < 1$ for all $n$, and independence makes these probabilities multiply out to zero. Therefore $\mathbb P[Y_n \to 1] = 0$. So the i.i.d. assumption is essential here. – LeafGlowPath Feb 20 '13 at 21:31
  • @ColinMcQuillan How can $P[Y_n \rightarrow 1]=0$?? – kayak May 31 '17 at 13:08
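
A sketch of the argument outlined in these comments: since $\mathbb P(Y_1 = 1) < 1$, there is an $\epsilon > 0$ with $q := \mathbb P(|Y_1 - 1| \ge \epsilon) > 0$ (otherwise $\mathbb P(|Y_1 - 1| \ge 1/k) = 0$ for every $k$, which would force $Y_1 = 1$ a.s.). Since the $Y_n$ are i.i.d., for all $M > N$,

$$\mathbb P\big(|Y_n - 1| < \epsilon \text{ for all } N < n \le M\big) = (1-q)^{M-N} \xrightarrow[M\to\infty]{} 0,$$

so $\mathbb P\big(|Y_n - 1| < \epsilon \text{ for all } n > N\big) = 0$ for every $N$; as $\{Y_n \to 1\} \subseteq \bigcup_N \{|Y_n - 1| < \epsilon \text{ for all } n > N\}$, it follows that $\mathbb P(Y_n \to 1) = 0$.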

2 Answers

8

Why an argument by contradiction? Note that $\log X_n$ is the sum of $n$ i.i.d. random variables with mean $m=\mathbb E(\log Y_1)$, hence, if $m\lt0$, by the strong law of large numbers, $\log X_n\to-\infty$.
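
To make the last implication explicit (it is asked about in a comment below): the strong law gives

$$\frac{\log X_n}{n}=\frac1n\sum_{k=1}^n\log Y_k\xrightarrow{\text{a.s.}}m,$$

so if $m\lt0$ then, almost surely, $\log X_n\leqslant\frac{m}{2}\,n$ for all $n$ large enough, and the right-hand side tends to $-\infty$.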

But $m\leqslant\log\mathbb E(Y_1)=0$ by Jensen's inequality, and one knows that this convexity inequality is strict as soon as the random variable $Y_1$ is not almost surely constant. This is exactly what the hypotheses $\mathbb E(Y_1)=1$ and $\mathbb P(Y_1=1)\lt1$ ensure: a constant random variable with mean $1$ would be $1$ almost surely. Hence, $m\lt0$, $\log X_n\to-\infty$ almost surely, and $X_n\to0$ almost surely.
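
A concrete illustration (my example, not part of the original answer): if $Y_1$ takes the values $\frac12$ and $\frac32$ with probability $\frac12$ each, then $\mathbb E(Y_1)=1$ while

$$m=\mathbb E(\log Y_1)=\tfrac12\log\tfrac12+\tfrac12\log\tfrac32=\tfrac12\log\tfrac34\approx-0.144\lt0,$$

so $X_n=\prod_{k\le n}Y_k\to0$ almost surely even though each factor has mean $1$.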

Did
  • Do we need to worry about the case when $E(|\log(Y)|)=\infty$? –  Feb 20 '13 at 18:23
  • @ByronSchmuland Good question. The answer is no because $\log^+Y_1\leqslant Y_1$, hence one would have $E(\log^+Y_1)$ finite and $E(\log^-Y_1)$ infinite, in which case the SLLN applies. – Did Feb 20 '13 at 18:26 (expanded after these comments)
  • Right you are! –  Feb 20 '13 at 18:29
  • after reading the comments, I am still confused why we do not need to worry about $E(|\log Y|) = \infty$ – lll Oct 27 '19 at 21:54
  • I am not sure why $m < 0$ implies $\log X_n \rightarrow -\infty$ a.s. – Gavin Feb 10 '22 at 04:47
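
Expanding the comment above (a standard truncation argument, spelled out here for convenience): since $0\leqslant\log^+y\leqslant y$, one has $E(\log^+Y_1)\leqslant E(Y_1)=1\lt\infty$, so only $E(\log^-Y_1)=\infty$ can occur. In that case, for every $K\gt0$,

$$\frac1n\sum_{k=1}^n\log Y_k\leqslant\frac1n\sum_{k=1}^n\log^+Y_k-\frac1n\sum_{k=1}^n\big(\log^-Y_k\wedge K\big)\xrightarrow{\text{a.s.}}E(\log^+Y_1)-E(\log^-Y_1\wedge K),$$

and the right-hand side tends to $-\infty$ as $K\to\infty$ by monotone convergence. Hence $\frac1n\log X_n\to-\infty$ almost surely and the conclusion $X_n\to0$ is unaffected.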
3

The Hewitt–Savage zero–one law says that $X$ is almost surely constant, since $X$ is measurable with respect to the exchangeable $\sigma$-field. Also, $X=Y_1\cdot\prod_{i=2}^\infty Y_i$ has the same distribution as $Y_1\cdot X$, because $\prod_{i=2}^\infty Y_i$ is independent of $Y_1$ and has the same distribution as $X$. Since $Y_1$ is not constant almost surely, this forces $X=0$.
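
Spelling out the last step: if $X=c$ almost surely, the identity in distribution reads

$$c \;\overset{d}{=}\; c\,Y_1.$$

If $c>0$ this would force $Y_1=1$ almost surely, contradicting $\mathbb P(Y_1=1)<1$; since $X\geq0$, the only remaining possibility is $c=0$.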