
Suppose $\{X_n\}$ is an i.i.d. sequence with $X_n > 0$, and no $X_n$ is almost surely constant.

I want to show that if $\mathbb{E}X_n = q \leq 1$, then the product $\prod\limits_{n=1}^{+\infty}X_n \to 0$ almost surely. In the problem statement, there is also a hint that $q < 1$ and $q = 1$ can be considered separately.

I thought about this one for a long time. Taking logarithms and turning the product into a sum seems like it should help, but I couldn't progress from there: I still don't see how to reduce the problem to a law of large numbers.
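As a quick numerical sanity check (not a proof, just a minimal sketch): simulating a non-constant positive $X$ with $\mathbb{E}X = 1$, the running products do seem to collapse to $0$ along typical sample paths. The two-point distribution and the specific sizes below are only illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def running_product(n_steps, values=(0.5, 1.5), probs=(0.5, 0.5)):
    """One sample path of the partial products prod_{k<=n} X_k."""
    x = rng.choice(values, size=n_steps, p=probs)  # i.i.d. draws with E[X] = 1
    return np.cumprod(x)

for _ in range(3):
    path = running_product(2000)
    print(path[500], path[1000], path[-1])  # tiny and shrinking along each path
```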

I would appreciate any help!

Swistack
  • @GEdgar The problem is that I need to prove that $\sum\limits_{n=1}^{+\infty} \ln{X_n} \to -\infty$ when all the laws of large numbers I know would be about convergence of $\dfrac{1}{N}\sum\limits_{n=1}^N\ln{X_n} \to 0$ when $N \to +\infty$ – Swistack Feb 04 '21 at 21:28
  • @GEdgar Of course, that was a typo on my side. What I don't understand is how it's going to help to prove $\sum\limits_{n=1}^{+\infty} \ln{X_n} \to -\infty$ (which is an equivalent form of what needs to be proved in this problem). – Swistack Feb 04 '21 at 21:36
  • @GEdgar Indeed, because $\sum\limits_{n=1}^N t_n \sim N\mu$ asymptotically. I got it now! I wonder what happens when $\mu = 0$? – Swistack Feb 04 '21 at 21:55
  • Use $X_n$ is not constant to prove $\mu < 0$. – GEdgar Feb 04 '21 at 22:04
  • Are you familiar with martingales? In the case $q=1$, it may be helpful to note that $M_n:=\prod_{i=1}^nX_i$ is a positive martingale. – Jason Feb 04 '21 at 22:10
  • @GEdgar Is that true? If $X_n$ is, for example, 1/2 or 3/2, both with probabilities 1/2, then $EX_n=1$ and $\ln E(X_1) = 0$. – Swistack Feb 04 '21 at 22:20
  • @Jason No, I'm not. Maybe there's a more elementary proof? – Swistack Feb 04 '21 at 22:22
  • In that case proceed with @GEdgar's advice. Note that $\mu = \mathbb E[\log X_1]$. Use Jensen's inequality to compare this to $\log\mathbb E[X_1]$. Then use the fact that $X_1$ is not constant and your knowledge of when Jensen's inequality is an equality. – Jason Feb 04 '21 at 22:28
  • @Jason Thank you! But didn't I present a counterexample to GEdgar comment? – Swistack Feb 04 '21 at 22:35
  • @Jason Of course, it wasn't a counterexample. Thank you, everything worked out. – Swistack Feb 04 '21 at 22:53

2 Answers


Since $X_{n}>0$, we have $q=E[X_{n}]>0$. Define the process $M=\{M_{n}\mid n\in\mathbb{N}\}$ by $M_{n}=\prod_{k=1}^{n}\frac{X_{k}}{q}.$ Let $\mathbb{F}=\{\mathcal{F}_{n}\mid n\in\mathbb{N}\}$ be the natural filtration induced by $M$, i.e., $\mathcal{F}_{n}=\sigma\{M_{1},M_{2},\ldots,M_{n}\}$. Note that we also have $\mathcal{F}_{n}=\sigma\{X_{1},X_{2},\ldots,X_{n}\}.$ We now show that $M$ is an $L^{1}$-bounded $\mathbb{F}$-martingale. Clearly $M_{n}$ is integrable (recall that a product of independent integrable random variables is integrable). Moreover, since $X_{n+1}$ and $\mathcal{F}_{n}$ are independent while $M_{n}$ is $\mathcal{F}_{n}$-measurable, we have \begin{eqnarray*} & & E\left[M_{n+1}\mid\mathcal{F}_{n}\right]\\ & = & E\left[\frac{X_{n+1}}{q}M_{n}\mid\mathcal{F}_{n}\right]\\ & = & M_{n}E\left[\frac{X_{n+1}}{q}\mid\mathcal{F}_{n}\right]\\ & = & M_{n}E\left[\frac{X_{n+1}}{q}\right]\\ & = & M_{n}. \end{eqnarray*} Finally, since $M_{n}>0$, we have $E\left[|M_{n}|\right]=E\left[M_{n}\right]=E[M_{1}]=E[X_{1}/q]=1$. This shows that $\sup_{n}E\left[|M_{n}|\right]<\infty.$

By the Martingale Convergence Theorem, there exists an $\mathcal{F}_{\infty}$-measurable random variable $\xi$ with $E[|\xi|]<\infty$ such that $M_{n}\rightarrow\xi$ pointwise a.e. (note that, in general, we do not have $M_{n}\rightarrow\xi$ in $L^{1}$).

Case 1: $q<1$. We have $\prod_{k=1}^{n}X_{k}=q^{n}M_{n}\rightarrow0$ a.e., because $q^{n}\rightarrow0$ while $M_{n}\rightarrow\xi<\infty$ a.e.

Case 2: $q=1$. Define $a=E\left[\sqrt{X_{n}}\right].$ By the Cauchy-Schwarz inequality, \begin{eqnarray*} a & = & \int\sqrt{X_{n}}\cdot1\,dP\\ & \leq & \left\{ \int X_{n}dP\right\} ^{\frac{1}{2}}\left\{ \int1^{2}dP\right\} ^{\frac{1}{2}}\\ & = & 1 \end{eqnarray*} (using $q=1$), and equality holds iff $\sqrt{X_{n}}$ and $1$ are linearly dependent as elements of $L^{2}$, which is false since $X_{n}$ is not a.s. constant. Therefore, $0<a<1.$
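For instance, with the two-point example raised in the comments ($X_{n}=1/2$ or $3/2$, each with probability $1/2$, so $q=1$), a quick check gives

$$a=\frac{1}{2}\left(\sqrt{\tfrac{1}{2}}+\sqrt{\tfrac{3}{2}}\right)\approx\frac{0.707+1.225}{2}\approx 0.966<1.$$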

Define a process $Y=\{Y_{n}\mid n\in\mathbb{N}\}$ by $Y_{n}=\frac{\sqrt{M_{n}}}{a^{n}}.$ It can be proved similarly that $Y$ is an $\mathbb{F}$-martingale. Indeed, by the Cauchy-Schwarz inequality, $M_{n}$ integrable implies $\sqrt{M_{n}}$ integrable, and clearly $Y_{n}$ is $\mathcal{F}_{n}$-measurable. Moreover (using $q=1$, so that $M_{n+1}=M_{n}X_{n+1}$), \begin{eqnarray*} & & E\left[Y_{n+1}\mid\mathcal{F}_{n}\right]\\ & = & E\left[\frac{\sqrt{X_{n+1}}}{a}\cdot Y_{n}\mid\mathcal{F}_{n}\right]\\ & = & Y_{n}E\left[\frac{\sqrt{X_{n+1}}}{a}\mid\mathcal{F}_{n}\right]\\ & = & Y_{n}E\left[\frac{\sqrt{X_{n+1}}}{a}\right]\\ & = & Y_{n}. \end{eqnarray*} Since $Y$ is non-negative, we have $E\left[|Y_{n}|\right]=E\left[Y_{n}\right]=E\left[Y_{1}\right].$ It follows that $\sup_{n}E\left[|Y_{n}|\right]<\infty$, i.e., $Y$ is $L^{1}$-bounded. By the Martingale Convergence Theorem again, there exists an integrable random variable $\eta$ such that $Y_{n}\rightarrow\eta$ a.e. Recall that $M_{n}\rightarrow\xi$ a.e. and notice that $M_{n}=a^{2n}Y_{n}^{2}$. Letting $n\rightarrow\infty$, we obtain $\xi=0$ a.e., because $a^{2n}\rightarrow0$ while $Y_{n}^{2}\rightarrow\eta^{2}<\infty$ a.e. Finally, since $q=1$, we have $\prod_{k=1}^{n}X_{k}=M_{n}\rightarrow\xi=0$ a.e., which completes the proof.
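To make Case 2 concrete, here is a small simulation sketch (again using the illustrative two-point distribution $X\in\{1/2,3/2\}$ with mean $1$, so that $M_n=\prod_{k\le n}X_k$). The exact mean $E[M_n]$ is $1$ for every $n$ by the martingale property, yet typical paths collapse to $0$; for larger $n$ even the empirical mean drifts below $1$, because the expectation is carried by ever rarer, huge paths, which is precisely the failure of $L^1$ convergence noted above.

```python
import numpy as np

rng = np.random.default_rng(1)

# 5000 independent paths of M_n = prod_{k<=n} X_k, with X in {1/2, 3/2}
# equally likely, so E[X] = 1 (q = 1) and M_n is the product itself.
n_paths, n_steps = 5_000, 1_000
x = rng.choice([0.5, 1.5], size=(n_paths, n_steps))
m = np.cumprod(x, axis=1)

for n in (5, 20, 100, n_steps):
    col = m[:, n - 1]
    # E[M_n] is exactly 1 for every n (martingale property), but the empirical
    # mean loses track of it as n grows: the expectation is carried by ever
    # rarer huge paths, while the median and typical paths tend to 0.
    print(f"n={n:4d}  empirical mean={col.mean():.3e}  median={np.median(col):.3e}")
```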


Apologies for my first hasty attempt; perhaps this will be more satisfactory. As you noted, the problem is equivalent to showing that $\sum_i \log(X_i) = -\infty$ a.s.

First, since the logarithm is concave but not affine, and $X_i$ is not a.s. constant, Jensen's inequality gives

$$0 = \log(1) \geq \log q = \log(E[X_i]) > E[\log(X_i)] := \mu.$$

See, e.g., the question "Convexity and equality in Jensen inequality" for when Jensen's inequality can be an equality. In our case the inequality must be strict, which is what we used in the previous line.
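For a concrete instance of this strict inequality, take the two-point example from the comments, $X_i = 1/2$ or $3/2$ with probability $1/2$ each: then $E[X_i]=1$, yet

$$\mu = E[\log X_i] = \tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{2}\log\tfrac{3}{2} = \tfrac{1}{2}\log\tfrac{3}{4} < 0.$$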

Thus it suffices to show (set $Y_i := -\log(X_i)$) the following:

Theorem. If $Y_i$ are i.i.d. with $\mu := E[Y_i] > 0$ (note that this $\mu$ is $-E[\log X_i]$, the negative of the $\mu$ above), then $\sum_i Y_i = \infty$ a.s.

Lemma. For any real-valued sequence $(y_i)_{i\ge 1}$, if $\frac{1}{n} \sum_{i=1}^n y_i \to \mu > 0$, then $\sum_i y_i = \infty$.

Proof of lemma. Let any subsequence $n_k \to \infty$ be given. Then $$\sum_{i=1}^{n_k} y_i =\left(\frac{1}{n_k}\sum_{i=1}^{n_k} y_i\right) n_k \to \mu \cdot \infty = \infty.$$ In particular this shows $\liminf_n \sum_{i=1}^{n} y_i = \infty$, so $\sum_i y_i = \infty$. QED.

Proof of theorem. If $\mu < \infty$, then by the strong law of large numbers $\frac{1}{n}\sum_{i=1}^n Y_i \to \mu > 0$ a.s. By the lemma, $\sum_i Y_i = \infty$ a.s. in this case.

Now consider the case $\mu = \infty$. Write $Y_i = Y_i^+ - Y_i^-$ with $Y_i^+ = \max(Y_i, 0)$ and $Y_i^-= \max(-Y_i, 0)$, the usual decomposition of $Y_i$ into its positive and negative parts. Since $E[Y_i] = \infty > -\infty$, it must be that $E[Y_i^-] \in [0,\infty)$ and $E[Y_i^+] = \infty$. (In our application the expectation is indeed well defined, since $Y_i^- = \log^+ X_i \leq X_i$, so $E[Y_i^-] \leq E[X_i] < \infty$.) Consider the truncated variables $\min(Y_i, n)$ for $n \geq 0$; checking the cases $Y_i \geq 0$ and $Y_i < 0$ separately shows $\min(Y_i, n) = \min(Y_i^+, n) - Y_i^-$ pointwise, so $$ E[\min(Y_i, n)] = E[\min(Y_i^+,n)] - E[Y_i^-] \to \infty $$ by monotone convergence of $\min(Y_i^+,n) \uparrow Y_i^+$. In particular we may choose $N$ large enough that $0 < E[\min(Y_i, N)] \leq N < \infty$. The random variables $(\min(Y_i, N))_i$ are i.i.d. with mean in $(0,\infty)$, so by what we showed earlier $\sum_i \min(Y_i, N) = \infty$ a.s. Since $\min(Y_i, N) \leq Y_i$ termwise, it follows that

$$ \infty = \sum_i \min(Y_i, N) \leq \sum_i Y_i $$ a.s., QED.

nullUser
  • The Lemma can be proved directly without invoking concepts like $\liminf$. Denote $S_n = \sum_{k=1}^n y_k$ and $A_n=\frac{1}{n}\sum_{k=1}^n y_k$. Note that $S_n = nA_n$. Since $A_n\rightarrow \mu>0$, therefore $S_n\rightarrow +\infty$. – Danny Pak-Keung Chan Feb 05 '21 at 19:53