
Let $X_1,X_2,\ldots$ be a sequence of random variables.

The weak (strong) law of large numbers states:

If $X_1,X_2,\ldots$ are i.i.d. random variables with finite expectation $m$, then $\frac{X_1+\dots+X_n}{n}\rightarrow m$ in probability (almost surely).
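As a quick numerical sketch of the i.i.d. statement (illustration only, not part of the original post), one can simulate the sample mean of Exponential(1) variables, whose common expectation is $m = 1$, and watch the average approach $m$ as $n$ grows:

```python
import random

# Simulate the sample mean of i.i.d. Exponential(1) draws; their common
# expectation is m = 1, so the law of large numbers predicts the average
# tends to 1 as n grows.
random.seed(0)

def sample_mean(n):
    """Average of n i.i.d. Exponential(1) draws."""
    return sum(random.expovariate(1.0) for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```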

I wonder whether these laws hold without the independence/identical-distribution assumptions, or whether one of those assumptions can be replaced by some other condition. Thanks for any input.

luka5z
  • You can replace i.i.d. by pairwise independence. Etemadi gave a proof of this in 1981 (http://link.springer.com/article/10.1007%2FBF01013465) in the PTRF journal. – Dave Nguyen Jul 18 '15 at 17:09
  • See this question for a version of the strong law with some dependence assumptions.

    Generally, you can prove the strong law directly via a fourth-moment version of Chebyshev's inequality if you assume a finite fourth moment, and in carrying out this calculation you can get away with both some dependence and even different distributions.

    – Alex R. Jul 18 '15 at 17:09

2 Answers


A theorem by Markov states that if a sequence of random variables $X_1, X_2, \ldots$ with finite variances satisfies one of the following conditions:

  • $\lim_{n \to \infty} \frac{1}{n^2} \mathrm{Var}\left(\sum_{i = 1}^n X_i\right) = 0$;
  • $X_1, X_2, \ldots$ are independent and $\lim_{n \to \infty}\frac{1}{n^2}\sum_{i = 1}^n \mathrm{Var} X_i = 0$;

then the sequence $Y_n = \frac{1}{n}\sum_{i=1}^n (X_i - \mathsf{E} X_i)$ converges to $0$ in probability as $n \to \infty$.
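As a hedged numerical sketch of the second condition (my illustration, not part of the original answer): take independent but not identically distributed $X_i \sim \mathcal{N}(0, \sqrt{i})$, so that $\frac{1}{n^2}\sum_{i=1}^n \mathrm{Var}\, X_i \approx \frac{2}{3} n^{-1/2} \to 0$, and check that the centered average shrinks:

```python
import random

# Independent but NOT identically distributed X_i ~ Normal(0, sqrt(i)).
# Then (1/n^2) * sum_{i<=n} Var(X_i) ~ (2/3) * n^{-1/2} -> 0, so Markov's
# second condition holds and the centered average tends to 0 in probability.
random.seed(1)

def markov_ratio(n):
    """(1/n^2) * sum_{i=1}^n Var(X_i), with Var(X_i) = sqrt(i)."""
    return sum(i ** 0.5 for i in range(1, n + 1)) / n ** 2

def centered_average(n):
    """Average of X_1, ..., X_n where X_i ~ Normal(0, sqrt(i)), E[X_i] = 0."""
    # standard deviation of X_i is (sqrt(i))**0.5 = i**0.25
    return sum(random.gauss(0.0, i ** 0.25) for i in range(1, n + 1)) / n

for n in (100, 10_000):
    print(n, markov_ratio(n), centered_average(n))
```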

In addition, if the random variables $X_1, X_2, \ldots$ are identically distributed, have finite variance, and are merely uncorrelated (rather than independent), then the proof of the weak law of large numbers via Chebyshev's inequality still goes through.

EDIT: Corrected the first condition, thanks to @Michael.

Budenn
  • Thanks for the interesting answer; what about the strong law? And what is $\mu$ in this case? – luka5z Jul 18 '15 at 17:33
  • $\mu$ is a leftover from an extra condition that $X_1, X_2, \ldots$ have the same expectation $\mu$; I edited it out. For the strong law, when the results mentioned in the comments to the question do not apply, I suspect that stricter constraints on the variances or other absolute moments (similar to those in http://math.stackexchange.com/a/1351286/250821) are necessary. – Budenn Jul 18 '15 at 17:53
  • This is false. The first condition is not sufficient for convergence, even if we augment that condition by assuming $\{X_i\}_{i=1}^{\infty}$ are mutually independent. See https://math.stackexchange.com/questions/3975818/law-of-large-numbers-without-independence-and-identical-distributed-assumption – Michael Jan 07 '21 at 13:42

Since the other answer uses independence, I want to present a version that doesn't need any independence at all.

Let $X_n$ be random variables satisfying:

  • $\mathbb{E}[X_n^2] \leq C$ for some constant $C$ not depending on $n$;
  • $|\mathrm{Cov}(X_m, X_n)| \leq r(|m - n|)$, where $r : \mathbb{N}_0 \to [0,\infty)$ satisfies $\lim\limits_{k \to \infty} r(k) = 0$.

Then, $$ \frac 1 n \sum\limits_{l=1}^n (X_l - \mathbb{E}[X_l]) \to 0 $$ in probability and in $L^2$ as $n \to \infty$.

Thus the weak law also holds for random variables that are not uncorrelated but merely decorrelate, i.e. whose correlations decay as the indices move apart.
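As a hedged numerical sketch (my illustration, not part of the original answer): an AR(1) sequence $X_n = a X_{n-1} + Z_n$ with $|a| < 1$ and i.i.d. standard normal $Z_n$ is genuinely dependent, but $|\mathrm{Cov}(X_m, X_n)|$ is bounded by a multiple of $a^{|m-n|} \to 0$, so the decorrelation conditions hold (with $\mathbb{E}[X_n] = 0$ throughout) and the running average shrinks:

```python
import random

# AR(1) sequence X_n = a*X_{n-1} + Z_n with |a| < 1 and i.i.d. N(0,1)
# innovations Z_n.  The X_n are dependent, but their covariances decay
# geometrically in |m - n|, so the decorrelation condition is satisfied
# and the running average should tend to 0.
random.seed(2)

def ar1_average(n, a=0.7):
    """Running average of the first n terms of an AR(1) sequence started at 0."""
    x, total = 0.0, 0.0
    for _ in range(n):
        x = a * x + random.gauss(0.0, 1.0)
        total += x
    return total / n

for n in (100, 10_000, 100_000):
    print(n, ar1_average(n))
```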