I found the theorem below at https://math.stackexchange.com/a/1366549/533565 and want to cite it in my project, but I cannot find it in any textbook I have, nor in any published paper on Google Scholar. Does anybody know where this theorem comes from?
A theorem by Markov states that if a sequence of random variables $X_1, X_2, \ldots$ with finite variances satisfies one of the following conditions:
- $\lim_{n \to \infty} \frac{1}{n^2}\mathrm{Var}\left(\sum_{i=1}^n X_i\right) = 0$;
- $X_1, X_2, \ldots$ are independent and $\lim_{n \to \infty}\frac{1}{n^2}\sum_{i = 1}^n \mathrm{Var}\, X_i = 0$;
then the sequence $Y_n = \frac{1}{n}\sum_{i=1}^n (X_i - \mathsf{E} X_i)$ converges to $0$ in probability as $n \to \infty$.
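For context (this is my own sketch of the standard argument, not taken from the linked answer): writing $S_n = \sum_{i=1}^n X_i$, Chebyshev's inequality gives, for every $\varepsilon > 0$,

$$\mathsf{P}\bigl(|Y_n| \ge \varepsilon\bigr) = \mathsf{P}\bigl(|S_n - \mathsf{E} S_n| \ge n\varepsilon\bigr) \le \frac{\mathrm{Var}\, S_n}{n^2 \varepsilon^2} \xrightarrow[n \to \infty]{} 0,$$

and when the $X_i$ are independent, $\mathrm{Var}\, S_n = \sum_{i=1}^n \mathrm{Var}\, X_i$, so the second condition is a special case of the first.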