
Forgive me if this question seems stupid because I'm not very good with limits. Please tell me what is wrong with the following approach:

Let $(X_n)$ be a sequence of random variables that converges in probability to a random variable $X$. Then by definition, for every $\epsilon>0$, $$\lim\limits_{n\to\infty} P(|X_n - X| \ge \epsilon) = 0.$$

And we know that $$P(|X_n - X| \ge \epsilon) + P(|X_n - X| < \epsilon) = 1.$$

Taking limits as $n \to \infty$ on both sides,

$$\lim\limits_{n\to \infty} P(|X_n - X| \ge \epsilon) + \lim\limits_{n\to \infty}P(|X_n - X| < \epsilon) = 1.$$

Therefore, since the first limit is $0$, we get $\lim\limits_{n\to \infty}P(|X_n - X| < \epsilon) = 1.$

And hence, $P\left(\{ \omega \in \Omega : \lim\limits_{n\to \infty}X_n(\omega) = X(\omega)\}\right)=1.$
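
To be explicit about what that last probability refers to, the event inside it can be written out with quantifiers (the usual $\epsilon$-$N$ form of a limit, with $\epsilon = 1/k$ running over the positive integers):

$$\left\{ \omega \in \Omega : \lim_{n\to\infty} X_n(\omega) = X(\omega)\right\} \;=\; \bigcap_{k=1}^{\infty}\,\bigcup_{N=1}^{\infty}\,\bigcap_{n=N}^{\infty}\left\{ \omega : |X_n(\omega) - X(\omega)| < \tfrac{1}{k}\right\},$$

while the line above it only involves the single event $\{|X_n - X| < \epsilon\}$ for one $n$ at a time.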

Where am I wrong? Please note that I'm not asking for a counterexample and I'm asking where my proof is wrong.

Raghav
  • Your error is in the very last step. You cannot just move the limit inside the probability. The reason for this is best illuminated by a counterexample, despite your explicitly asking not to give you one. – angryavian Apr 24 '20 at 17:34
  • Use MathJax for typesetting math. – StubbornAtom Apr 24 '20 at 19:19
  • Almost sure convergence is equivalent to: for all $\epsilon>0$, $$\lim_{n\rightarrow\infty}P\Bigl[\bigcap_{i=n}^{\infty} \{|X_i-X|<\epsilon\}\Bigr]=1,$$ which is much stronger than your result $\lim_{n\rightarrow\infty} P[|X_n-X|<\epsilon]=1$. – Michael Apr 25 '20 at 01:28
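
To see why the condition in the last comment is stronger, note that the intersection is contained in each of its terms, so for every $n$ $$P\Bigl[\bigcap_{i=n}^{\infty} \{|X_i-X|<\epsilon\}\Bigr] \;\le\; P\bigl[\,|X_n-X|<\epsilon\,\bigr];$$ hence if the left-hand side tends to $1$, so does the right-hand side, but not conversely.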

0 Answers