Forgive me if this question seems stupid; I'm not very good with limits. Please tell me what is wrong with the following approach:
Let $(X_n)$ be a sequence of random variables that converges in probability to a random variable $X$. Then by definition, for every $\epsilon > 0$: $$\lim\limits_{n\to\infty} P(|X_n - X| \ge \epsilon) = 0.$$
And we know that $$P(|X_n - X| \ge \epsilon) + P(|X_n - X| < \epsilon) = 1.$$
Taking limits as $n \to \infty$ on both sides (the first limit exists by assumption, so the second one exists as well):
$$\lim\limits_{n\to \infty} P(|X_n - X| \ge \epsilon) + \lim\limits_{n\to \infty} P(|X_n - X| < \epsilon) = 1.$$
Therefore, for every $\epsilon > 0$, $\lim\limits_{n\to \infty} P(|X_n - X| < \epsilon) = 1.$
And hence, $P\left(\left\{ \omega \in \Omega : \lim\limits_{n\to \infty} X_n(\omega) = X(\omega)\right\}\right) = 1.$
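To make the gap I'm asking about explicit (this is my understanding of the two statements; please correct me if I'm misreading either one), the previous step gives, for every fixed $\epsilon > 0$,
$$\lim\limits_{n\to \infty} P(|X_n - X| < \epsilon) = 1,$$
whereas the conclusion is the almost-sure statement
$$P\left(\left\{ \omega \in \Omega : \lim\limits_{n\to \infty} X_n(\omega) = X(\omega)\right\}\right) = 1,$$
in which the limit now sits inside the probability.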
Where am I wrong? Please note that I'm not asking for a counterexample; I'm asking where my proof itself goes wrong.