I want to solve the following exercise, but I'm having problems handling all the information given.
Let $(Y_n)$ be a sequence of real random variables, and let $m \in \mathbb{R}$ and $\sigma > 0$ be real numbers such that, as $n \rightarrow +\infty$, $$\sqrt{n}(Y_n - m) \Rightarrow \mathscr{N}(0,\sigma^2).$$
Moreover, $Y_n = (X_1 + \dots + X_n)/n$, where the $X_i$ are i.i.d. random variables with mean $m$ and variance $\sigma^2$. We already know that $Y_n$ converges to $m$ in probability.
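To convince myself that I understand the setup, I ran a quick simulation (my own choice: $X_i \sim \mathrm{Exp}(1)$, so $m = \sigma = 1$), and the scaled sample means do look approximately standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10_000, 5_000

# X_i ~ Exp(1): mean m = 1, variance sigma^2 = 1
samples = rng.exponential(scale=1.0, size=(reps, n))
Y_n = samples.mean(axis=1)            # one realization of Y_n per repetition
Z = np.sqrt(n) * (Y_n - 1.0)          # should be approximately N(0, 1)

print(Y_n.mean())                     # close to m = 1
print(Z.std())                        # close to sigma = 1
```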
Now, let $g: \mathbb{R}\rightarrow \mathbb{R}$ be a continuous function and suppose that $g$ is differentiable at $m$.
Let $\Delta:\mathbb{R}\rightarrow\mathbb{R}$ be the function $$\Delta(x) = \begin{cases} \frac{g(x)-g(m)}{x-m}, & \text{if } x \neq m, \\ g'(m), & \text{if } x = m. \end{cases}$$ Show that $\Delta(Y_n)$ converges in probability to $g'(m)$.
Conclude that $\sqrt{n}(g(Y_n)-g(m)) \Rightarrow \mathscr{N}(0, g'(m)^2\sigma^2)$.
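Numerically the conclusion does seem to hold. Here is a sanity check I ran with the same exponential setup as above ($m = \sigma = 1$) and my own choice $g(x) = x^2$, so $g'(m) = 2$ and the limit should be $\mathscr{N}(0, 4)$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10_000, 5_000

# X_i ~ Exp(1), so m = sigma = 1; g(x) = x^2 gives g'(m) = 2
samples = rng.exponential(scale=1.0, size=(reps, n))
Y_n = samples.mean(axis=1)
W = np.sqrt(n) * (Y_n**2 - 1.0)       # sqrt(n)(g(Y_n) - g(m))

print(W.std())                        # should be close to |g'(m)| * sigma = 2
```

So I believe the statement; what I am missing is the proof, in particular how the function $\Delta$ is supposed to be used.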
If somebody could at least help me get started, that would be great. I feel like it shouldn't be that difficult, but I'm confused and don't really know how to approach the problem at the moment. Thanks.