
I want to solve the following exercise, but I'm having trouble putting together all the information given.

Let $(Y_n)$ be a sequence of real random variables and let $m \in \mathbb{R}$ and $\sigma > 0$ be real numbers such that, as $n \rightarrow +\infty$, $$\sqrt{n}(Y_n - m) \Rightarrow \mathscr{N}(0,\sigma^2),$$ where $\Rightarrow$ denotes convergence in distribution.

Moreover, we have that $Y_n = (X_1 + \dots + X_n)/n$, where the $X_i$ are i.i.d. random variables with mean $m$ and variance $\sigma^2$. We know that $Y_n$ converges to $m$ in probability.

Now, let $g: \mathbb{R}\rightarrow \mathbb{R}$ be a continuous function and suppose that $g$ is differentiable at $m$.

  1. Let $\Delta:\mathbb{R}\rightarrow\mathbb{R}$ be the following function: $$\Delta: x \mapsto \begin{cases} \frac{g(x)-g(m)}{x-m}, & \text{if $x \neq m$}, \\ g'(m), & \text{if $x = m$.} \end{cases}$$ Show that $\Delta(Y_n)$ converges in probability to $g'(m)$.

  2. Conclude that $\sqrt{n}\,(g(Y_n)-g(m)) \Rightarrow \mathscr{N}(0,(g'(m))^2\sigma^2)$.
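
What I suspect is the intended link between the two parts is the algebraic identity below (valid when $Y_n \neq m$ directly from the definition of $\Delta$, and trivially when $Y_n = m$, since both sides are then $0$):

$$\sqrt{n}\,\bigl(g(Y_n)-g(m)\bigr) = \Delta(Y_n)\cdot \sqrt{n}\,(Y_n - m),$$

but I am not sure how to turn this into a full argument.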

If somebody could at least help me get started, that would be nice; I feel like it isn't that difficult, but I'm confused and don't really know how to approach the problem at the moment. Thanks.

    By the way, this is known as the "delta method" (see e.g. https://math.stackexchange.com/questions/2435118/proof-of-the-delta-method). – Minus One-Twelfth May 19 '20 at 13:15
  • @MinusOne-Twelfth Thank you, I didn't know it was called that. I'll take a look at the link and other sources. – Miresh May 19 '20 at 16:33
  • One route to this uses a form of Slutsky's Theorem: If $(A_n,B_n)$, $n=1,2,\ldots$ is a sequence of pairs of random variables all defined on a common probability space, and if $A_n$ converges in distribution to a r.v. $A$ while $B_n$ converges in probability to a constant $b$, then $A_n\cdot B_n$ converges in distribution to $A\cdot b$. – John Dawkins May 20 '20 at 17:05
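
If I understand the last comment correctly, the idea would presumably be to take

$$A_n = \sqrt{n}\,(Y_n - m), \qquad B_n = \Delta(Y_n),$$

so that $A_n \Rightarrow \mathscr{N}(0,\sigma^2)$ by assumption, $B_n \rightarrow g'(m)$ in probability by part 1, and $A_n \cdot B_n = \sqrt{n}\,\bigl(g(Y_n)-g(m)\bigr)$ by the identity above; the cited theorem would then give $A_n \cdot B_n \Rightarrow g'(m)\,Z$ with $Z \sim \mathscr{N}(0,\sigma^2)$, i.e. $\mathscr{N}(0,(g'(m))^2\sigma^2)$, which is exactly part 2.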
