
Let $X_i\sim \text{iid}\, \mathcal{N}(\mu,\sigma)$ for $i\in\{1,\dots,n\}$. I am interested in the random variable $Y=\max_i{X_i}$ when $n$ is large. From extreme value theory it seems that $Y$ should follow a Gumbel distribution, but I would like to know the parameters of this distribution as a function of $\mu$ and $\sigma$.

Also, would the result hold if $\mu$ and $\sigma$ differ across $i$, or if the $X_i$ are not independent?

user_lambda
  • Recall that the standard normal CDF $\Phi$ is such that $$1-\Phi(y)\sim\frac{e^{-y^2/2}}{y\sqrt{2\pi}}$$ when $y\to\infty$ and that, for every $y$, $$P(Y_n\leqslant\mu+y\sigma)=\Phi(y)^n$$ hence, if $(y_n)$ is chosen such that $$n\frac{e^{-y_n^2/2}}{y_n\sqrt{2\pi}}=t$$ for some given $t$, then $$P(Y_n\leqslant\mu+y_n\sigma)\to e^{-t}$$ Solving this for $y_n$ yields $$y_n=\sqrt{2\ln n}-\frac{\ln\ln n+2\ln t+\ln(4\pi)}{2\sqrt{2\ln n}}$$ Equivalently, defining $$Z_n=\sqrt{2\ln n}\,\frac{Y_n-\mu}{\sigma}-2\ln n+\frac12\ln\ln n+\frac12\ln(4\pi)$$ yields the (well known) convergence ... – Did Nov 29 '16 at 19:46
  • ... in distribution $$Z_n\to Z$$ where, for every real $z$, $$P(Z\leqslant z)=\exp(-e^{-z})$$ – Did Nov 29 '16 at 19:46
  • Thanks @Did that's useful. So I guess $$\text{Var}(Z_n)=\frac{2\ln n}{\sigma^2}\text{Var}(Y_n)$$ and since $\text{Var}(Z_n)=\pi^2/6$ we can get the variance of $Y_n$ this way? By the way, if you want to post your note as an answer I would be happy to accept it. – user_lambda Nov 29 '16 at 22:08
  • You mean, Var$(Z)=\pi^2/6$, not Var$(Z_n)=\pi^2/6$? – Did Nov 29 '16 at 22:13
  • Yes. So for $n$ large enough we would find $$ \text{Var}(Y_n)=\frac{\pi^{2}}{6}\frac{\sigma^{2}}{2\ln n}$$ – user_lambda Nov 29 '16 at 22:15
  • Not equal, at most an equivalent (even for large values of $n$). – Did Nov 29 '16 at 22:16
  • @Did can you elaborate more on moving from $\text{Var}(Z_n)$ to $\text{Var}(Y_n)$, like a complete proof for a beginner like me :) – xsari3x May 13 '20 at 05:24
  • This post on the stats site is relevant: https://stats.stackexchange.com/q/105745/119261. Also see the linked threads. – StubbornAtom May 31 '20 at 19:46
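
The limiting behaviour worked out in the comments above can be checked by simulation. The sketch below is not from the original thread; it assumes NumPy and SciPy are available, and the values of $\mu$, $\sigma$, $n$ and the number of replications are arbitrary illustrative choices. It standardizes simulated maxima as $Z_n$, compares them with a standard Gumbel, and compares the empirical variance of $Y_n$ with the asymptotic equivalent $\pi^2\sigma^2/(12\ln n)$.

```python
import numpy as np
from scipy import stats

# Simulate many maxima Y_n of n iid N(mu, sigma^2) draws and compare with the
# Gumbel limit discussed in the comments. All parameter values are arbitrary.
rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
n, reps = 10_000, 2_000

Y = mu + sigma * rng.standard_normal((reps, n)).max(axis=1)

# Standardize: Z_n = sqrt(2 ln n)*(Y_n - mu)/sigma - 2 ln n
#                    + (1/2) ln ln n + (1/2) ln(4 pi)
ln_n = np.log(n)
Z = (np.sqrt(2 * ln_n) * (Y - mu) / sigma
     - 2 * ln_n + 0.5 * np.log(ln_n) + 0.5 * np.log(4 * np.pi))

print("empirical Var(Y_n):                 ", Y.var())
print("asymptotic pi^2 sigma^2 / (12 ln n):", np.pi**2 * sigma**2 / (12 * ln_n))
print("KS distance of Z_n from std Gumbel: ",
      stats.kstest(Z, stats.gumbel_r.cdf).statistic)
```

Because the normalizing sequences drift only logarithmically in $n$, the agreement is rough even at $n=10^4$; this is the sense of Did's remark that the variance formula is "at most an equivalent", not an equality.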

1 Answer


Hint: take a look at the Fisher–Tippett–Gnedenko theorem

Momo
  • Yes, I know this result. My question is what are the $a_n$ and $b_n$. For instance, I would like to know the variance of the distribution of $Y$. – user_lambda Nov 29 '16 at 18:41
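
For completeness, the normalizing sequences asked about in this comment can be read off Did's derivation under the question (this is a restatement of that material, not a new result). Writing $Z_n=(Y_n-b_n)/a_n$ with $Z_n$ as defined there gives one admissible choice
$$a_n=\frac{\sigma}{\sqrt{2\ln n}},\qquad b_n=\mu+\sigma\left(\sqrt{2\ln n}-\frac{\ln\ln n+\ln(4\pi)}{2\sqrt{2\ln n}}\right),$$
so that $(Y_n-b_n)/a_n$ converges in distribution to a standard Gumbel. Since a standard Gumbel has mean $\gamma$ (the Euler–Mascheroni constant) and variance $\pi^2/6$, this yields the asymptotic approximations $\mathbb{E}[Y_n]\approx b_n+\gamma a_n$ and $\text{Var}(Y_n)\approx\frac{\pi^2}{6}a_n^2=\frac{\pi^2\sigma^2}{12\ln n}$, consistent with the variance discussed in the comments under the question.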