
I'm trying to verify if my analysis is correct or not.

Suppose we have $m$ random variables $x_i$, $i \in \{1, \dots, m\}$, each $x_i \sim \mathcal{N}(0,\sigma^2)$.

From the extreme value theorem one can state that $Y= \max\limits_{i \in \{1,\dots,m\}} [\mathcal{P}(x_i \leq \epsilon)] = [G(\epsilon)]$ as $m\to\infty$, provided the $x_i$ are i.i.d. and $G(\epsilon)$ is the standard Gumbel distribution.

My first question is: can we state that $$\text{Var}[Y]= \text{Var}\left[\max_{i \in \{1,\dots,m\}} [\mathcal{P}(x_i \leq \epsilon)] \right]= \text{Var}[ [G(\epsilon)]] = \frac{\pi^2}{6}?$$

My second question is: if we have $n$ such variables $Y_i$, all of them independent with zero mean, can we state $$\text{Var}\left[\prod_{i}^n Y_i\right] = \left(\frac{\pi^2}{6}\right)^n?$$

Thanks.

Update:
There's a final result for the second point at Distribution of the maximum of a large number of normally distributed random variables, but no complete step-by-step derivation.

xsari3x
  • 207
  • For the first question, can you check why the variance does not contain a $\sigma$ term? Intuitively it should, and my guess is that the variance should increase with $\sigma$. However, the second equation is wrong: for random variables $X$ and $Y$, $Var(XY)\neq Var(X)Var(Y)$ – Shiv Tavker May 09 '20 at 09:40
  • @ShivTavker for the 1st Q: the variance of a standard Gumbel distribution is $\pi^2/6$. For the second one, do you have a suggestion of how to tackle it? – xsari3x May 10 '20 at 05:04
  • I think I found a lead here https://math.stackexchange.com/questions/2035079/distribution-of-the-maximum-of-a-large-number-of-normally-distributed-random-var but I'm trying to understand it better – xsari3x May 10 '20 at 07:28

1 Answer


$\def\dto{\stackrel{\mathrm{d}}{→}}\def\peq{\mathrel{\phantom{=}}{}}$The answers to question 1 and 2 are both negative.

For question 1, since $Y_m = \max\limits_{i \in m} P(x_i \leqslant ε) \dto G(ε)$, we only have $\color{blue}{\lim\limits_{m → ∞}} D(Y_m) = D(G(ε))$ (writing $D(\cdot)$ for the variance), i.e. the equality holds in the sense of a limiting process, not for any finite $m$.
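A quick Monte Carlo sketch of this limiting statement (not part of the original answer; the sample sizes are arbitrary choices): the variance of the suitably normalized maximum of $m$ i.i.d. standard normals approaches $\pi^2/6 \approx 1.645$, but the convergence is notoriously slow, so at moderate $m$ the empirical variance still differs noticeably from the limit.

```python
import numpy as np

rng = np.random.default_rng(0)

m = 2000       # number of normals per maximum (arbitrary, for illustration)
n_rep = 4000   # number of simulated maxima

# Standard normalizing constants for the Gaussian domain of attraction:
# (M_m - b) / a converges in distribution to the standard Gumbel.
s = np.sqrt(2 * np.log(m))
b = s - (np.log(np.log(m)) + np.log(4 * np.pi)) / (2 * s)
a = 1 / s

maxima = rng.standard_normal((n_rep, m)).max(axis=1)
z = (maxima - b) / a

print("empirical Var of normalized max:", z.var())
print("limiting value pi^2/6:         ", np.pi**2 / 6)
```

The printed empirical variance is in the right ballpark but not equal to $\pi^2/6$, which is exactly the point: the Gumbel variance is attained only in the limit $m \to \infty$.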

For question 2, it's generally not true that $$ D\left( \prod_{m = 1}^n Y_m \right) = \prod_{m = 1}^n D(Y_m) $$ for i.i.d. $Y_1, \cdots, Y_n$, especially when $E(Y_1) ≠ 0$ and $D(Y_1) > 0$, by the following proposition:

Proposition: If $X$ and $Y$ are independent random variables on the same probability space, then$$ D(XY) - D(X) D(Y) = D(X) (E(Y))^2 + D(Y) (E(X))^2. $$

Proof: Since $X$ and $Y$ are independent, then\begin{gather*} D(XY) = E(X^2 Y^2) - (E(XY))^2 = E(X^2) E(Y^2) - (E(X) E(Y))^2,\\ D(X) D(Y) = \left( E(X^2) - (E(X))^2 \right) \left( E(Y^2) - (E(Y))^2 \right)\\ = E(X^2) E(Y^2) - E(X^2) (E(Y))^2 - E(Y^2) (E(X))^2 + (E(X))^2 (E(Y))^2, \end{gather*} and\begin{align*} &\peq D(XY) - D(X) D(Y) = E(X^2) (E(Y))^2 + E(Y^2) (E(X))^2 - 2 (E(X))^2 (E(Y))^2\\ &= \left( E(X^2) - (E(X))^2 \right) (E(Y))^2 + \left( E(Y^2) - (E(Y))^2 \right) (E(X))^2\\ &= D(X) (E(Y))^2 + D(Y) (E(X))^2. \tag*{$\square$} \end{align*}
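The identity in the proposition is easy to check numerically (a sketch; the two normal distributions below are arbitrary choices, not part of the original argument). With independent $X \sim \mathcal{N}(2,1)$ and $Y \sim \mathcal{N}(3,1)$, the proposition gives $D(XY) = D(X)D(Y) + D(X)(E(Y))^2 + D(Y)(E(X))^2 = 1 + 9 + 4 = 14$, noticeably larger than $D(X)D(Y) = 1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

# Independent X ~ N(2, 1) and Y ~ N(3, 1)
x = rng.normal(2.0, 1.0, n)
y = rng.normal(3.0, 1.0, n)

lhs = (x * y).var()  # empirical Var(XY)
# Right-hand side assembled from the proposition, using empirical moments
rhs = x.var() * y.var() + x.var() * y.mean()**2 + y.var() * x.mean()**2

print(lhs, rhs)  # both close to the exact value 14
```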

Now it can be proved by induction on $n$ with the above proposition that$$ D\left( \prod_{m = 1}^n Y_m \right) > \prod_{m = 1}^n D(Y_m) > 0. $$
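To illustrate the induction's conclusion concretely (again a sketch, not part of the original answer): standard Gumbel variables have mean $γ \approx 0.577 \neq 0$ and variance $\pi^2/6$, so for, say, $n = 3$ i.i.d. standard Gumbel factors the variance of the product strictly exceeds $(\pi^2/6)^3 \approx 4.45$; by independence the exact value is $\left(\pi^2/6 + γ^2\right)^3 - γ^6 \approx 7.70$.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 10**6

# Three i.i.d. standard Gumbel factors; Generator.gumbel(loc=0, scale=1)
# has mean euler_gamma ~ 0.577 and variance pi^2/6.
y1, y2, y3 = (rng.gumbel(size=n_samples) for _ in range(3))

prod_var = (y1 * y2 * y3).var()
naive = (np.pi**2 / 6) ** 3  # the (incorrect) product of variances, ~ 4.45
# Exact Var of the product: prod E[Y^2] - prod (E[Y])^2
exact = (np.pi**2 / 6 + np.euler_gamma**2) ** 3 - np.euler_gamma**6  # ~ 7.70

print(prod_var, exact, naive)
```

The empirical variance lands near the exact value 7.70, well above the naive $(\pi^2/6)^3$, in line with the strict inequality above.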

Ѕᴀᴀᴅ
  • 35,369
  • thanks for your answer, then can we use the results at https://math.stackexchange.com/questions/2035079/distribution-of-the-maximum-of-a-large-number-of-normally-distributed-random-var#comment4180323_2035079 for the first question? – xsari3x May 24 '20 at 05:43
  • I'm looking for a step-by-step derivation of the correct result for the first point, which can be found here https://math.stackexchange.com/questions/2035079/distribution-of-the-maximum-of-a-large-number-of-normally-distributed-random-var#comment4180323_2035079 thanks a lot for your effort! – xsari3x May 24 '20 at 05:53