
I'm interested in knowing the expected value of the norm of a vector drawn from a Gaussian distribution, as a function of the number of dimensions $N$ and of $\sigma$, i.e.:

$$E[\|x\|_2],\quad x\sim\mathcal{N}(0,\sigma I_N)$$

I tried searching for this but didn't find anything. Can someone help me out?


3 Answers


This amounts to integration in spherical coordinates $(r=\|x\|)$: $$ E(\|x\|) = \frac{1}{(\sqrt{2\pi} \sigma)^N } \frac{N\pi^{N/2}}{\Gamma\big(\frac{N}{2}+1\big)}\int_0^\infty e^{-r^2/(2\sigma^2)} r^{N-1} \,dr \tag1$$

This is not so bad: substitute $t=r^2/(2\sigma^2)$, so that $dt = (r/\sigma^2)\,dr$. The resulting integral gives Euler's gamma function $\Gamma$. I'll skip the boring cancellations and get to the result: $$ E(\|x\|) = \frac{\sqrt{2}\, \Gamma\big(\frac{N+1}{2}\big)}{\Gamma\big(\frac{N}{2}\big)}\,\sigma $$ The same result is stated in this paper, where you can also find the inequalities $$ \frac{N}{\sqrt{N+1}}\le \sigma^{-1}E(\|x\|)\le \sqrt{N} $$
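
Here is a minimal Monte Carlo sketch to confirm this closed form numerically, assuming NumPy and SciPy are available; the helper name `expected_norm` below is purely illustrative, with $\sigma$ taken as the per-coordinate standard deviation:

```python
# Minimal check of E||x|| = sqrt(2) * Gamma((N+1)/2) / Gamma(N/2) * sigma,
# where sigma is the per-coordinate standard deviation.
import numpy as np
from scipy.special import gammaln

def expected_norm(N, sigma):
    # Evaluate the closed form in log space so large N stays numerically stable.
    return np.sqrt(2.0) * np.exp(gammaln((N + 1) / 2) - gammaln(N / 2)) * sigma

rng = np.random.default_rng(0)
N, sigma, n_samples = 5, 2.0, 200_000
x = rng.normal(0.0, sigma, size=(n_samples, N))   # each row ~ N(0, sigma^2 I_N)
print(np.linalg.norm(x, axis=1).mean())           # Monte Carlo estimate
print(expected_norm(N, sigma))                    # closed form; the two should agree closely
```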

  • Thanks. Times $\sigma$? I guess so, since for 1 dimension I got this – jmacedo Jun 09 '14 at 17:19
  • @joxnas Right, I internally normalized $\sigma=1$ and forgot to include it in the answer. –  Jun 09 '14 at 17:40
  • @Yes: I am confused regarding (1). In the $N=2$ case the Jacobian is $r^{N-1}=r$. Also, in the two-dimensional case the corresponding integral would be $E[\|X\|]=\ldots\int\ldots r^2\,dr$, but you have $\ldots\int\ldots r\,dr$. Please help me understand. – zoli May 07 '15 at 08:43

The above answer contains mistakes, as has been noted in the comments. I recently needed to derive this, so here is the general result: $$\mathbb{E}\left[\|x\|_2^n\right] = 2^\frac{n-2}{2}\sigma^n N \frac{\Gamma\left(\frac{N+n}{2}\right)}{\Gamma\left(\frac{N+2}{2}\right)}$$
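
As a quick sanity check of this general formula: setting $n=2$ makes the two gamma factors cancel, giving $$\mathbb{E}\left[\|x\|_2^2\right] = 2^{0}\,\sigma^2\, N\, \frac{\Gamma\left(\frac{N+2}{2}\right)}{\Gamma\left(\frac{N+2}{2}\right)} = N\sigma^2,$$ the trace of the covariance matrix $\sigma^2 I_N$, as expected.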

Alex Botev
  • I don't have the necessary knowledge to assert which of these solutions is the correct one. Perhaps someone with some reputation can confirm that this one is correct and the above solution is wrong? – jmacedo Jan 06 '20 at 17:21
  • @jmacedo Both mine and Liyuan's are correct; in fact they are equivalent, as I mentioned in a comment below his answer. Feel free to choose either of the two answers. – Alex Botev Jan 06 '20 at 21:06
  • what about when x ~ Uniform[-u, u]? – Song Jul 04 '24 at 08:33

When $\sigma=1$, this is the first moment of the chi distribution. Furthermore, \begin{equation} \mathbb{E}_{x\sim\mathcal{N}(0,\sigma^2I)}\{ \|x\|_2^k \} = \mathbb{E}_{x\sim\mathcal{N}(0,I)}\{ \|\sigma x\|_2^k \}= \sigma^k \mathbb{E}_{x\sim\mathcal{N}(0,I)}\{ \| x\|_2^k \}, \end{equation} where $\mathbb{E}_{x\sim\mathcal{N}(0,I)}\{ \| x\|_2^k \}$ is the $k$th moment of the chi distribution with $N$ degrees of freedom, which equals \begin{equation} \mathbb{E}_{x\sim\mathcal{N}(0,I)}\{ \|x\|_2^k \} = 2^{k/2} \frac{\Gamma((N+k)/2)}{\Gamma(N/2)}. \end{equation}

Two comments: 1. I think $x\sim\mathcal{N}(0,\sigma I_N)$ should be replaced by $x\sim\mathcal{N}(0,\sigma^2I_N)$ in the original post. 2. This result is different from Botev's; let me know if I made any mistakes.
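
As Botev notes in the comment above, the two results are in fact equivalent: since $\Gamma\left(\frac{N+2}{2}\right) = \frac{N}{2}\,\Gamma\left(\frac{N}{2}\right)$, $$2^{\frac{n-2}{2}}\,\sigma^n\, N\, \frac{\Gamma\left(\frac{N+n}{2}\right)}{\Gamma\left(\frac{N+2}{2}\right)} = 2^{\frac{n-2}{2}}\,\sigma^n\, N\, \frac{\Gamma\left(\frac{N+n}{2}\right)}{\frac{N}{2}\,\Gamma\left(\frac{N}{2}\right)} = 2^{\frac{n}{2}}\,\sigma^n\, \frac{\Gamma\left(\frac{N+n}{2}\right)}{\Gamma\left(\frac{N}{2}\right)},$$ which is exactly the chi-moment expression above with $k=n$.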