
My professor has given us a list of questions that will not appear on my test, and this is one of them. I still feel it is extremely important to understand.

How can I prove the following:

If $X$ and $Y$ are independent standard normal random variables, then the linear combination $aX+bY$, for any $a,b>0$, is also normally distributed.


If I am not mistaken, I believe I can find the distribution of the linear combination:

If we let $Z=aX+bY$, knowing $X,Y \sim N(0,1)$, we can find the expectation and variance as $$\mathbb{E}(Z)=\mathbb{E}(aX+bY)=a\mathbb{E}(X)+b\mathbb{E}(Y)=0$$ $$Var(Z)=Var(aX+bY)=a^2Var(X)+b^2Var(Y)=a^2+b^2$$ Thus, $Z \sim N(0,a^2+b^2)$.
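As a sanity check (certainly not a proof), a quick simulation with arbitrarily chosen positive coefficients agrees with $Z \sim N(0,a^2+b^2)$; here is a minimal sketch (the values $a=1.3$, $b=0.7$ are just examples):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 1.3, 0.7  # arbitrary positive coefficients
n = 200_000

# Simulate Z = aX + bY with X, Y independent standard normal
z = a * rng.standard_normal(n) + b * rng.standard_normal(n)

# Kolmogorov-Smirnov test against N(0, a^2 + b^2);
# a large p-value is consistent with the claimed distribution.
print(stats.kstest(z, "norm", args=(0, np.sqrt(a**2 + b**2))))
```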

I just don't think this proves that the linear combination is normally distributed. I tried looking in some reference books that my professor reserved at the library, but they all just state the fact, and I can't figure out how to prove it.

MathIsFun

  • Disclaimer: You will most likely waste time better invested in studying for your test if you keep reading now.

    With that out of the way, a really nice geometric argument using the rotation invariance of the joint density function of two independent random variables is found here. (B. Eisenberg and R. Sullivan, "Why Is the Sum of Independent Normal Random Variables Normal?", Mathematics Magazine, Vol. 81, No. 5, December 2008) – binkyhorse Oct 07 '14 at 18:05
  • @binkyhorse That is the most elegant proof of this I've seen! – Christoph Oct 21 '15 at 10:44
  • Since this question is recently active, I'm adding the Wayback Machine'd version of the link provided by @binkyhorse above: http://web.archive.org/web/20210506193634/http://www.maa.org/sites/default/files/Eisenberg12-0825740.pdf – Brian Tung Sep 12 '24 at 05:24

2 Answers

The most direct way is to look at the characteristic function.

The characteristic function of an r.v. characterizes its probability distribution completely. If you can show that the characteristic function of the linear combination is that of a normal r.v., you are done.
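In outline (a sketch, using the standard fact that a standard normal r.v. has characteristic function $\varphi(t)=e^{-t^2/2}$): by independence, $$\varphi_{aX+bY}(t)=\mathbb{E}\left[e^{it(aX+bY)}\right]=\varphi_X(at)\,\varphi_Y(bt)=e^{-a^2t^2/2}\,e^{-b^2t^2/2}=e^{-(a^2+b^2)t^2/2},$$ which is exactly the characteristic function of an $N(0,a^2+b^2)$ random variable.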

Ben Grossmann
parsiad
  • The questions my professor posted were "highly considered for the test" and we have never used characteristic functions in my course. So I don't think he would have expected us to use them to prove it on the test. Is there a different way to prove it without the use of them? – MathIsFun Oct 07 '14 at 03:33
  • You can look at the CDF of the linear combination directly, but it's a little more involved. This is on the same page in the Wikipedia article: https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables#Geometric_proof – parsiad Oct 07 '14 at 03:36
  • @MathIsFun it is also possible to go through the same proof using moment generating functions rather than characteristic functions. Perhaps these are more familiar. – Ben Grossmann Oct 07 '14 at 03:51
  • Haven't seen those either. I don't exactly know what is expected, since we aren't getting any solutions. I think the geometric argument linked above makes some sense, but I haven't done a change of coordinates in an integral in about 4 years, so maybe I'm too rusty in that regard to fully grasp the proof. – MathIsFun Oct 07 '14 at 04:09
  • I would just take the time to read up on the characteristic function. It is an indispensable tool in probability. – parsiad Oct 07 '14 at 04:10

Here's an elementary derivation using the Jacobian (change of variables) approach. Let $X$ and $Y$ be independent standard normal variables, with joint density $$f_{X,Y}(x,y) = \frac1{2\pi} \exp\left\{-\frac12x^2\right\}\exp\left\{-\frac12 y^2\right\}.$$ We seek the density of $U:=aX+bY$.

Create variables $(U, V)$ by applying the transformation $(x,y)\to(u,v)$ defined by $$ \begin{aligned} u &= ax + by\\ v &= y. \end{aligned} $$ Invert this mapping to obtain $$ \begin{aligned} x &= \frac{u-bv}a\\ y&=v.\\ \end{aligned} $$ Next, compute the Jacobian determinant $$J(u,v) := \operatorname{det}\frac{\partial(x,y)}{\partial(u,v)} =\operatorname{det}\left( \begin{array}{cc} \frac1a&\frac{-b}a\\ 0&1\\ \end{array} \right)=\frac1a $$ so the joint density of $(U, V)$ is $$ \begin{aligned} f_{U,V}(u,v) &= |J(u,v)|\, f_{X,Y}(x(u,v),y(u,v))\\ &=\frac1af_{X,Y}\left(\frac{u-bv}a, v\right)\\ &=\frac1{2a\pi}\exp\left\{-\frac12\left(\frac {u-bv}a\right)^2\right\} \exp\left\{-\frac 12 v^2\right\}\\ &=\frac 1{2a\pi}\exp\left\{-\frac1{2a^2}\left[u^2 - 2buv + (a^2+b^2)v^2\right]\right\}. \end{aligned} $$ Finally compute the marginal density of $U:=aX+bY$ by integrating out $v$, treating $u$ as a constant: $$f_U(u) = \int_{-\infty}^\infty f_{U,V}(u,v)\,dv.$$ Apply the formula $$\int_{-\infty}^\infty \exp\left\{-\left(\frac{Ax^2+Bx+C}D\right)\right\}dx = \sqrt{\frac{D\pi} A}\exp\left\{\frac{B^2-4AC}{4AD}\right\} $$ with $$A:=a^2+b^2,\qquad B:=-2bu,\qquad C:=u^2,\qquad D:= 2a^2$$ and simplify. When the dust settles you have the density of a normal variable with mean $0$ and variance $a^2+b^2$: $$f_U(u)=\frac1{\sqrt{2\pi(a^2+b^2)}}\exp\left\{-\frac{u^2}{2(a^2+b^2)}\right\}. $$
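If you want to double-check the algebra, here is a minimal symbolic sketch using sympy (a verification aid, not part of the proof; it assumes sympy is installed and may take a few seconds to run) confirming that integrating out $v$ recovers the claimed density:

```python
import sympy as sp

a, b = sp.symbols("a b", positive=True)
u, v = sp.symbols("u v", real=True)

# Joint density of (U, V) obtained from the change of variables above
f_uv = sp.exp(-((u - b*v) / a)**2 / 2) * sp.exp(-v**2 / 2) / (2 * sp.pi * a)

# Marginal density of U: integrate out the auxiliary variable v
f_u = sp.simplify(sp.integrate(f_uv, (v, -sp.oo, sp.oo)))

# Density of N(0, a^2 + b^2) for comparison
target = sp.exp(-u**2 / (2 * (a**2 + b**2))) / sp.sqrt(2 * sp.pi * (a**2 + b**2))

print(sp.simplify(f_u - target))  # expected output: 0
```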


Note 1: To make the Jacobian approach work, it's necessary to introduce the auxiliary variable $v$, which we later integrate out, so that the transformation $(x,y)\to(u,v)$ is invertible.


Note 2: In the multivariate case, you can use induction instead of resorting to the Jacobian. The base case $n=2$ is established above. For the inductive step: if $X_1,\ldots,X_{n+1}$ are iid standard normal, write $$\sum_{i=1}^{n+1} a_iX_i = c W + a_{n+1} X_{n+1}$$ where $c:=\sqrt{\sum_{i=1}^n a_i^2}$ and $W:=\frac1c\sum_{i=1}^n a_iX_i$; note that $W$ and $X_{n+1}$ are independent standard normal.
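To spell out why $W$ is standard normal: by the inductive hypothesis, $\sum_{i=1}^n a_iX_i$ is normal with mean $0$ and variance $$\operatorname{Var}\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n a_i^2=c^2,$$ so dividing by $c$ gives $W\sim N(0,1)$, and the base case applies to $cW+a_{n+1}X_{n+1}$.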

grand_chat