
Let $a\in\mathbb{R}^N$ and $X_1,\dots,X_N$ be independent random variables with zero mean and unit variance. I'm trying to prove that:

$$\|a\|_{2}\le\Big\|\sum_{i=1}^{N}a_iX_i\Big\|_{L^p},$$ with $p\in[2,\infty)$.

I've tried playing with the definitions but haven't gotten anywhere.

Any hints?

kam
  • I don't think you need subgaussianity at all; since the $X_i$ are unit variance and independent, the LHS is just the $L_2$ norm of the sum you have on the RHS. Then the result follows from Jensen's (see the sketch after these comments). – E-A Feb 07 '20 at 00:15
  • But surely applying the $L_2$ norm to the RHS and using Jensen's shows that the RHS is greater than or equal to 0? I still cannot see how this works. – kam Feb 07 '20 at 00:24
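
To spell out the hint (a minimal sketch, nothing beyond what E-A states): write $Y=\displaystyle\sum_{i=1}^N a_iX_i$. Since $t\mapsto t^{p/2}$ is convex for $p\ge 2$, Jensen's inequality applied to the random variable $|Y|^2$ gives

$$\big(E|Y|^2\big)^{p/2}\le E\big[\big(|Y|^2\big)^{p/2}\big]=E|Y|^p,\qquad\text{i.e.}\qquad \|Y\|_{L^2}\le\|Y\|_{L^p}.$$

So Jensen's is applied to $|Y|^2$ through the map $t\mapsto t^{p/2}$, not to the sum itself; once one checks $\|Y\|_{L^2}=\|a\|_2$, the claimed inequality follows.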

1 Answer


Let $Y=\displaystyle \sum_{i=1}^N a_i X_i$. By the hypotheses (the $X_i$'s are independent, with mean zero and unit variance) we easily find that $\|Y\|_{L^2}:=(E|Y|^2)^{1/2}=\Big(\displaystyle\sum_{i=1}^N a_i^2\Big)^{1/2}=\|a\|_2$. Since $p\geq 2$, the claim follows from Lyapunov's inequality. See Lyapunov's inequality in Probability.

sakas
  • I don't understand why the equality $(E|Y|^2)^{1/2}=\Big(\displaystyle\sum_{i=1}^N a_i^2\Big)^{1/2}$ holds. – kam Feb 07 '20 at 00:52
  • As I say in the answer, the $X_i$'s are independent with zero mean and unit variance, so if you expand the sum and take expectations, the covariances disappear and the variances are just one. Do it for $N=2$ and you will see it immediately (written out below these comments). – sakas Feb 07 '20 at 00:58
  • Just to clarify, your answer assumes $X_i\in{}\mathbb{R}$? – kam Feb 07 '20 at 01:02
  • Yes, $X_i(\omega)\in \mathbb{R}$. – sakas Feb 07 '20 at 01:12
  • As E-A said above, it has nothing to do with sub-Gaussianity. This is an intermediate inequality used in proving Khintchine's inequality; you need the sub-Gaussian property to prove Khintchine's inequality. – sakas Feb 07 '20 at 01:19
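
Written out for $N=2$, as the comment suggests (a minimal sketch; the general case is identical, with every $i\neq j$ cross term vanishing the same way):

$$E\big[(a_1X_1+a_2X_2)^2\big]=a_1^2\,E[X_1^2]+2a_1a_2\,E[X_1X_2]+a_2^2\,E[X_2^2]=a_1^2+a_2^2,$$

since $E[X_1X_2]=E[X_1]\,E[X_2]=0$ by independence and zero mean, while $E[X_i^2]=\operatorname{Var}(X_i)=1$.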