The following content is from Roman Vershynin's High-Dimensional Probability (2018)

Proposition 2.5.2 (Sub-gaussian properties)
Let $X$ be a random variable. Then the following properties are equivalent; the parameters $K_i > 0$ appearing in these properties differ from each other by at most an absolute constant factor:

$(i)$. The tails of $X$ satisfy

$$ \mathbb{P}\{|X| \geq t\} \leq 2 \exp\left(-\frac{t^2}{K_1^2}\right) \quad \text{for all } t \geq 0. $$

$(ii)$. The moments of $X$ satisfy

$$ \|X\|_{L^p} = \left(\mathbb{E}|X|^p\right)^{1/p} \leq K_2 \sqrt{p} \quad \text{for all } p \geq 1. $$

$(iii)$. The MGF of $X^2$ satisfies

$$ \mathbb{E} \exp\left(\lambda^2 X^2\right) \leq \exp\left(K_3^2 \lambda^2\right) \quad \text{for all } \lambda \text{ such that } |\lambda| \leq \frac{1}{K_3}. $$

$(iv)$. The MGF of $X^2$ is bounded at some point, namely

$$ \mathbb{E} \exp\left(\frac{X^2}{K_4^2}\right) \leq 2. $$

Moreover, if $\mathbb{E} X = 0$ then properties (i)–(iv) are also equivalent to the following property.

$(v)$. The MGF of $X$ satisfies

$$ \mathbb{E} \exp\left(\lambda X\right) \leq \exp\left(K_5^2 \lambda^2\right) \quad \text{for all } \lambda \in \mathbb{R}. $$
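
As a quick sanity check of property (v) in the Gaussian case: if $X \sim N(0, \sigma^2)$, the Gaussian MGF gives

$$ \mathbb{E} \exp\left(\lambda X\right) = \exp\left(\frac{\sigma^2 \lambda^2}{2}\right) \quad \text{for all } \lambda \in \mathbb{R}, $$

so (v) holds with $K_5 = \sigma / \sqrt{2}$.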

Definition 2.5.6 (Sub-gaussian random variables)
A random variable $X$ that satisfies one of the equivalent properties $(i)-(iv)$ in Proposition 2.5.2 is called a sub-gaussian random variable. The sub-gaussian norm of $X$, denoted $\|X\|_{\psi_2}$, is defined to be the smallest $K_4$ in property (iv). In other words, we define

$$ \|X\|_{\psi_2} = \inf \left\{ t > 0 : \mathbb{E} \exp\left(\frac{X^2}{t^2}\right) \leq 2 \right\}. $$
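
For concreteness, this definition can be evaluated exactly for a standard normal: if $X \sim N(0,1)$, then for $t^2 > 2$

$$ \mathbb{E} \exp\left(\frac{X^2}{t^2}\right) = \frac{1}{\sqrt{1 - 2/t^2}}, $$

and setting the right-hand side equal to $2$ gives $t^2 = 8/3$, so $\|X\|_{\psi_2} = \sqrt{8/3}$.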

Definition 3.4.1 (Sub-gaussian random vectors)
A random vector $X$ in $\mathbb{R}^n$ is called sub-gaussian if the one-dimensional marginals $\langle X, x \rangle$ are sub-gaussian random variables for all $x \in \mathbb{R}^n$. The sub-gaussian norm of $X$ is defined as

$$ \|X\|_{\psi_2} = \sup_{x \in S^{n-1}} \| \langle X, x \rangle \|_{\psi_2}. $$
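
A standard example: if $g \sim N(0, I_n)$ is a standard Gaussian vector, then $\langle g, x \rangle \sim N(0,1)$ for every $x \in S^{n-1}$, so by the computation above

$$ \|g\|_{\psi_2} = \sup_{x \in S^{n-1}} \| \langle g, x \rangle \|_{\psi_2} = \sqrt{8/3}, $$

a finite value that does not depend on the dimension $n$.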


I know that sub-gaussian random variables have finite sub-Gaussian norms. I want to prove that sub-gaussian random vectors also have finite sub-Gaussian norms.

According to Definition 3.4.1, a random vector $X$ being sub-Gaussian means that for all $x \in \mathbb{R}^n$, the random variable $\langle X, x \rangle$ is a sub-Gaussian random variable. However, this does not directly imply that the sub-Gaussian norms $\| \langle X, x \rangle \|_{\psi_2}$ have a uniform upper bound over $x \in S^{n-1}$. To prove that $\| X \|_{\psi_2}$ is finite, we need to find a constant $K$ such that $\| \langle X, x \rangle \|_{\psi_2} \leq K$ for all $x \in S^{n-1}$. I've made some attempts, but none of them worked!

Kevin

1 Answer


Note that, whenever $X$ is a sub-Gaussian vector:

  1. The map $f$ defined on $S^{n-1}$ by $f:x\mapsto \|\langle X,x\rangle\|_{\psi_2}$ is continuous: it is the composition of the linear map $x\mapsto \langle X,x\rangle$, which takes values in the space of sub-Gaussian random variables by Definition 3.4.1 and is continuous because every linear map on a finite-dimensional normed space is bounded, with the norm $\|\cdot\|_{\psi_2}$, which is $1$-Lipschitz by the reverse triangle inequality (this is made quantitative below).
  2. The unit sphere $S^{n-1}$ is compact.

Therefore, because a continuous function on a compact set is bounded and attains its maximum: $$\|X\|_{\psi_2} := \sup_{x \in S^{n-1}} \| \langle X, x \rangle \|_{\psi_2}=\sup_{x \in S^{n-1}} f(x) <\infty, $$ so the sub-Gaussian norm of a random vector is always finite, i.e. well-defined.
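
For completeness, the continuity in step 1 can be made quantitative. Writing $X_i = \langle X, e_i \rangle$ for the coordinates of $X$ (each sub-Gaussian by Definition 3.4.1, hence with $\|X_i\|_{\psi_2} < \infty$), for any $x, y \in S^{n-1}$ we have

$$ \big| f(x) - f(y) \big| \leq \|\langle X, x - y\rangle\|_{\psi_2} = \Big\| \sum_{i=1}^n (x_i - y_i) X_i \Big\|_{\psi_2} \leq \sum_{i=1}^n |x_i - y_i| \, \|X_i\|_{\psi_2} \leq \|x - y\|_2 \Big( \sum_{i=1}^n \|X_i\|_{\psi_2}^2 \Big)^{1/2}, $$

by the reverse triangle inequality, the triangle inequality together with homogeneity of $\|\cdot\|_{\psi_2}$, and Cauchy–Schwarz. So $f$ is Lipschitz, hence continuous. The same expansion applied directly to $\langle X, x \rangle$ with $x \in S^{n-1}$ even gives the explicit bound $\|X\|_{\psi_2} \leq \big( \sum_{i=1}^n \|X_i\|_{\psi_2}^2 \big)^{1/2}$.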