Questions tagged [concentration-of-measure]

Use this tag for questions about the principle that a random variable that depends in a Lipschitz way on many independent variables (but not too much on any of them) is essentially constant.

Concentration of measure (about a median) is a principle applied in measure theory, probability, and combinatorics, with consequences in other areas such as Banach space theory. Informally, the principle is that a random variable that depends in a Lipschitz way on many independent variables (but not too much on any of them) is essentially constant.
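A quick way to see the principle numerically (an illustrative sketch with arbitrarily chosen sample sizes, not tied to any particular question below): the Euclidean norm is a 1-Lipschitz function of the coordinates, and for a standard Gaussian vector its fluctuations stay of constant order while its typical value grows like $\sqrt{n}$.

```python
import numpy as np

# Illustrative sketch: x -> ||x||_2 is 1-Lipschitz in (x_1, ..., x_n), so for
# X ~ N(0, I_n) it concentrates tightly around its mean (which is ~ sqrt(n)).
rng = np.random.default_rng(0)
for n in (10, 100, 1000):
    X = rng.standard_normal((10_000, n))       # 10,000 independent copies of X
    norms = np.linalg.norm(X, axis=1)
    print(f"n={n:5d}  mean={norms.mean():8.3f}  std={norms.std():.3f}")
# The mean grows like sqrt(n) while the standard deviation stays O(1):
# relative to its typical value, ||X||_2 is "essentially constant".
```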


451 questions
23
votes
1 answer

Concentration of measure vs large deviation

When reading probability publications I am often unsure why a given inequality is called a 'concentration inequality' rather than a 'large deviation inequality'. To me these two areas (concentration of measure and large deviation theory) just describe the…
21
votes
2 answers

Tail bounds for maximum of sub-Gaussian random variables

I have a question similar to this one, but am considering sub-Gaussian random variables instead of Gaussian. Let $X_1,\ldots,X_n$ be centered $1$-sub-Gaussian random variables (i.e. $\mathbb{E} e^{\lambda X_i} \le e^{\lambda^2 /2}$), not necessarily…
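For context, the classical bound for the maximum (without absolute values) of $n$ variables satisfying the stated moment-generating-function condition is $\mathbb{E}\max_i X_i \le \sqrt{2\log n}$. A small Monte Carlo check, using standard normals as a concrete instance of $1$-sub-Gaussian variables (sample sizes chosen arbitrarily):

```python
import numpy as np

# Standard normals satisfy E exp(lam X) = exp(lam^2 / 2), so they are 1-sub-Gaussian
# in the sense quoted above; the classical bound E[max_i X_i] <= sqrt(2 log n)
# can then be compared against a Monte Carlo estimate.
rng = np.random.default_rng(1)
for n in (10, 100, 1000):
    X = rng.standard_normal((20_000, n))
    emp = X.max(axis=1).mean()                  # Monte Carlo estimate of E[max_i X_i]
    print(f"n={n:5d}  E[max] ~ {emp:.3f}   sqrt(2 log n) = {np.sqrt(2*np.log(n)):.3f}")
```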
14
votes
2 answers

Lipschitz function of independent sub-Gaussian random variables

If $X\sim \mathcal{N}(0,I)$ is a Gaussian random vector, then Lipschitz functions of $X$ are sub-Gaussian with variance parameter 1 by the Tsirelson-Ibragimov-Sudakov inequality (e.g. Theorem 8 here). Suppose $X = (X_1,X_2,\ldots, X_n)$ consisted of…
14
votes
2 answers

Concentration of measure bounds for multivariate Gaussian distributions (fixed)

Let $\gamma_n$ denote the standard Gaussian measure on $\mathbb{R}^n$. It is known (see for example Cor 2.3 here: http://www.math.lsa.umich.edu/~barvinok/total710.pdf) that $$\gamma_n\{x\in\mathbb{R}^n: \|x\|^2 \ge \frac{n}{1-\epsilon}\}\ge…
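The probability in question is a chi-square tail, so it can be evaluated exactly and compared against simulation; a small sanity-check sketch with arbitrarily chosen $n$ and $\epsilon$ (the constant in the quoted bound itself is left as in the excerpt):

```python
import numpy as np
from scipy.stats import chi2

# Under the standard Gaussian measure on R^n, ||x||^2 is chi-square with n degrees
# of freedom, so gamma_n{ ||x||^2 >= n/(1-eps) } = P(chi2_n >= n/(1-eps)).
rng = np.random.default_rng(2)
n, eps = 100, 0.1
threshold = n / (1 - eps)
exact = chi2.sf(threshold, df=n)                         # exact tail probability
sims = (rng.standard_normal((100_000, n)) ** 2).sum(axis=1)
print("exact:", exact, "  simulated:", (sims >= threshold).mean())
```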
13
votes
2 answers

Can we control the distance between the empirical and theoretical mean on the whole trajectory any better than using Hoeffding and a union bound?

Suppose $X,X_1,X_2,X_3\dots$ is a $\mathbb{P}$-i.i.d. family of $[-1,1]$-valued random variables with $\mathbb{E}[X] = 0$. Hoeffding's inequality implies that \begin{equation*} \forall T \in \mathbb{N}, \forall \varepsilon > 0, \qquad…
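A numerical illustration of how much room the Hoeffding-plus-union-bound approach leaves (a sketch with Rademacher variables; the time window $[T_0, T]$, the level $\varepsilon$, and the sample counts below are arbitrary choices made so that the comparison is non-trivial, since at $t=1$ the event is automatic for small $\varepsilon$):

```python
import numpy as np

# For [-1,1]-valued, mean-zero X_i, Hoeffding gives P(|S_t / t| >= eps) <= 2 exp(-t eps^2 / 2);
# a union bound over t = T0..T just sums these terms.  The simulation (Rademacher X_i)
# shows how conservative the resulting trajectory-wide bound is.
rng = np.random.default_rng(3)
T0, T, eps, reps = 200, 1000, 0.25, 20_000
X = rng.choice([-1.0, 1.0], size=(reps, T))
running_mean = np.cumsum(X, axis=1) / np.arange(1, T + 1)
window = running_mean[:, T0 - 1:]                        # times t = T0, ..., T
empirical = (np.abs(window) >= eps).any(axis=1).mean()
union = sum(2 * np.exp(-t * eps ** 2 / 2) for t in range(T0, T + 1))
print("empirical:", empirical, "  Hoeffding + union bound:", min(1.0, union))
```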
12
votes
1 answer

Variance of the Euclidean norm under finite moment assumptions

Let $X = (X_1, X_2, \ldots, X_n)$ be a random vector in $\mathbb{R}^n$ with independent coordinates $X_i$ that satisfy $E[X_i^2]=1$ and $E[X_i^4] \leq K^4$. Then show that $$\operatorname{Var}(\| X\|_2) \leq CK^4$$ where $C$ is an absolute constant and $\| \…
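A quick numerical sanity check of the statement (an illustrative sketch; the coordinate distribution below, uniform on $[-\sqrt{3},\sqrt{3}]$, is just one convenient choice satisfying $E[X_i^2]=1$ and $E[X_i^4]=9/5$):

```python
import numpy as np

# Coordinates uniform on [-sqrt(3), sqrt(3)] have E[X_i^2] = 1 and E[X_i^4] = 9/5,
# so K^4 = 9/5 works in the statement; Var(||X||_2) should then stay O(1) in n.
rng = np.random.default_rng(4)
a = np.sqrt(3.0)
for n in (10, 100, 1000):
    X = rng.uniform(-a, a, size=(20_000, n))
    print(f"n={n:5d}  Var(||X||_2) ~ {np.linalg.norm(X, axis=1).var():.4f}")
```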
11
votes
1 answer

Expectation of the maximum absolute value of Gaussian random variables

Let $\{X_i\}_{i=1}^n$ be an i.i.d. sequence of $\mathcal{N}(0, \sigma^2)$ variables, and consider the random variable $$Z_n : = \max_{i=1,\ldots,n}|X_i|.$$ I need to prove the bound $$ E[Z_n] \leq \sqrt{2\sigma^{2}\log{n}} + \frac{4…
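A closely related and easy-to-check bound treats $|X_i|$ as the maximum over the $2n$ Gaussians $\{X_i, -X_i\}$, giving $E[Z_n] \le \sqrt{2\sigma^2\log(2n)}$; the question's bound, with its additive correction term, is of the same flavour. A Monte Carlo comparison (sample sizes and $\sigma$ chosen arbitrarily):

```python
import numpy as np

# Z_n = max_i |X_i| is the maximum of the 2n variables {X_i, -X_i}, each N(0, sigma^2),
# so the usual sub-Gaussian maximum bound gives E[Z_n] <= sigma * sqrt(2 log(2n)).
rng = np.random.default_rng(5)
sigma = 2.0
for n in (10, 100, 1000):
    X = sigma * rng.standard_normal((20_000, n))
    emp = np.abs(X).max(axis=1).mean()
    print(f"n={n:5d}  E[Z_n] ~ {emp:.3f}   sigma*sqrt(2 log(2n)) = {sigma*np.sqrt(2*np.log(2*n)):.3f}")
```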
10
votes
1 answer

More approximately orthogonal vectors than the dimension of the space

It is impossible to find $n+1$ mutually orthogonal unit vectors in $\mathbb{R}^n$. However, a simple geometric argument shows that the central angle between any two legs of a simplex goes as $\theta = \mathrm{arccos}(-1/n)$. This approaches $90$…
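Concentration on the sphere is what makes "approximately orthogonal" so much cheaper than exactly orthogonal: independent random unit vectors in $\mathbb{R}^n$ have pairwise inner products of order $\sqrt{\log m / n}$, so far more than $n$ of them can be pairwise nearly orthogonal. A small illustration (dimensions chosen arbitrarily):

```python
import numpy as np

# m >> n independent random unit vectors in R^n: their pairwise inner products
# concentrate around 0 at scale ~ 1/sqrt(n), so all pairs are nearly orthogonal.
rng = np.random.default_rng(6)
n, m = 1000, 3000
V = rng.standard_normal((m, n))
V /= np.linalg.norm(V, axis=1, keepdims=True)
G = V @ V.T                                   # Gram matrix of pairwise inner products
off_diag = np.abs(G[~np.eye(m, dtype=bool)])
print("max |<v_i, v_j>|, i != j:", off_diag.max())   # small, of order sqrt(log m / n)
```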
10
votes
1 answer

Mean concentration implies median concentration

Exercise 2.14 in Wainwright, "High-Dimensional Statistics", states that if $X$ is such that $$P[|X-\mathbb{E}[X]|\geq t] \leq c_1 e^{-c_2t^2},$$ for $c_1, c_2$ positive constants, $t\geq 0$, then for any median $m_X$ it holds that $$P[|X-m_X|\geq t]…
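One standard route, sketched informally here (not necessarily the argument the exercise intends): since the bound must hold at $t=0$, $c_1 \ge 1$, so $t_0 := \sqrt{\log(2c_1)/c_2}$ is well defined and $P[|X-\mathbb{E}[X]|\geq t_0] \leq 1/2$, which forces every median to satisfy $|m_X - \mathbb{E}[X]| \le t_0$. For $t \ge 2t_0$,
$$P[|X-m_X|\geq t] \le P[|X-\mathbb{E}[X]|\geq t - t_0] \le P[|X-\mathbb{E}[X]|\geq t/2] \le c_1 e^{-c_2 t^2/4},$$
while for $t < 2t_0$ one has $2c_1 e^{-(c_2/4)t^2} > 2c_1 e^{-c_2 t_0^2} = 1$, so the bound holds trivially; the constants $c_3 = 2c_1$ and $c_4 = c_2/4$ therefore work.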
10
votes
2 answers

Sharper Lower Bounds for Binomial/Chernoff Tails

The Wikipedia page for the Binomial Distribution states the following lower bound, which I suppose can also be viewed as a general Chernoff-type lower bound. $$\Pr(X \le k) \geq \frac{1}{(n+1)^2}…
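The quoted bound can be checked numerically against the exact binomial CDF; a small sketch in the relative-entropy form, keeping the $(n+1)^{-2}$ prefactor exactly as it appears in the excerpt (whether that prefactor can be sharpened is part of the question):

```python
import numpy as np
from scipy.stats import binom

def kl(a, p):
    """Binary relative entropy D(a || p)."""
    return a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))

# Compare the exact lower tail P(X <= k), X ~ Bin(n, p), with the lower bound
# exp(-n * D(k/n || p)) / (n + 1)^2 quoted in the excerpt.
n, p = 200, 0.5
for k in (60, 80, 90):
    exact = binom.cdf(k, n, p)
    bound = np.exp(-n * kl(k / n, p)) / (n + 1) ** 2
    print(f"k={k:3d}  exact = {exact:.3e}   lower bound = {bound:.3e}")
```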
9
votes
3 answers

Dudley's Inequality can be Loose (Vershynin 8.1.12)

Let $e_1,...,e_n$ denote the canonical basis vectors in $\mathbb{R}^n$. Consider the set $$T = \left \{ \frac{e_k}{\sqrt{1 + \log k}}, k =1,...,n\right \}$$ Show that $$\int_{0}^\infty \sqrt{\log \mathcal{N}(T,d,\epsilon)} d\epsilon…
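For a feel of what the exercise is getting at, the Gaussian complexity of this $T$, $\mathbb{E}\sup_{t\in T}\langle g,t\rangle = \mathbb{E}\max_{k\le n} g_k/\sqrt{1+\log k}$, can be estimated by Monte Carlo and stays bounded as $n$ grows, in contrast with the behaviour of the entropy integral the exercise asks about. A rough sketch (sample counts chosen arbitrarily):

```python
import numpy as np

# Monte Carlo estimate of E sup_{t in T} <g, t> = E max_{k <= n} g_k / sqrt(1 + log k)
# for T = { e_k / sqrt(1 + log k) }.  The estimate stays bounded as n grows.
rng = np.random.default_rng(7)
for n in (10, 1000, 100_000):
    scale = 1.0 / np.sqrt(1.0 + np.log(np.arange(1, n + 1)))
    vals = [np.max(rng.standard_normal(n) * scale) for _ in range(300)]
    print(f"n={n:7d}  E sup ~ {np.mean(vals):.3f}")
```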
8
votes
0 answers

Concentration of measure on spheres with respect to a unitary of trace approximately zero

This question arose out of my attempt to understand how a unitary of trace approximately zero acts on the unit sphere of an $n$-dimensional Hilbert space. First, some context: We note that, by concentration of measure for spheres, we have the…
8
votes
1 answer

Median concentration implies mean concentration

I want to prove that if $X$ is such that $$P[|X-m_X|\geq t] \leq c_1 e^{-c_2t^2},$$ for $c_1, c_2$ positive constants, $t\geq 0$, then it holds that $$P[|X-E[X]|\geq t] \leq c_3 e^{-c_4t^2},$$ with $c_3=1+2c_1$ and $c_4=c_2/4$. There is a proof of…
8
votes
2 answers

Relation between Gaussian width and its squared version

I'm currently reading through Roman Vershynin's High Dimensional Probability and working through one of the exercises (7.6.1). Consider a set $T \subseteq \mathbf{R}^n$ and define its Gaussian width $w(T)$, as $$ w(T) := \mathbb{E} \sup_{x \in T}…
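The Gaussian width itself is easy to estimate by Monte Carlo straight from the definition quoted above; a small sketch for a finite set $T$ of unit vectors (the set, the dimensions, and the particular "squared version" $\sqrt{\mathbb{E}\sup_{x\in T}\langle g,x\rangle^2}$ shown alongside are illustrative choices, not necessarily the exact quantities in Exercise 7.6.1):

```python
import numpy as np

# Monte Carlo estimate of w(T) = E sup_{x in T} <g, x> for a finite set T of unit
# vectors, straight from the definition; sqrt(E sup <g, x>^2) is printed alongside
# as one natural "squared" variant for comparison.
rng = np.random.default_rng(8)
n, m = 50, 200
T = rng.standard_normal((m, n))
T /= np.linalg.norm(T, axis=1, keepdims=True)     # m random unit vectors as the set T
g = rng.standard_normal((20_000, n))
sup = (g @ T.T).max(axis=1)                       # sup_{x in T} <g, x>, per sample of g
print("w(T) ~", sup.mean(), "   sqrt(E sup^2) ~", np.sqrt((sup ** 2).mean()))
```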
8
votes
0 answers

Getting a bound for Gibbs distribution mean

Suppose $F$ is a strictly convex and increasing function, $U$ a random variable with support $[0,1]$ and density $$ f_U(u)= \frac{e^{-\frac{1}{T}F(u)}}{\int_{0}^{1} e^{-\frac{1}{T} F(x)} dx}.$$ Do we have a known bound for $\Pr\{U>y\}$ for any $y>0$…
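Whatever the sharp analytic bound turns out to be, $\Pr\{U>y\}$ can always be evaluated numerically straight from the stated density; a sketch with the illustrative choice $F(u)=u^2$ (strictly convex and increasing on $[0,1]$), showing how the tail shrinks as $T$ decreases:

```python
import numpy as np

# Numerical evaluation of Pr{U > y} directly from the stated Gibbs density,
# with the illustrative choice F(u) = u^2 (strictly convex, increasing on [0, 1]).
def gibbs_tail(y, T, F=lambda u: u ** 2, grid=100_001):
    u = np.linspace(0.0, 1.0, grid)
    w = np.exp(-F(u) / T)                 # unnormalized density e^{-F(u)/T}
    return w[u > y].sum() / w.sum()       # Pr{U > y} (the grid spacing cancels)

for T in (1.0, 0.1, 0.01):
    print(f"T = {T:5.2f}   Pr{{U > 0.5}} ~ {gibbs_tail(0.5, T):.4f}")
```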