
Let $x_1, x_2, \dots, x_n \in \mathbb{R}^n$ and let $\theta_{ij}$ be the angle between vectors $x_i$ and $x_j$. Solve

$$\max_{x_1, \dots,\, x_n} \min_{1\le i<j\le n} \theta_{ij}$$

That is, find the largest possible value of the minimal angle between every pair of vectors.


My attempt

Without loss of generality, we can assume $\|x_i\|_2 = 1$. Since $0\le \theta_{ij}\le \pi$ and cosine is decreasing on this interval, the problem is equivalent to $$ \begin{split} \min_{x_1, \ldots,\, x_n} \max_{1\le i<j\le n} \quad & x_i^Tx_j \\ \text{s.t.} \quad & x_i^Tx_i = 1 ,\\ \end{split} $$ which in turn can be written as $$ \begin{split} \min_{x_1, \ldots,\, x_n,\, t} \quad & t \\ \text{s.t.} \quad & t \ge x_i^T x_j, \quad \forall i \ne j, \\ & x_i^T x_i = 1. \end{split} $$ I know that the optimal configuration is $x_1$, $\ldots\,$, $x_n$ evenly distributed in an $(n-1)$-dimensional subspace, for example,

  • when $n=2$, $x_1 = -x_2$ (evenly distributed in a 1-dimensional subspace), $\theta^* = \pi$;

  • and when $n=3$, $x_1$, $x_2$, and $x_3$ are the vertices of an equilateral triangle (evenly distributed in a 2-dimensional subspace), $\theta^* = 2\pi/3$; a quick numerical check of both cases is sketched below.
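
A minimal numpy sketch to verify these two cases (the triangle coordinates are one arbitrary embedding into a 2-dimensional subspace of $\mathbb R^3$):

```python
import numpy as np

def min_pairwise_angle(X):
    """Smallest angle (in radians) between any pair of columns of X."""
    X = X / np.linalg.norm(X, axis=0)    # normalize columns
    G = X.T @ X                          # Gram matrix of pairwise cosines
    mask = ~np.eye(X.shape[1], dtype=bool)
    return np.arccos(np.clip(G[mask].max(), -1.0, 1.0))

# n = 2: antipodal pair, minimal angle pi
print(min_pairwise_angle(np.array([[1.0, -1.0], [0.0, 0.0]])))   # ~3.1416

# n = 3: vertices of an equilateral triangle, minimal angle 2*pi/3
phi = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
print(min_pairwise_angle(np.vstack([np.cos(phi), np.sin(phi), np.zeros(3)])))  # ~2.0944
```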

Alexander Zhang

3 Answers


You are asking about the well-known spherical codes problem:

Place $n$ points on a (unit) sphere in $d$ dimensions so as to maximize the minimal distance (or equivalently the minimal angle) between them.

We can easily obtain a simple upper bound for the minimal angle as follows. Put $S=\sum_i x_i$. Then

$$0\le (S,S)=\sum_i (x_i,x_i)+2\sum_{i<j} (x_i,x_j)=n+2\sum_{i<j}\cos \theta_{ij}.$$

Since $n+2{n \choose 2}\frac {-1}{n-1}=0$, the average of the $\binom n2$ values $\cos\theta_{ij}$ is at least $\frac {-1}{n-1}$, so there exist indices $i<j$ such that $\cos \theta_{ij}\ge \frac {-1}{n-1}$, that is, $\min_{i<j}\theta_{ij}\le\arccos\frac {-1}{n-1}$.

On the other hand, when $x_1,\dots,x_n$ are evenly distributed, that is, when all $\theta_{ij}$ are equal to $\theta^*$ and $S=0$, we have $\cos \theta^*=\frac {-1}{n-1}$.

We obtained the exact bound so easily because vectors $x_1,\dots, x_n$ can be evenly distributed in the space $\Bbb R^d$ provided $d\ge n-1$ (which is the case here, since $d=n$). For larger $n$, i.e. when $n>d+1$, the problem is much more complicated.
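
As an illustration, here is a small numpy sketch (with the arbitrary choice $n=d=5$ and random unit vectors) checking that some pair always satisfies $\cos\theta_{ij}\ge\frac{-1}{n-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 5                                  # n unit vectors in R^d
for _ in range(1000):
    X = rng.standard_normal((d, n))
    X /= np.linalg.norm(X, axis=0)           # unit columns
    G = X.T @ X                              # pairwise cosines
    mask = ~np.eye(n, dtype=bool)
    assert G[mask].max() >= -1.0 / (n - 1) - 1e-12
print("cos(theta_ij) >= -1/(n-1) held for some pair in every trial")
```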

Alex Ravsky

Not an answer, but a general guess:

Generalizing from your examples, I'd guess that the optimum occurs when the vectors point to the $n$ vertices of a regular $(n-1)$-simplex. (A 1-simplex is just a segment; a 2-simplex is an equilateral triangle, etc.) That is to say

$$ x_i = e_i - \frac{1}{n}u $$ where $u = e_1 + e_2 + \ldots + e_n$. (Note that these $x_i$ are not unit vectors, but I'll bet you can scale them...)
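
Here is a quick numpy sketch of this construction (with the scaling applied), checking that every pairwise cosine comes out to $-1/(n-1)$; $n=5$ is an arbitrary choice:

```python
import numpy as np

n = 5                                   # arbitrary dimension
u = np.ones(n)
X = np.eye(n) - u[:, None] / n          # columns x_i = e_i - u/n
X /= np.linalg.norm(X, axis=0)          # rescale to unit vectors
G = X.T @ X                             # pairwise cosines
mask = ~np.eye(n, dtype=bool)
print(np.allclose(G[mask], -1.0 / (n - 1)))   # True: all cosines are -1/(n-1)
```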

Certainly that's a "local optimum" in the sense that if you perturb any of the $x_i$ (while holding the others constant), the target value rises.

John Hughes

Let the columns of $\mathrm X \in \mathbb R^{n \times n}$ have unit $2$-norm. Hence, the Gram matrix $\rm Y := X^\top X$ is symmetric, positive semidefinite, and has ones on its main diagonal. The entries off the main diagonal are

$$y_{ij} = \mathrm x_i^\top \mathrm x_j = \| \mathrm x_i \|_2 \, \| \mathrm x_j \|_2 \, \cos(\theta_{ij}) = \cos(\theta_{ij})$$

Maximizing the minimal $\theta_{ij}$ is equivalent to minimizing the maximal entry of matrix $\rm Y$ off the main diagonal. Hence, solving the following semidefinite program (SDP) in $\mathrm Y \in \mathbb R^{n \times n}$ and $t \in \mathbb R$

$$\begin{array}{ll} \text{minimize} & t\\ \text{subject to} & y_{ii} = 1 \quad \forall i \in [n]\\ & y_{ij} \leq t \quad \,\forall i \neq j\\ & \mathrm Y \succeq \mathrm O_n\end{array}$$

we obtain a symmetric, positive semidefinite matrix $\rm \bar Y$ with ones on its main diagonal. Computing a Cholesky decomposition of matrix $\rm \bar Y$ (or, since $\rm \bar Y$ may be singular, a pivoted Cholesky or eigenvalue decomposition), we then obtain a matrix $\rm \bar X$ such that $\rm \bar X^\top \bar X = \bar Y$. The maximal entry of $\rm \bar Y$ off the main diagonal should be the cosine of the desired angle.
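
A minimal sketch of this SDP in Python with cvxpy (the eigendecomposition-based factorization and the choice $n=4$ are my own assumptions; any factorization $\rm \bar Y = \bar X^\top \bar X$ works):

```python
import numpy as np
import cvxpy as cp

n = 4
Y = cp.Variable((n, n), symmetric=True)
t = cp.Variable()

constraints = [Y >> 0, cp.diag(Y) == 1]
constraints += [Y[i, j] <= t for i in range(n) for j in range(n) if i != j]
cp.Problem(cp.Minimize(t), constraints).solve()

# Factor Y = X^T X via an eigendecomposition; a plain Cholesky
# factorization may fail numerically because Y can be singular.
w, V = np.linalg.eigh(Y.value)
X = np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

print(t.value)                           # ~ -1/(n-1) = -1/3
print(np.degrees(np.arccos(t.value)))    # ~ 109.47 degrees
```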