2

This is a problem from Prasolov's Geometry:

Given an orthonormal basis $e_1,\dots,e_n$ and a set of vectors $a_1,\dots,a_n$ such that the angle between the vectors $e_i$ and $a_i$ equals $\alpha_i$ for each $i$, prove that if $$\cos \alpha_1+\cdots+\cos \alpha_n>\sqrt{n(n-1)} ,$$ then the vectors $a_1,\dots,a_n$ are linearly independent.

The problem seems more like a linear algebra problem, but I failed to solve it. Any hint, please?

Juggler
  • 1,423
  • A first step I would do is, since the length of the vectors does not matter, to assume without loss of generality that all the vectors are unit vectors. Then all the cosines reduce to scalar products. – celtschk Oct 06 '19 at 09:04
  • @celtschk That had occurred to me, but I didn't make progress. – Juggler Oct 06 '19 at 10:48

3 Answers

1

For unit vectors $\lVert a_{i}\rVert=1$, the quantity $p(a_{1},\ldots,a_{n})=\sum_{i=1}^{n}\cos\alpha_{i}$ serves as a measure of proximity to the given orthonormal basis $\{e_{1},\ldots,e_{n}\}$, with $p\leq n$ and equality exactly when $a_{i}=e_{i}$ for all $i$.

What is $\max_{\{a_{1},\ldots,a_{n}\}\subset H^{n-1}}p$ when the vectors are confined to a hyperplane $H^{n-1}$? (Spoiler: it's $\sqrt{n(n-1)}$.) Let's first lay the hyperplane $H^{n-1}$ "flat" on $\operatorname{span}(e_{1},\ldots,e_{n-1})$, with $e_{n}$ as its unit normal. We're free to choose the best alignment possible, $a_{i}=e_{i}$ for $1\leq i\leq n-1$. The choice of $a_{n}$ doesn't matter in this case, but for symmetry reasons let's take $a_{n}=\frac{1}{\sqrt{n-1}}\sum_{i=1}^{n-1}e_{i}$. We have achieved a respectable $p_{0}=n-1$.

However, $p$ can be made slightly larger if we tilt the hyperplane, "pulling" the unlucky $a_{n}$ (chosen as above) towards $e_{n}$ by an angle $\beta$. For example, in 2D, we'd get a better (in fact, the maximum) $p_{\beta}=\sqrt{2}>1=p_{0}$ by aligning $a_{1}$ and $a_{2}$ along the $\beta=45^{\circ}$ bisector between $e_{1}$ and $e_{2}$.
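The 2D claim is easy to confirm numerically; here is a quick Python sketch (the explicit coordinates are my own choice of realization):

```python
import math

# Tilt in 2D: put both a_1 and a_2 on the 45-degree bisector of e_1 and e_2.
beta = math.radians(45)
a = (math.cos(beta), math.sin(beta))        # common unit vector a_1 = a_2

# p_beta = <e_1, a_1> + <e_2, a_2> = cos(45) + cos(45)
p_beta = a[0] + a[1]
assert abs(p_beta - math.sqrt(2)) < 1e-12   # the claimed maximum sqrt(2)
assert p_beta > 1                           # beats the "flat" p_0 = 1
```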

Let's do this. It's simple but cumbersome trigonometry and vector arithmetic, followed by differentiation to find the optimal tilt angle. I hope someone can redo this in a matrix-algebraic way, or even more elegantly.

Again, for symmetry reasons all $\alpha_{i}=\alpha$, $1\leq i\leq n-1$, so we have $$ p_{\beta}=(n-1)\cos\alpha+\sin\beta $$

I will skip some details now (I can add them back on request, possibly with a diagram showing the triangles). The best $\cos\alpha$ comes from orthogonally projecting the now-detached $e_{1},\ldots$ back onto the tilted hyperplane to find the best $a_{1},\ldots$: $\langle e_{1},N_{\beta}\rangle=\cos(\tfrac{\pi}{2}+\alpha)=-\sin\alpha$. We therefore need the unit normal to the tilted hyperplane, $$ N_{\beta}=\frac{e_{n}-\tan\beta\cdot a_{n}}{\sqrt{1+\tan^{2}\beta}} $$

After some manipulations we'll get

$$ \cos\alpha=\sqrt{1-\frac{\sin^{2}\beta}{n-1}} $$

Now let $x=\sin\beta$ and we have

$$ p(x)=(n-1)\sqrt{1-\frac{x^{2}}{n-1}}+x $$

Differentiating with respect to $x$ and solving $p^{\prime}(x)=0$, we find the best tilt angle $x=\sin\beta=\sqrt{\frac{n-1}{n}}=\cos\alpha$ (so all $n$ angles turn out to be equal, $\alpha_1=\cdots=\alpha_n$, as they should be), and the largest value attainable by this construction, $$ p=\sqrt{n(n-1)}. $$
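As a sanity check on the calculus, here is a small Python sketch: a grid search stands in for the derivative, and the helper `p` and grid size are my own choices.

```python
import math

def p(x, n):
    # p(x) = (n-1)*sqrt(1 - x^2/(n-1)) + x, where x = sin(beta)
    return (n - 1) * math.sqrt(1 - x * x / (n - 1)) + x

for n in range(2, 8):
    # crude grid search for the maximum of p over 0 <= x <= 1
    best = max(p(k / 10**4, n) for k in range(10**4 + 1))
    x_star = math.sqrt((n - 1) / n)          # the claimed optimal sin(beta)
    assert abs(p(x_star, n) - math.sqrt(n * (n - 1))) < 1e-9
    assert abs(best - math.sqrt(n * (n - 1))) < 1e-6
```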

Any value of $p$ larger than that would have to come from a set $\{a_{1},\ldots,a_{n}\}$ not confined to a hyperplane, which is therefore a linearly independent set.

rych
  • 4,445
0

I can prove it under the stronger assumption that $\sum \cos\alpha_i > n -\frac{1}{2}\ \ (=\sqrt{n(n-1)+\frac{1}{4}}\ )$. The idea is to look at the matrix $A$ whose columns are the unit vectors $a_i$ and notice that $A=I+\Delta$, where the Frobenius norm of $\Delta$ satisfies $\lVert\Delta\rVert_F^2=\sum_i\lVert a_i-e_i\rVert^2=2\bigl(n-\sum_i\cos\alpha_i\bigr)<1$ (so the spectral radius of $\Delta$ is $<1$), and that guarantees invertibility.
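A rough 2D numerical illustration of this argument (the tilt angles below are arbitrary examples, not part of the problem):

```python
import math

# n = 2: unit columns a_i tilted away from e_i by small angles t_i.
t1, t2 = 0.3, 0.2                      # arbitrary example tilts (radians)
a1 = (math.cos(t1), math.sin(t1))
a2 = (math.sin(t2), math.cos(t2))

# stronger hypothesis: sum of cosines > n - 1/2
assert a1[0] + a2[1] > 2 - 0.5

# ||Delta||_F^2 = sum ||a_i - e_i||^2 = 2*(n - sum cos alpha_i) < 1
frob2 = (a1[0] - 1)**2 + a1[1]**2 + a2[0]**2 + (a2[1] - 1)**2
assert abs(frob2 - 2 * (2 - (a1[0] + a2[1]))) < 1e-12
assert frob2 < 1

# spectral radius <= Frobenius norm < 1, so A = I + Delta is invertible
det_A = a1[0] * a2[1] - a2[0] * a1[1]  # equals cos(t1 + t2) here
assert abs(det_A) > 0.5
```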

orangeskid
  • 56,630
  • Thanks for your help. But I fail to get your hint, and I'm not familiar with the Gram determinant. Could you please offer more help? – Juggler Oct 06 '19 at 10:47
  • @Juggler: The hint seems not to work. I wonder now whether we can take inspiration from answers to a similar problem https://math.stackexchange.com/questions/2017821/show-linear-independence-of-a-set-of-vectors-close-to-an-orthonormal-basis?rq=1 – orangeskid Oct 06 '19 at 10:59
  • I've done that problem before but the two don't seem to connect. – Juggler Oct 06 '19 at 11:12
  • I wonder how you applied the conditional inequality in the new proof. – Juggler Oct 06 '19 at 23:55
  • @Juggler: It is the idea of the second post here: https://math.stackexchange.com/questions/883130/prove-that-v-1-dots-v-n-is-a-basis-of-v but we are not using that the $a_i$ are also unit vectors, so perhaps we are missing something – orangeskid Oct 07 '19 at 00:29
0

I can get somewhat close. Let's make the hypothesis that $$ \sum_{i=1}^n \cos(\alpha_i) \ge \sqrt{n(n-1)} + \frac{1}{n}. $$

Observe that $$ n - \sqrt{n(n-1)} = \frac{1}{1 + \sqrt{\frac{n-1}{n}}}. $$ Unfortunately, this quantity is $>\frac{1}{2}$. (It does converge to $\frac{1}{2}$, but this won't help in what follows.) On the other hand, you can check that $$ \frac{1}{1 + \sqrt{\frac{n-1}{n}}} - \frac{1}{n} < \frac{1}{2}. $$ We'll just take this as read.
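These numerical claims are easy to verify for a range of $n$; a quick check in Python:

```python
import math

for n in range(1, 5001):
    gap = n - math.sqrt(n * (n - 1))
    closed = 1 / (1 + math.sqrt((n - 1) / n))
    assert abs(gap - closed) < 1e-9   # the algebraic identity
    assert gap > 0.5                  # the quantity exceeds 1/2 ...
    assert closed - 1 / n < 0.5       # ... but drops below 1/2 after -1/n
```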

We may assume that the $a_i$ are unit vectors, in which case $$ \cos \alpha_i = \left\langle e_i,a_i\right\rangle. $$ The stronger hypothesis then implies that $$ \sum_{i=1}^n \|e_i - a_i\|^2 = 2\left(n - \sum_{i=1}^n \left\langle e_i,a_i\right\rangle\right) \le 2\left(n - \sqrt{n(n-1)} - \frac{1}{n}\right) < 1 $$

Now, assume that the $a_i$ are dependent. This means we can find some unit vector $v$ that is orthogonal to all the $a_i$; in particular, we have that $$ \left\langle v,e_i - a_i\right\rangle = \left\langle v,e_i\right\rangle - \left\langle v, a_i\right\rangle = \left\langle v,e_i\right\rangle $$

Putting things together, we have $$ 1 = \|v\|^2 = \sum_{i=1}^n \langle v,e_i\rangle^2 = \sum_{i=1}^n \left\langle v,e_i - a_i\right\rangle^2 \le \sum_{i=1}^n \|v\|^2\|e_i - a_i\|^2 = \sum_{i=1}^n \|e_i - a_i\|^2 < 1 $$ by Cauchy–Schwarz and the work we did above. The resulting contradiction tells us that the $a_i$ are, in fact, independent.
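An end-to-end numerical illustration of this argument in 3D (a Python sketch; the perturbation angle `t` is an arbitrary example, not part of the proof):

```python
import math

# n = 3: unit vectors a_i obtained by perturbing e_i by a common angle t
t = 0.25                               # arbitrary example perturbation
a1 = (math.cos(t), math.sin(t), 0.0)
a2 = (0.0, math.cos(t), math.sin(t))
a3 = (math.sin(t), 0.0, math.cos(t))

# the strengthened hypothesis holds: sum cos(alpha_i) >= sqrt(6) + 1/3
assert 3 * math.cos(t) >= math.sqrt(3 * 2) + 1 / 3

# sum ||e_i - a_i||^2 = 3 * 2 * (1 - cos t) < 1, as in the proof
assert 3 * 2 * (1 - math.cos(t)) < 1

# and indeed the a_i are independent: a nonzero 3x3 determinant
det = (a1[0] * (a2[1] * a3[2] - a2[2] * a3[1])
     - a1[1] * (a2[0] * a3[2] - a2[2] * a3[0])
     + a1[2] * (a2[0] * a3[1] - a2[1] * a3[0]))
assert abs(det) > 0.9                  # equals cos(t)^3 + sin(t)^3 here
```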

Louis
  • 1,119
  • Nice one. Your proof works for a stronger condition, but I don't think your verification for large enough $n$ with the weaker condition is right. The $\varepsilon$ you introduce here is determined by $n,\alpha_1,\dots,\alpha_n$, so we cannot deal with it as a constant. – Juggler Oct 08 '19 at 23:44
  • Here's a more explicit version of the hypothesis needed for this method. I am not sure whether the idea will go further. – Louis Oct 09 '19 at 09:53