If $x,y \in \mathbb{R}^{3}$ are two linearly independent vectors in three-dimensional space, and $\langle x,y\rangle$ denotes the dot product of the two vectors, then prove that \begin{align*} \begin{vmatrix} \langle x,x \rangle & \langle x,y\rangle \\ \langle y,x \rangle & \langle y,y\rangle \end{vmatrix} >0. \end{align*}
-
Please see the Cauchy-Schwarz inequality: https://en.wikipedia.org/wiki/Cauchy%E2%80%93Schwarz_inequality#Statement_of_the_inequality and https://math.stackexchange.com/questions/23522/proofs-of-the-cauchy-schwarz-inequality – Rahul Madhavan Apr 12 '21 at 20:15
-
If you let $A=[x \ \ y]$ then perhaps consider $A^TA$? – copper.hat Apr 12 '21 at 20:31
-
See https://en.wikipedia.org/wiki/Gramian_matrix – lhf Apr 12 '21 at 20:48
1 Answer
For two vectors this is equivalent to the Cauchy-Schwarz inequality. For $k$ vectors in general, note that replacing them by linear combinations of themselves according to an invertible $k\times k$ matrix $P$ (in coordinates, $(x\ y\ \ldots\ z):=(x\ y\ \ldots\ z)P$) changes the Gram matrix $M$ (which is what you have before taking the determinant) into $P^\top MP$, and therefore multiplies its determinant by $\det(P)^2$. In particular, if $P$ is upper uni-triangular (each vector gets modified only by a combination of the vectors coming before it), then $\det(P)=1$ and the determinant does not change. This allows performing Gram-Schmidt on the vectors, after which the Gram matrix has become diagonal. Since the diagonal entries are the squared norms of the (nonzero) vectors obtained from Gram-Schmidt, the determinant is a product of positive numbers, hence positive.
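For the two-vector case in the question, a single Gram-Schmidt step makes this concrete (the names $P$ and $y'$ here are just notation to spell out the step, with $y' := y - \frac{\langle x,y\rangle}{\langle x,x\rangle}\,x$ the usual projection-removed vector):
\begin{align*}
P=\begin{pmatrix}1 & -\frac{\langle x,y\rangle}{\langle x,x\rangle}\\[2pt] 0 & 1\end{pmatrix},\qquad
P^\top M P=\begin{pmatrix}\langle x,x\rangle & 0\\ 0 & \langle y',y'\rangle\end{pmatrix},
\end{align*}
so the Gram determinant equals $\langle x,x\rangle\,\langle y',y'\rangle>0$, where $y'\neq 0$ because $x,y$ are linearly independent.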
Since we are only interested in the sign of the Gram determinant here, we didn't really need to worry about the factor $\det(P)^2$ (as long as $P$ is invertible), and may replace the vectors by any basis whatsoever of the space they span. So we could have done without Gram-Schmidt and just chosen an orthonormal basis of the span right away.
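To spell this out (it is also what the comment suggesting $A=[x \ \ y]$ and $A^TA$ is getting at, assuming coordinates are taken in an orthonormal basis): if $A$ is the $k\times k$ matrix whose columns are the coordinates of the $k$ vectors with respect to an orthonormal basis of their span, then
\begin{align*}
M = A^\top A, \qquad \det M = \det(A^\top A) = \det(A)^2 > 0,
\end{align*}
where $\det(A)\neq 0$ because the vectors are linearly independent, so $A$ is invertible.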
The second argument incidentally suggests an easy way to prove the Cauchy-Schwarz inequality.
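Sketching how: for linearly independent $x,y$ the positivity of the Gram determinant reads
\begin{align*}
\langle x,x\rangle\langle y,y\rangle-\langle x,y\rangle^2>0,
\end{align*}
i.e. $\langle x,y\rangle^2 < \langle x,x\rangle\langle y,y\rangle$, while for linearly dependent $x,y$ the Gram matrix is singular and one gets equality. Together these give the Cauchy-Schwarz inequality along with its equality case.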