
I want to diagonalize an arbitrary quadratic form $$\sum_{1\leq i\leq j \leq n} a_{i,j}x_ix_j$$ over an algebraically closed field of characteristic not equal to $2$. If $x_n^2$ appears in the form with nonzero coefficient, we can "complete the square" to get the square of a linear form plus a quadratic form in the remaining $n-1$ variables. If every $x_i^2$ appears with nonzero coefficient, doing this successively diagonalizes the form, and we end up with a total of $n$ squares of linear forms. However, if $x_i^2$ does not appear, it seems we have to write $$x_i \sum_{k<i}a_{k,i}x_{k} = \bigg(\frac{1}{2}\Big(x_i +\sum_{k<i}a_{k,i}x_{k}\Big)\bigg)^2 + \bigg(\frac{\sqrt{-1}}{2}\Big(x_i -\sum_{k<i}a_{k,i}x_{k}\Big)\bigg)^2$$ (where $\sqrt{-1}$ denotes a square root of $-1$ in the field), which is fine except that it requires two linear forms instead of one. This seems to throw the count off, so that we end up with more than $n$ squares of linear forms at the end. How do you fix this?
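For what it's worth, the two-squares identity above is easy to check symbolically. A quick sanity check (my addition, using sympy over $\mathbb{C}$, where a square root of $-1$ is the usual $i$):

```python
import sympy as sp

# Check the identity used above for a cross term with no square:
#   x*y = ((x + y)/2)**2 + (i*(x - y)/2)**2
# valid over any field of characteristic != 2 containing a square
# root of -1 (e.g. an algebraically closed field); here we work over C.
x, y = sp.symbols('x y')
lhs = x * y
rhs = ((x + y) / 2)**2 + (sp.I * (x - y) / 2)**2
assert sp.expand(rhs - lhs) == 0
```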

Context: This is exercise 5.4.J in Ravi Vakil's Foundations of Algebraic Geometry.

bart
  • Isn't this problem equivalent to showing that any symmetric matrix is orthogonally diagonalizable? Every proof of the latter fact involves eigenvalues, so I don't see how we're going to show the desired result just by completing the square. The good news is that if you get stuck showing this result by completing the square, you can just imitate a proof of the orthogonal diagonalizability of symmetric matrices. – Charles Hudgins Apr 25 '19 at 09:02
  • @CharlesHudgins I don't think "orthogonally diagonalizable" has a meaning here since I'm working over an arbitrary algebraically closed field (char $\neq 2$). – bart Apr 25 '19 at 09:08
  • Notice that when the field is $\mathbb{C}$, this is a weaker claim than the one you allude to (that Hermitian matrices are unitarily diagonalizable). – bart Apr 25 '19 at 09:15
  • That's a good point, which sent me googling to see if the more general statement is also true. Look at this stack exchange post https://math.stackexchange.com/questions/1142250/diagonalization-of-a-symmetric-matrix-over-algebraically-closed-field . – Charles Hudgins Apr 25 '19 at 09:29
  • @CharlesHudgins Huh, this is strange: https://mathoverflow.net/questions/23629/non-diagonalizable-complex-symmetric-matrix (am I missing something, or are these contradictory?) – bart Apr 25 '19 at 09:35
  • I don't think there's a contradiction, but I'm not sure how the two results are consistent either. Sorry if my comments have only confused things. It's been a while since I've looked at this stuff. I'm reading around right now. I'll let you know if I figure it out. – Charles Hudgins Apr 25 '19 at 10:39
  • @CharlesHudgins not at all -- i appreciate your help – bart Apr 25 '19 at 10:46
  • It looks like the "contradiction" arises because we're being imprecise about what we mean by "diagonalization." For example, in the answer you linked, there is no invertible matrix $A$ such that $A^{-1} \begin{pmatrix} 1 & i \\ i & -1\end{pmatrix} A$ is diagonal. But there is an invertible matrix $A = \begin{pmatrix}1 & 1 \\ -i & i \end{pmatrix}$ such that $A^T \begin{pmatrix} 1 & i \\ i & -1 \end{pmatrix} A$ is diagonal. You can check that for the purposes of diagonalizing a quadratic form, it is the second notion of diagonalizability of a matrix that we need. – Charles Hudgins Apr 25 '19 at 11:04
  • Continued: It is this second notion of diagonalizability that is used in the answer I linked. – Charles Hudgins Apr 25 '19 at 11:07
  • @CharlesHudgins Forgetting about the issue of "orthonormality" in the answer you linked, the claim there is that there is a basis of eigenvectors for any symmetric matrix $M$ over $\mathbb{C}$. Wouldn't that imply there exists $A$ with $A^{-1}MA$ diagonal? – bart Apr 25 '19 at 11:11
  • or did the answer in that post prove a different claim from what was asked by the OP? – bart Apr 25 '19 at 11:13
  • In the answer I linked, it looks like the post proved that, given a quadratic form with matrix $Q$, we can find an invertible matrix $A$ such that $A^T D A = Q$ where $D$ is diagonal (apologies: I wrote this condition backward in the previous comment). Note that we do not require that $A^T = A^{-1}$. In the original basis, say $e_1, \ldots e_n$, the quadratic form is $q(x) = x^T Q x$. Define a new basis $v_i = A e_i$. Then $q(x) = x^T Q x = x^T A^T D A x = (Ax)^T D (Ax)$, where the $Ax$ are linear combinations of the $x_i$ that make the quadratic form diagonal. – Charles Hudgins Apr 25 '19 at 11:35
  • I found a good reference that brings together everything we've been talking about if you just want an explanation of the whole problem: http://www2.math.ou.edu/~kmartin/ntii/chap7.pdf – Charles Hudgins Apr 25 '19 at 11:44
  • @CharlesHudgins great, thanks so much. – bart Apr 25 '19 at 11:48
  • @CharlesHudgins see the algorithm at http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr – Will Jagy Apr 25 '19 at 17:18
  • @WillJagy Thanks Will, your step 2 case III was the answer to my question. – bart Apr 26 '19 at 00:38
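To make the distinction discussed in the comments concrete: the matrix $\begin{pmatrix} 1 & i \\ i & -1\end{pmatrix}$ is congruence-diagonalizable but not similarity-diagonalizable. A sketch check in sympy (my illustration; the particular $A$ below is just one invertible matrix that happens to work):

```python
import sympy as sp

Q = sp.Matrix([[1, sp.I], [sp.I, -1]])

# Q is nonzero but Q**2 = 0, so its only eigenvalue is 0; if some
# A^{-1} Q A were diagonal it would be the zero matrix, contradiction.
# Hence Q is NOT similarity-diagonalizable.
assert Q**2 == sp.zeros(2, 2)

# Yet congruence (A^T Q A, with A invertible but not orthogonal)
# does diagonalize Q, which is what diagonalizing the associated
# quadratic form actually requires.
A = sp.Matrix([[1, 1], [-sp.I, sp.I]])
assert A.det() != 0
assert A.T * Q * A == sp.diag(4, 0)
```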

1 Answer


Following @WillJagy's link in the comments: if no squared term appears, pick some nonzero cross term, say $cxy$ ($c\in K$ nonzero, $K$ the ground field), and substitute $x+y$ for $y$ throughout. The new expression then contains $x^2$ with nonzero coefficient $c$ (no other term can contribute to $x^2$), so we complete the square and continue as before.
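A concrete sympy check of this substitution on the smallest example, $cxy$ with $c = 1$ (my illustration, not part of the original answer):

```python
import sympy as sp

x, y = sp.symbols('x y')
q = x * y                                # no squared term appears

# Substitute x + y for y: now x**2 appears with nonzero coefficient...
q2 = sp.expand(q.subs(y, x + y))         # gives x**2 + x*y
assert q2.coeff(x, 2) == 1

# ...so we can complete the square as usual:
#   x**2 + x*y = (x + y/2)**2 - (y/2)**2,
# and over an algebraically closed field -(y/2)**2 = (sqrt(-1)*y/2)**2
# is itself the square of a linear form.
sq = (x + y / 2)**2 - (y / 2)**2
assert sp.expand(q2 - sq) == 0
```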

bart