
For all $\mathbf{x}$, we see that $\mathbf{A}^n\mathbf{x}=\mathbf{0}$. Therefore we can construct a basis of linearly independent eigenvectors of $\mathbf{A}^n$ (each with eigenvalue $0$). It follows that the eigenvectors of $\mathbf{A}$ must also form a basis, with eigenvalues $0^{1/n}=0$. Since a square matrix is diagonalizable iff there is a basis of its eigenvectors, there exists a basis in which $\mathbf{A} = \mathrm{diag}(0,\cdots,0) = \mathbf{0}$. But $\mathbf{P}^{-1}\mathbf{0}\mathbf{P}=\mathbf{0}$ for every invertible $\mathbf{P}$, so $\mathbf{A} = \mathbf{0}$ in any basis.

Is this proof complete? Where have I used the fact that $\mathbf{A}$ is real symmetric? The question actually asks you to consider the quadratic form $Q = \mathbf{x}^T\mathbf{A}\mathbf{x}$, but how does this help? Are there any other proofs?

  • You used that $\mathbf A$ is symmetric in the first argument ("therefore we can..."); this is the spectral theorem. – nicomezi Apr 09 '22 at 08:42
  • @nicomezi but does that not follow from the fact that $\mathbf{A}^n\mathbf{x}=\mathbf{0}$ for any vector $\mathbf{x}$ - which doesn't make a reference to $\mathbf{A}$ being symmetric? – user246795 Apr 09 '22 at 08:46
  • No, because in general matrices are not diagonalizable. Basically your proof is saying: any nilpotent matrix which is diagonalizable is zero, and any real symmetric matrix is diagonalizable, therefore any real symmetric nilpotent matrix is zero. – Captain Lama Apr 09 '22 at 08:48
  • You never explained why the eigenvectors of $A^n$ are the same as those of $A$. – Exodd Apr 09 '22 at 08:57
  • A symmetric matrix with real entries is always diagonalizable... – Jean Marie Apr 09 '22 at 08:58
  • The statement as it stands is false. Consider $A=\pmatrix{1&i\\ i&-1}$ for instance. It is true over $\mathbb R$, however, and we can actually prove a stronger statement. – user1551 Apr 09 '22 at 08:58
  • I should also point out that as such, the statement is just false: you need to assume that the matrix has real coefficients. You don't even have to work with weird fields; the matrix $\begin{pmatrix} i & 1 \\ 1 & -i \end{pmatrix}$ is already a counterexample over $\mathbb{C}$. – Captain Lama Apr 09 '22 at 08:59
  • Seems like everyone had the same idea at the same time. – Captain Lama Apr 09 '22 at 09:00
  • @user1551 I have edited the question - thanks for pointing out the mistake! – user246795 Apr 09 '22 at 09:02
  • As for considering the associated quadratic form, I have no idea what the person who wrote the exercise was thinking about, and I suspect this is a mistake, because the matrix representation of bilinear forms has very little to do with matrix products, and I really don't see how you're going to deduce anything from the fact that $A$ is nilpotent. – Captain Lama Apr 09 '22 at 09:15
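The counterexample from the comments is easy to check numerically. A minimal sketch using numpy (an illustration only, not part of the thread): the complex matrix $\begin{pmatrix} i & 1 \\ 1 & -i \end{pmatrix}$ is symmetric (equal to its plain transpose, not its conjugate transpose), squares to zero, yet is nonzero.

```python
import numpy as np

# Captain Lama's counterexample over C: a complex *symmetric*
# (not Hermitian) matrix that is nilpotent but nonzero.
A = np.array([[1j, 1],
              [1, -1j]])

print(np.allclose(A, A.T))    # True: A equals its transpose
print(np.allclose(A @ A, 0))  # True: A^2 = 0, so A is nilpotent
print(np.allclose(A, 0))      # False: A itself is not the zero matrix
```

This is why the real-coefficients hypothesis is essential: over $\mathbb{C}$, symmetry (as opposed to Hermitian-ness) does not force diagonalizability.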

2 Answers


By induction. Let $n>1$; then for any $x, y \in V$, using $A^T = A$, $$0 = \langle A^n x, y \rangle = \langle A^{n-1} x, A y \rangle.$$ This shows that $A^{n-1}V$ is orthogonal to $AV$; in particular, since $A^{n-1}V \subseteq AV$, $A^{n-1}V$ is orthogonal to itself. Therefore $A^{n-1} = 0$.
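A quick numerical sanity check of the adjoint step this induction relies on (a sketch, not part of the original answer): for a real symmetric $A$, moving one factor of $A$ across the inner product leaves the value unchanged.

```python
import numpy as np

# Check <A^n x, y> = <A^(n-1) x, A y> for a random real symmetric A.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                       # random real symmetric matrix
x = rng.standard_normal(4)
y = rng.standard_normal(4)

n = 3
lhs = (np.linalg.matrix_power(A, n) @ x) @ y          # <A^n x, y>
rhs = (np.linalg.matrix_power(A, n - 1) @ x) @ (A @ y)  # <A^(n-1) x, A y>
print(np.isclose(lhs, rhs))       # True: A is self-adjoint
```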

WimC

Notice that for a real matrix $A$, $\operatorname{tr}(A^TA)$ is the sum of the squares of the entries of $A$, so $\operatorname{tr}(A^TA)=0$ implies $A=0$. Since $A$ is symmetric, $A^2=0$ gives $\operatorname{tr}(A^TA)=\operatorname{tr}(A^2)=0$, and hence $A=0$.

Moreover, if $A$ is symmetric and $A^n=0$, there exists $k\in\mathbb{N}$ such that $2^k\ge n$, and then $A^{2^k}=A^{n}A^{2^k-n}=0$. Since every power of $A$ is again symmetric, repeatedly applying the first step gives $$A^{2^k}=0\Longrightarrow A^{2^{k-1}}=0\Longrightarrow\cdots\Longrightarrow A=0.$$
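The trace identity at the heart of this argument can be checked numerically; a minimal sketch (an illustration, not part of the original answer) also shows where realness is used, via the complex counterexample from the comments.

```python
import numpy as np

# For real A, tr(A^T A) equals the sum of squared entries
# (the squared Frobenius norm), so it vanishes only when A = 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
print(np.isclose(np.trace(A.T @ A), np.sum(A**2)))  # True

# Over C the identity fails to control the entries: the symmetric
# nilpotent counterexample from the comments has tr(B^T B) = 0
# even though B is nonzero.
B = np.array([[1j, 1],
              [1, -1j]])
print(np.isclose(np.trace(B.T @ B), 0))  # True, yet B != 0
```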