15

It is well known that if an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, then its eigenvectors form a basis.

Also, if $A$ is (real) symmetric, the same result holds.
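
As a quick sanity check of these two facts (my addition, not part of the original question), here is a small SymPy sketch: a symmetric matrix with distinct eigenvalues, whose eigenvectors are indeed linearly independent and hence form a basis of $\mathbb{R}^2$.

```python
from sympy import Matrix

# A symmetric matrix with distinct eigenvalues 1 and 3
S = Matrix([[2, 1],
            [1, 2]])

# Collect one basis vector per eigenspace
vects = [v for (val, mult, basis) in S.eigenvects() for v in basis]

# The eigenvectors are linearly independent, so they form a basis of R^2
print(Matrix.hstack(*vects).rank())   # 2
```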

Consider

$ A =\left[ {\begin{array}{ccc} 1 & 2 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \\ \end{array}}\right] $ .

This matrix has a single eigenvalue $\lambda=1$ and is not symmetric.

But the eigenvectors corresponding to $\lambda=1$ ($ v_1 =\left[ {\begin{array}{c} 1 \\ 0 \\ 0 \\ \end{array}}\right] $ , $ v_2 =\left[ {\begin{array}{c} 0 \\ 1/2 \\ 0 \\ \end{array}}\right] $ , $ v_3 =\left[ {\begin{array}{c} 0 \\ -3/8 \\ 1/4 \\ \end{array}}\right] $ ) form a basis.

What sufficient conditions guarantee the above result?

  • 7
    None of $v_2, v_3$ is an eigenvector of $A$ with respect to $\lambda=1$... In fact, your $A$ has only one linearly independent eigenvector with respect to its unique eigenvalue, which can be taken to be $v_1$ (see the sketch below these comments). – DonAntonio May 09 '17 at 09:10
  • This is the definition of diagonalisable. There are many things known about diagonalisability; read your textbook, this question is not asking about anything specific. I am voting to close as "not clear what you are asking". – Marc van Leeuwen May 09 '17 at 11:51
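
A quick way to verify the first comment above is to compute the eigenspace of $A$ directly. The following is a small SymPy sketch (my illustration, not part of the original question); it confirms that the eigenspace for $\lambda = 1$ is one-dimensional, so $v_2$ and $v_3$ above cannot be eigenvectors.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for val, alg_mult, basis in A.eigenvects():
    print(f"lambda = {val}: algebraic multiplicity {alg_mult}, "
          f"geometric multiplicity {len(basis)}")
# lambda = 1: algebraic multiplicity 3, geometric multiplicity 1
```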

2 Answers

16

A square matrix is diagonalizable if and only if there exists a basis of eigenvectors. That is, $A$ is diagonalizable if and only if there exists an invertible matrix $P$ such that $P^{-1}AP=D$, where $D$ is a diagonal matrix.
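
As an illustration of this definition, here is a hedged SymPy sketch of my own (not part of the original answer): `diagonalize()` returns such a pair $(P, D)$ when one exists, and the question's matrix $A$ fails the test.

```python
from sympy import Matrix

# Distinct eigenvalues (1 and 2), hence diagonalizable
B = Matrix([[1, 1],
            [0, 2]])
P, D = B.diagonalize()          # D is diagonal, B == P * D * P**-1
assert B == P * D * P.inv()

# The matrix A from the question is not diagonalizable
A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])
print(A.is_diagonalizable())    # False
```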

One can show that a matrix is diagonalizable precisely when the dimension of each eigenspace equals the algebraic multiplicity of the corresponding eigenvalue as a root of the characteristic polynomial.

If the dimension of an eigenspace is smaller than the algebraic multiplicity, there is a deficiency: the eigenvectors no longer form a basis, since they no longer span the whole space. One can still extend the set of eigenvectors to a basis with so-called generalized eigenvectors; rewriting the matrix with respect to this basis yields an upper triangular matrix whose nonzero entries lie only on the diagonal and the superdiagonal (the 'second diagonal'). This is the Jordan normal form, which captures the failure of the eigenvectors to form a basis.
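
To make this concrete for the matrix $A$ from the question, here is a small SymPy sketch (my addition, not part of the original answer): `jordan_form()` returns a matrix $P$ of generalized eigenvectors together with the Jordan normal form $J$, which here is a single $3 \times 3$ Jordan block.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])

# P contains generalized eigenvectors; J is the Jordan normal form
P, J = A.jordan_form()
assert A == P * J * P.inv()
print(J)
# Matrix([[1, 1, 0],
#         [0, 1, 1],
#         [0, 0, 1]])
```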

2

Let $A$ be the matrix of a linear transformation $\alpha: V\to V$, where $V$ is an $n$-dimensional vector space over the field $\mathbb{F}$.

$V$ has a basis consisting of eigenvectors of $\alpha$ (or $A$ if you prefer) if, and only if, the minimal polynomial $m_\alpha(X)$ of $\alpha$ (or of $A$ if you prefer) is a product of distinct linear factors in $\mathbb{F}[X]$.

The proof is in any decent linear algebra textbook.
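
As a concrete check of this criterion for the matrix $A$ in the question, here is a short SymPy sketch (my illustration, not the answerer's): it verifies that $(A-I)^2 \neq 0$ while $(A-I)^3 = 0$, so the minimal polynomial is $(X-1)^3$, which is not a product of distinct linear factors, consistent with $A$ having no basis of eigenvectors.

```python
from sympy import Matrix, eye, zeros

A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])
N = A - eye(3)

# (A - I)^2 != 0 but (A - I)^3 == 0, so the minimal polynomial is (X - 1)^3,
# not a product of distinct linear factors: no basis of eigenvectors.
print(N**2 == zeros(3, 3))   # False
print(N**3 == zeros(3, 3))   # True
```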

ancient mathematician