
I have a matrix $$ A=\begin{bmatrix}6 & 9 &15 \\ -5&-10 & -21 \\2&5&11\end{bmatrix} $$

The characteristic polynomial is $ x^3-7x^2+16x-12 $. From this I have worked out the eigenvalues to be $2,2,3$ and the corresponding eigenvectors to be

$ \begin{bmatrix} 3\\-3\\1 \end{bmatrix} $ for the eigenvalue $2$, and

$ \begin{bmatrix} 1\\-2\\1 \end{bmatrix} $ for the eigenvalue $3$.
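One way to double-check these values by machine is the short sympy sketch below (this assumes sympy is available; the printed basis vectors may come out scaled differently than mine):

```python
from sympy import Matrix

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])

# eigenvals() returns {eigenvalue: algebraic multiplicity}
print(A.eigenvals())            # {2: 2, 3: 1}

# eigenvects() returns (eigenvalue, multiplicity, basis of the eigenspace)
for val, mult, vecs in A.eigenvects():
    print(val, mult, [list(v) for v in vecs])
```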

However, I'm not too sure what is meant by the algebraic multiplicity, the eigenspace, and the geometric multiplicity.

The question further on asks me to compute the Jordan form, which I got to be $$ \begin{bmatrix}3 & 0 &0 \\ 0&2 & 1 \\0&0&2\end{bmatrix} $$

It then asks me to find the transition matrix $P$ by assembling the generalized eigenvectors. I've computed the generalized eigenvectors to be $ \begin{bmatrix}-6 & -3 &1 \\ 0&1 & -2 \\1&0&1\end{bmatrix} $, but I'm not sure how to find the transition matrix $P$ from this.
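One way to sanity-check a candidate Jordan form is sympy's `jordan_form` (a sketch, assuming sympy; note that sympy puts the $1$'s above the diagonal and may order the blocks differently, so its output need not match column for column):

```python
from sympy import Matrix

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])

# jordan_form() returns a transition matrix and the Jordan form
P, J = A.jordan_form()
print(J)                      # Jordan blocks for eigenvalues 2 and 3
print(P.inv() * A * P == J)   # True
```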

  • The algebraic multiplicity of an eigenvalue $\lambda$ is, by definition, the largest integer $k$ such that $(x-\lambda)^k$ divides the characteristic polynomial. In this case the algebraic multiplicity of $2$ is $2$ and the algebraic multiplicity of $3$ is $1$. The geometric multiplicity of $\lambda$ is the dimension of its eigenspace, that is, the dimension of $\{X\in \Bbb C^{3\times 1}:AX=\lambda X\}$. In this case the geometric multiplicity of both $2$ and $3$ is $1$. (A short sympy sketch of both notions follows after these comments.) – Git Gud Feb 19 '13 at 19:01
  • How did you compute the JNF of $A$ without knowing this stuff? I don't get it. – Git Gud Feb 19 '13 at 19:06
  • So in this case, the algebraic multiplicity would be $2$, as it is the largest $k$ value? ... I've memorised the method of computing the JNF so I don't exactly understand it; however, I'm currently attempting to understand it. – Sam Feb 19 '13 at 19:13
  • The algebraic multiplicity is relative to each eigenvalue, there isn't an absolute algebraic multiplicity for $A$. The algebraic multiplicity of $2$ is $2$ and $3's$ is $1$. Is this any clearer? – Git Gud Feb 19 '13 at 19:14
  • Yes, this is clearer. Is this the same for the geometric multiplicity? So they'd both be $1$? As in, are they relative to each eigenvector? – Sam Feb 19 '13 at 19:16
  • I'm not $100$% sure you got it, but I think you did. – Git Gud Feb 19 '13 at 19:18
  • I think I understand: the geometric multiplicity is $1$ for both values of $\lambda$ as they both produce only one eigenvector each? – Sam Feb 19 '13 at 19:20
  • @Sam Yeah, that's it. Plus: *\lambda – Git Gud Feb 19 '13 at 19:20
  • I added the JNF to the solution here: http://math.stackexchange.com/questions/308117/how-to-compute-nullspace-on-maple/308210#308210. This is a possible duplicate of that problem as it is worked out. – Amzoti Feb 19 '13 at 19:26
  • @GitGud one last quick question, what is the eigenspace? – Sam Feb 19 '13 at 19:54
  • @Sam The eigenspace for an eigenvalue $\lambda$ is the set I described in my first comment. Again, eigenspaces are relative to each eigenvalue and not absolute to a matrix. – Git Gud Feb 19 '13 at 19:58
  • So in my question, using my eigenvalues $2$ and $3$, what would my eigenspaces be? – Sam Feb 19 '13 at 20:00
  • @Sam $E(2)=\{X\in \Bbb C^{3\times 1}: AX=2X\}$ and, according to your computations (which I didn't check), it follows that $E(2)=\langle(3,-3,1)\rangle$. The other one is similar. – Git Gud Feb 19 '13 at 20:04
  • ohhh okay i understand it now :) thankyouuuuu :D – Sam Feb 19 '13 at 20:06
  • @GitGud Can you please tell me which chapters of Hoffman and Kunze these topics are from: characteristic roots and characteristic vectors of a linear transformation or of a matrix, algebraic and geometric multiplicity of a characteristic value, Cayley-Hamilton theorem, diagonalizable operators and matrices, minimal polynomial of a linear operator (matrix). Thank you. – Taylor Ted Jul 11 '15 at 16:10
  • @JPG I'm sorry, I don't know that book. Can't help you. – Git Gud Jul 11 '15 at 23:49
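To make the definitions from the comments concrete, here is a minimal sympy sketch (sympy is an assumption; any CAS would do) that reads off the algebraic multiplicities, and computes each eigenspace and its dimension, i.e. the geometric multiplicity:

```python
from sympy import Matrix, eye

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])

# algebraic multiplicity: exponent of (x - lambda) in the characteristic polynomial
print(A.eigenvals())                                 # {2: 2, 3: 1}

# geometric multiplicity of lambda = dim E(lambda) = dim null(A - lambda*I)
for lam in (2, 3):
    basis = (A - lam * eye(3)).nullspace()
    print(lam, len(basis), [list(v) for v in basis])  # dimension 1 for both
```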

1 Answer


There is a nice way of getting everything you ask for if you have a factorization of the characteristic polynomial. It uses only the Cayley-Hamilton theorem: if the characteristic polynomial of $A$ is $\operatorname{Char}_{A}\left( X\right) $, then $\operatorname{Char}_{A}\left( A\right) =0$. It also has built-in checks of your work. (It helps of course to have a CAS do the arithmetic.) An extension of this to higher dimensions can be used to prove the nature of the Jordan Canonical Form. One minor comment: I write my JCF with the $1$'s below the diagonal.

Here you have $$ A = \left[\begin{matrix} 6 & 9 & 15\\ -5 & -10 & -21\\ 2 & 5 & 11 \end{matrix}\right] \quad\text{and}\quad \operatorname{Char}_{A}\left( X\right) =X^{3}-7X^{2}+16X-12=\left( X-3\right) \left( X-2\right) ^{2} $$
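If you want a CAS to confirm the factorization, a short sympy sketch will do (I am only taking sympy as the CAS here; any other works the same way):

```python
from sympy import Matrix, symbols, factor

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])
X = symbols('X')

# charpoly returns the characteristic polynomial of A in the variable X
p = A.charpoly(X).as_expr()
print(p)            # X**3 - 7*X**2 + 16*X - 12
print(factor(p))    # (X - 3)*(X - 2)**2
```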

$$\operatorname{Char}_{A}\left( A\right) =(A-3\mathbf{I})\cdot\left( A-2\mathbf{I}\right) ^{2}= \left[\begin{matrix} 3 & 9 & 15\\ -5 & -13 & -21\\ 2 & 5 & 8 \end{matrix}\right] \cdot \left[\begin{matrix} 4 & 9 & 15\\ -5 & -12 & -21\\ 2 & 5 & 9 \end{matrix}\right] ^{2} $$

$$= \left[\begin{matrix} 3 & 9 & 15\\ -5 & -13 & -21\\ 2 & 5 & 8 \end{matrix}\right] \left[\begin{matrix} 1 & 3 & 6\\ -2 & -6 & -12\\ 1 & 3 & 6 \end{matrix}\right] = \left[\begin{matrix} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0 \end{matrix}\right] $$
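The Cayley-Hamilton check above can also be done by machine; a minimal sketch, again assuming sympy:

```python
from sympy import Matrix, eye, zeros

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])
I = eye(3)

# Cayley-Hamilton: Char_A(A) = (A - 3I)(A - 2I)^2 should be the zero matrix
print((A - 3*I) * (A - 2*I)**2 == zeros(3, 3))   # True
```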

Now the column space of $\left( A-2\mathbf{I}\right) ^{2}$ is the kernel of $(A-3\mathbf{I})$, so an eigenvector for eigenvalue $3$ is $ \left[\begin{matrix} 1\\ -2\\ 1 \end{matrix}\right] $ and this spans the generalized eigenspace for eigenvalue $3$. Similarly the generalized eigenspace for eigenvalue $2$ is the column space of $(A-3\mathbf{I})= \left[\begin{matrix} 3 & 9 & 15\\ -5 & -13 & -21\\ 2 & 5 & 8 \end{matrix}\right] $. Now look at \begin{align*} \left( A-2\mathbf{I}\right) (A-3\mathbf{I}) & = \left[\begin{matrix} 4 & 9 & 15\\ -5 & -12 & -21\\ 2 & 5 & 9 \end{matrix}\right] \cdot \left[\begin{matrix} 3 & 9 & 15\\ -5 & -13 & -21\\ 2 & 5 & 8 \end{matrix}\right] = \left[\begin{matrix} -3 & -6 & -9\\ 3 & 6 & 9\\ -1 & -2 & -3 \end{matrix}\right] \neq \left[\begin{matrix} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0 \end{matrix}\right] \\ \left( A-2\mathbf{I}\right) \left( A-2\mathbf{I}\right) (A-3\mathbf{I}) & =\left[\begin{matrix} 4 & 9 & 15\\ -5 & -12 & -21\\ 2 & 5 & 9 \end{matrix}\right] \cdot \left[\begin{matrix} -3 & -6 & -9\\ 3 & 6 & 9\\ -1 & -2 & -3 \end{matrix}\right] = \left[\begin{matrix} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0 \end{matrix}\right] \end{align*}
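These claims (the first column of $(A-2\mathbf{I})^{2}$ is an eigenvector for $3$, and one product is nonzero while the other vanishes) can be confirmed with the same kind of sketch, still assuming sympy:

```python
from sympy import Matrix, eye, zeros

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])
I = eye(3)

# first column of (A - 2I)^2 is an eigenvector for eigenvalue 3
v3 = ((A - 2*I)**2).col(0)
print(A * v3 == 3 * v3)                                    # True

# (A - 2I)(A - 3I) is nonzero, but one more factor of (A - 2I) kills it
print((A - 2*I) * (A - 3*I) == zeros(3, 3))                # False
print((A - 2*I) * (A - 2*I) * (A - 3*I) == zeros(3, 3))    # True
```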

That is all the arithmetic we have to do, with additional checks built in. For the transition matrix we take as column 1 an eigenvector for eigenvalue $3$, as column 3 an eigenvector for eigenvalue $2$, say $ \left[\begin{matrix} -3\\ 3\\ -1 \end{matrix}\right] $, which is the first column of $\left( A-2\mathbf{I}\right) (A-3\mathbf{I})$, and as column 2 the same (first) column of $(A-3\mathbf{I})$, that is $ \left[\begin{matrix} 3\\ -5\\ 2 \end{matrix}\right] $, to get $$ P= \left[\begin{matrix} 1 & 3 & -3\\ -2 & -5 & 3\\ 1 & 2 & -1 \end{matrix}\right] $$
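A short sketch of this assembly of $P$ from first columns, assuming sympy:

```python
from sympy import Matrix, eye

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])
I = eye(3)

# column 1: eigenvector for 3 (first column of (A - 2I)^2)
# column 2: first column of (A - 3I), a generalized eigenvector for 2
# column 3: first column of (A - 2I)(A - 3I), an eigenvector for 2
c1 = ((A - 2*I)**2).col(0)
c2 = (A - 3*I).col(0)
c3 = ((A - 2*I) * (A - 3*I)).col(0)
P = c1.row_join(c2).row_join(c3)
print(P)    # Matrix([[1, 3, -3], [-2, -5, 3], [1, 2, -1]])
```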

I check this with my CAS: $P^{-1}AP= \left[\begin{matrix} 3 & 0 & 0\\ 0 & 2 & 0\\ 0 & 1 & 2 \end{matrix}\right] .$
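If your CAS happens to be sympy, that check looks like this (a sketch, not the only way):

```python
from sympy import Matrix

A = Matrix([[6, 9, 15],
            [-5, -10, -21],
            [2, 5, 11]])
P = Matrix([[1, 3, -3],
            [-2, -5, 3],
            [1, 2, -1]])

print(P.inv() * A * P)   # Matrix([[3, 0, 0], [0, 2, 0], [0, 1, 2]])
```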

Note that you get the algebraic multiplicities from the characteristic polynomial, and the geometric multiplicities, the JCF, and the transition matrix for my JCF, all without solving any systems of linear equations. If you want your $1$'s above the diagonal, work with rows rather than columns, that is, transpose everything.