2

How can I show that the minimal polynomial of a diagonal matrix is the product of the distinct linear factors $(A-\lambda_{j}I)$? In particular, if we have a repeated eigenvalue, why is it that we only count the factor associated with that eigenvalue once?

I know by the Cayley-Hamilton theorem that the characteristic polynomial $p(t)$, i.e. the product of all the linear factors (not necessarily distinct), satisfies $p(A) = 0$. But I'm uncertain how this can be simplified for diagonal matrices when there is a repeated eigenvalue.

asdfghjkl
  • 1,405
  • Maybe the example of a scalar matrix $A = \lambda{}I$ will enlighten you : what would $A-\lambda{}I$ be ? – Traklon Jan 30 '14 at 08:50
  • Hint: when a polynomial has multiple roots, after removing the repeated factors, the roots remain the same. E.g. any root of $(x-1)^3(x+3)$ is a root of $(x-1)(x+3)$. –  Jun 24 '15 at 07:01
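The second comment's observation is easy to check numerically. A minimal sketch using sympy (the polynomial and root set are taken straight from the hint; the library choice is an assumption):

```python
import sympy as sp

x = sp.symbols('x')

# The hint's polynomial with a repeated root, and its squarefree part
p = (x - 1)**3 * (x + 3)
q = (x - 1) * (x + 3)

# Removing repeated factors leaves the root set unchanged: {1, -3}
print(sp.solveset(p, x))                      # {-3, 1}
print(sp.solveset(p, x) == sp.solveset(q, x))  # True
```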

2 Answers

2

To start with, the minimal polynomial is the monic polynomial $P$ of smallest degree which satisfies the relation $P(M)=0$, where $M$ is your matrix. For a diagonal matrix, take the polynomial whose factors are $(x-a)$, where $a$ runs over the *distinct* diagonal entries (the eigenvalues). Each such factor need only be written once: $(M-aI)$ already zeroes out every diagonal slot equal to $a$, so the product over the distinct eigenvalues satisfies $P(M)=0$ (the proof is a direct computation).
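This direct computation can be sketched numerically. Below is a minimal check with numpy; the matrix entries (eigenvalue 2 repeated, plus 3 and 5) are an arbitrary example, not from the thread:

```python
import numpy as np

# Example diagonal matrix with a repeated eigenvalue: 2, 2, 3, 5
A = np.diag([2.0, 2.0, 3.0, 5.0])
I = np.eye(4)

# Product of the DISTINCT linear factors only: (A - 2I)(A - 3I)(A - 5I).
# Each factor zeroes out its own diagonal slots, so the product vanishes
# even though the eigenvalue 2 appears twice.
P = (A - 2 * I) @ (A - 3 * I) @ (A - 5 * I)
print(np.allclose(P, 0))  # True

# Dropping any distinct eigenvalue leaves a nonzero matrix, so the
# minimal polynomial genuinely needs every distinct factor once.
Q = (A - 2 * I) @ (A - 3 * I)
print(np.allclose(Q, 0))  # False: the slot with eigenvalue 5 survives
```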

0

HINT:

$$\begin{pmatrix}D_m&0&0\\0&\color{red}0&0\\0&0&D_{n-m-1}\end{pmatrix}\cdot D'_n=\begin{pmatrix}D''_m&0&0\\0&\color{red}0&0\\0&0&D''_{n-m-1}\end{pmatrix}$$ where the $D_i$ are arbitrary diagonal matrices of dimension $i$.

b00n heT
  • Since the eigenvalues of a diagonal matrix are the elements of the diagonal, for any eigenvalue $\lambda$, $A-\lambda I$ sets to zero exactly those diagonal entries $a_{ii}$ which are equal to $\lambda$. Now multiply $A-\lambda I$ by any diagonal matrix $B$: the $c_{ii}$ entry of $(A-\lambda I)B$ will still be zero. Doing this once for each distinct eigenvalue therefore yields the null matrix. – b00n heT Jan 30 '14 at 09:30
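The key step in this comment is that a zero on the diagonal survives multiplication by any other diagonal matrix. A minimal numeric illustration (the particular entries are invented for the example):

```python
import numpy as np

# A has eigenvalue 4 in slot (0, 0); subtracting 4I zeroes that slot.
A = np.diag([4.0, 7.0, 9.0])
F = A - 4.0 * np.eye(3)         # diag(0, 3, 5)

# Multiply by an arbitrary diagonal matrix B: the zero slot stays zero,
# because diagonal matrices multiply entrywise along the diagonal.
B = np.diag([10.0, -2.0, 0.5])
C = F @ B
print(C[0, 0])                   # 0.0 -- the zero created by (A - 4I) persists
```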