
Let $n$ be a positive integer. Let $A$ be an $n\times n$ matrix with real entries. Let $W$ be a subspace of $\mathbb{R}^n$ such that for every $\textbf{w}$ in $W$, $A\mathbf{w}$ is also in $W$. Suppose $A$ is diagonalisable and has exactly three distinct eigenvalues $\lambda_1,\lambda_2,$ and $\lambda_3$. Show that $W$ has a basis consisting of eigenvectors of $A$.

My thought:

So, it is clear that the corresponding eigenvectors $\mathbf{v}_1,\mathbf{v}_2$ and $\mathbf{v}_3$ are linearly independent.

On the other hand, we know that $A$ has rank $3$.

If we can show that $\dim W=\operatorname{rank} A=3$, and that the three eigenvectors belong to $W$, then the problem is solved.

My problem:

  1. The three eigenvectors do not necessarily belong to $W$. For instance, $W$ can simply be $\{\mathbf{0}\}$!
  2. How are the dimension of $W$ and the rank of $A$ related?

Perhaps my progress is totally off course. I am nearly clueless on this question.

Thanks for any answers in advance.

  • 1
    Having exactly three eigenvalues does not mean $n=3$, so you cannot conclude $\operatorname{rank} A=3$; on the other hand, one of those eigenvalues can be $0$, which means $A$ need not have full rank either! – Angae MT Aug 14 '24 at 06:45
  • 1
    See https://math.stackexchange.com/questions/62338/diagonalizable-transformation-restricted-to-an-invariant-subspace-is-diagonaliza/78090#78090 –  Aug 14 '24 at 06:48
  • @hft1 Sorry, but I can’t see how that theorem is related. Could you please elaborate a bit? – kotori061025 Aug 14 '24 at 06:56
  • 1
    @kotori061025 Essentially, you show that $A$ remains diagonalizable when restricting to an invariant subspace. The existence of an eigenbasis is equivalent to being diagonalizable. –  Aug 14 '24 at 07:00
  • @hff1 Now I see it. My question somewhat duplicates that one. Thanks for telling me. – kotori061025 Aug 14 '24 at 07:11
  • @Zuka it’s not important – Chris Aug 16 '24 at 18:51

2 Answers

1
  1. Having exactly $3$ eigenvalues does not necessarily mean that $\operatorname{rank}(A) = 3$. Consider as a counterexample (with $\lambda_1,\lambda_2,\lambda_3$ all nonzero): \begin{equation} A= \begin{pmatrix} \lambda_1 & 0 & 0 & 0\\ 0 & \lambda_2 & 0 & 0\\ 0 & 0 & \lambda_3 & 0\\ 0 & 0 & 0 & \lambda_3 \\ \end{pmatrix} \end{equation} Then $A$ has exactly $3$ eigenvalues and $\operatorname{rank}(A) = 4$.

  2. The dimension of $W$ and the rank of $A$ are not related in general. With diagonal matrices it is easy to construct, for any $d, r \leq n$, a matrix $A \in \mathcal{M}_n(\mathbb{R})$ together with an invariant subspace $W$ such that $\operatorname{rank}(A) = r$ and $\dim(W) = d$.

  3. To solve your problem, you have to consider the restriction of $A$ to $W$ as an endomorphism of $W$ (that is, $A|_W \in \mathcal{L}(W)$) and show that this endomorphism is diagonalisable.
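One way to carry out step 3 is via the minimal polynomial (a sketch, assuming the standard fact that an operator is diagonalisable if and only if its minimal polynomial splits into distinct linear factors):

```latex
Since $A$ is diagonalisable with exactly the eigenvalues
$\lambda_1, \lambda_2, \lambda_3$, its minimal polynomial is
\[
  m(x) = (x-\lambda_1)(x-\lambda_2)(x-\lambda_3),
\]
a product of distinct linear factors, and $m(A) = 0$.
Because $AW \subseteq W$, for every $\mathbf{w} \in W$ we have
\[
  m(A|_W)\,\mathbf{w} = m(A)\,\mathbf{w} = \mathbf{0},
\]
so the minimal polynomial of $A|_W$ divides $m(x)$ and hence also
splits into distinct linear factors. Therefore $A|_W$ is
diagonalisable, and an eigenbasis of $A|_W$ is a basis of $W$
consisting of eigenvectors of $A$.
```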

0

Once we show that $\sigma(A|_W)\subseteq \sigma(A)$, your desired result follows easily, since $$A(w_{\lambda}) = A|_W(w_{\lambda}) = \lambda w_{\lambda}$$ shows that eigenvectors of $A|_W$ are eigenvectors of $A$.


The following was inspired by Arturo's proof, as suggested by hff1 above.

Since we are in $\mathbb{R}^n$, we may write $V = W\oplus W^{\perp}$ and choose a basis $B$ adapted to this decomposition; applying "$AW\subseteq W$", the matrix representation is block upper triangular: $$[A]_B = \begin{bmatrix}A_W & *\\ 0 & A_{W^{\perp}}\end{bmatrix}.$$ From here, it is clear that the characteristic polynomial of $A|_W$, $$P_W(x) := \det(A_W - xI_k),$$ divides that of $A$: since $[A]_B$ is block triangular, its characteristic polynomial is the product of those of the two diagonal blocks. (I labeled the lower block $A_{W^{\perp}}$, but we do not yet know whether the images of the remaining basis vectors stay in $W^{\perp}$, hence the $*$ in the upper-right block.)

Hence we have the spectral containment.

I suppose we require $A$ to be diagonalizable to ensure the basis doesn't need any generalized eigenvectors: the minimal polynomial of $A|_W$ divides that of $A$, which splits into distinct linear factors, so $A|_W$ is diagonalizable as well. $\square$
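As a numerical sanity check of the spectral containment, here is a minimal sketch with a made-up diagonal $A$ and a hand-picked invariant subspace $W$ (both purely illustrative, not part of the proof):

```python
import numpy as np

# Hypothetical example: A is diagonalizable with exactly three distinct
# eigenvalues 1, 2, 3 (the eigenvalue 3 has multiplicity two).
A = np.diag([1.0, 2.0, 3.0, 3.0])

# W = span{e1, e3 + e4} is A-invariant:
# A e1 = 1*e1 and A(e3 + e4) = 3*(e3 + e4).
B = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])  # columns form a basis of W

# Matrix M of the restriction A|_W in the basis B, obtained from A B = B M.
# lstsq solves this exactly here because the columns of A B lie in col(B).
M, *_ = np.linalg.lstsq(B, A @ B, rcond=None)

spectrum_W = sorted(np.round(np.linalg.eigvals(M).real, 6).tolist())
print(spectrum_W)                          # spectrum of A|_W
print(set(spectrum_W) <= {1.0, 2.0, 3.0})  # sigma(A|_W) ⊆ sigma(A)
```

Here $M$ comes out diagonal with entries $1$ and $3$, so $\sigma(A|_W) = \{1, 3\} \subseteq \{1, 2, 3\} = \sigma(A)$, as the block-triangular argument predicts.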

Zuka
  • 1,238