
Let $A$ be a block upper triangular matrix:

$$A = \begin{bmatrix} A_{1,1}&A_{1,2}\\ 0&A_{2,2} \end{bmatrix}$$

where $A_{1,1} \in \mathbb{R}^{p \times p}$ and $A_{2,2} \in \mathbb{R}^{q \times q}$ are known to be Hermitian.

  • In fact, $A_{1,1}$ and $A_{2,2}$ are the 1-D and 2-D discretized Laplacian matrices (with Dirichlet boundary conditions), respectively; a standard construction is sketched after this list.
  • They are also known to be diagonalizable and invertible. We have a complete description of their eigenvalues and eigenvectors.
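For concreteness, here is a minimal NumPy sketch of two such diagonal blocks, assuming the standard second-order stencils (tridiagonal $(-1,2,-1)$ in 1-D and its Kronecker sum in 2-D); the $1/h^2$ scaling is omitted since it does not affect diagonalizability, and the block sizes are placeholders.

```python
import numpy as np

def laplacian_1d(m):
    """1-D Dirichlet Laplacian: tridiag(-1, 2, -1), size m x m (1/h^2 scaling omitted)."""
    return 2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)

def laplacian_2d(m):
    """2-D Dirichlet Laplacian on an m x m grid, built as a Kronecker sum."""
    L, I = laplacian_1d(m), np.eye(m)
    return np.kron(I, L) + np.kron(L, I)

A11 = laplacian_1d(5)   # p = 5
A22 = laplacian_2d(3)   # q = 9
# Both blocks are real symmetric (hence Hermitian) and positive definite,
# so each one is diagonalizable and invertible.
assert np.allclose(A11, A11.T) and np.allclose(A22, A22.T)
assert np.linalg.eigvalsh(A11).min() > 0 and np.linalg.eigvalsh(A22).min() > 0
```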

Now, I want to know if the matrix $A$ is diagonalizable.

My attempt: I looked at this math.SE post where the eigenvalues and eigenvectors of matrix $A$ are discussed. My strategy is to build a basis of $\mathbb{R}^{n}$ (where $n=p+q$) out of eigenvectors of $A$. This is a sufficient condition for the diagonalizability of $A$.

  • The eigenvalues of matrix $A$ are simply the union of the eigenvalues of $A_{1,1}$ and $A_{2,2}$, counted with multiplicity (a quick numerical check follows).
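Continuing the sketch above, the eigenvalue claim can be checked numerically; the coupling block $A_{1,2}$ is taken at random here, purely for illustration, since the question does not pin it down.

```python
rng = np.random.default_rng(0)
p, q = A11.shape[0], A22.shape[0]
A12 = rng.standard_normal((p, q))
A = np.block([[A11, A12],
              [np.zeros((q, p)), A22]])

spec_A = np.sort(np.linalg.eigvals(A).real)   # imaginary parts are ~0 here
spec_blocks = np.sort(np.concatenate([np.linalg.eigvalsh(A11),
                                      np.linalg.eigvalsh(A22)]))
assert np.allclose(spec_A, spec_blocks)       # spectrum of A = union of block spectra
```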

To produce the eigenvectors of $A$, the argument goes as follows:

Since $A_{1,1} \, p_1 = \lambda_1 p_1$ with $p_1 \ne 0$, we obtain some of the eigenvectors of $A$ as follows: $$ \left( \begin{matrix} A_{1,1}&A_{1,2} \\ 0 &A_{2,2} \end{matrix} \right) \left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) = \left( \begin{matrix} A_{1,1} \; p_1 \\ 0 \end{matrix} \right) = \left( \begin{matrix} \lambda_1 p_1 \\ 0 \end{matrix} \right) = \lambda_1 \left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) $$
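A quick sanity check of this padding construction, continuing the sketch above:

```python
# If A11 p1 = lam1 p1, then the padded vector [p1; 0] is an eigenvector
# of the block triangular matrix A for the same eigenvalue lam1.
lam1_all, P1 = np.linalg.eigh(A11)
lam1, p1 = lam1_all[0], P1[:, 0]
v = np.concatenate([p1, np.zeros(q)])
assert np.allclose(A @ v, lam1 * v)
```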


To find the remaining eigenvectors, the following strategy is suggested:

Suppose now that $\lambda_2$ is an eigenvalue of $A_{2,2}$ with eigenvector $p_2$.

Case I: Let's assume $\lambda_2$ is not an eigenvalue of $A_{1,1}$, hence $|A_{1,1} - \lambda_2 I|\ne 0$. Now

$$\left( \begin{matrix} A_{1,1}&A_{1,2} \\ 0 &A_{2,2} \end{matrix} \right) \left( \begin{matrix} x \\ p_2 \end{matrix} \right) = \left( \begin{matrix} A_{1,1} x + A_{1,2} p_2 \\ \lambda_2 p_2 \end{matrix} \right). $$ We can force $ A_{1,1} x + A_{1,2} p_2 = \lambda_2 x$ by choosing $$x = - (A_{1,1} - \lambda_2 I)^{-1} A_{1,2} \; p_2, $$ and so we have found an eigenvector of $A$ with eigenvalue $\lambda_2$.
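This construction can also be verified numerically, continuing the sketch above (for these particular block sizes the spectra of $A_{1,1}$ and $A_{2,2}$ happen not to overlap, so any eigenvalue of $A_{2,2}$ qualifies for Case I):

```python
lam2_all, P2 = np.linalg.eigh(A22)
lam2, p2 = lam2_all[0], P2[:, 0]
# x = -(A11 - lam2*I)^{-1} A12 p2, computed via a linear solve.
x = -np.linalg.solve(A11 - lam2 * np.eye(p), A12 @ p2)
w = np.concatenate([x, p2])
assert np.allclose(A @ w, lam2 * w)
```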

But the above choice of $x$ requires that $\lambda_2$ not be an eigenvalue of $A_{1,1}$. What should be done if $A_{1,1}$ and $A_{2,2}$ share some of their eigenvalues?

108_mk
  • $A = \begin{pmatrix} 1 & 1\\ 0 & 1\end{pmatrix}$ – Exodd May 24 '24 at 12:43
  • @Exodd, Thanks! I am aware of this non-diagonalizable case. Matrices $A_{1,1}$ and $A_{2,2}$ have ''almost'' all of their eigenvalues DISTINCT, except a few. – 108_mk May 24 '24 at 12:48
  • @Exodd, I would like to know: if matrices $A_{1,1}$ and $A_{2,2}$ have a common eigenvalue, does that mean $A$ is not diagonalizable? Thanks! – 108_mk May 24 '24 at 12:50

2 Answers


If $A_{1,1}$ and $A_{2,2}$ have no eigenvalues in common, the matrix is straightforwardly diagonalizable: both blocks are diagonalizable, and the construction in the question produces $n$ linearly independent eigenvectors of $A$. If $A_{1,1}$ and $A_{2,2}$ share eigenvalues, $A$ may still be diagonalizable; this is the case exactly when the geometric multiplicity equals the algebraic multiplicity for every shared eigenvalue.
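A rough numerical way to test this multiplicity criterion (a heuristic sketch: it clusters eigenvalues and estimates ranks with a tolerance, so it is an indication, not a proof; the 2-by-2 Jordan block from the comments above is correctly flagged as non-diagonalizable):

```python
import numpy as np

def is_diagonalizable(M, tol=1e-8):
    """Compare the algebraic multiplicity of each eigenvalue cluster with its
    geometric multiplicity (the nullity of M - lam*I).  Tolerance-based heuristic."""
    n = M.shape[0]
    evals = np.linalg.eigvals(M)
    used = np.zeros(n, dtype=bool)
    for i, lam in enumerate(evals):
        if used[i]:
            continue
        cluster = np.abs(evals - lam) < tol      # algebraic multiplicity of the cluster
        used |= cluster
        alg = int(cluster.sum())
        geo = n - np.linalg.matrix_rank(M - lam * np.eye(n), tol=tol)
        if geo < alg:
            return False
    return True

assert not is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]]))
```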

zinsinho

Since the two diagonal sub-blocks are diagonalisable (because they are Hermitian), $A$ is diagonalisable if and only if it is similar to $B=\pmatrix{A_{1,1}\\ &A_{2,2}}$. Consequently, by Roth’s removal rule, $A$ is diagonalisable if and only if the matrix equation $A_{1,1}X-XA_{2,2}=A_{1,2}$ is solvable.
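A hedged numerical sketch of this solvability test (the helper below is ad hoc, not a library routine: it rewrites the Sylvester equation in vec/Kronecker form and checks whether a least-squares solution attains zero residual):

```python
import numpy as np

def roth_solvable(A11, A22, A12, tol=1e-8):
    """Numerically test whether A11 X - X A22 = A12 has a solution, via the
    vec form  (I kron A11 - A22^T kron I) vec(X) = vec(A12)."""
    p, q = A11.shape[0], A22.shape[0]
    K = np.kron(np.eye(q), A11) - np.kron(A22.T, np.eye(p))
    rhs = A12.reshape(-1, order="F")              # vec() stacks columns
    x, *_ = np.linalg.lstsq(K, rhs, rcond=None)
    residual = np.linalg.norm(K @ x - rhs)
    return residual <= tol * max(1.0, np.linalg.norm(rhs))
```

When $A_{1,1}$ and $A_{2,2}$ have no eigenvalue in common the equation always has a unique solution (and `scipy.linalg.solve_sylvester(A11, -A22, A12)` computes it directly); the least-squares check above also covers the shared-eigenvalue case, where solvability must be verified rather than assumed. For large Laplacian blocks the Kronecker system grows as $pq \times pq$, so this is only practical for moderate sizes.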

user1551
  • Roth's removal rule is an if and only if? – Exodd May 24 '24 at 19:37
  • 1
    @Exodd Yes. Roth’s removal rule says that two matrices $\pmatrix{A&C\ 0&B}$ and $\pmatrix{A&0\ 0&B}$ (where $A$ and $B$ are square submatrices) over a field $F$ to be similar if and only if the matrix equation $AX-XB=C$ is solvable for $X$ over $F$. E.g. $\pmatrix{1&1\ 0&1}$ is not similar to $\pmatrix{1&0\ 0&1}$ because $(1)(x)-(x)(1)=1$ is not solvable. – user1551 May 25 '24 at 00:07