8

Let $B$ be an $n \times n$ matrix whose eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ are pairwise distinct. For every $n \times n$ matrix $A$ define $F(A) = AB + BA$. We can consider $F$ as a linear operator, because $F(\alpha X + \beta Y) = \alpha F(X) + \beta F(Y)$.

What eigenvalues does $F$ have?

Any help would be appreciated.

Swistack
  • 802
  • I've tried to find eigenvectors by solving the equation $F(X) = XB + BX = \lambda X \Rightarrow XB = (\lambda I - B) X$, however, it didn't help much. Also I've noticed that $B$ is a diagonalizable matrix since it has pairwise distinct eigenvalues, but how can I use that? Finally, I composed the $n^2 \times n^2$ matrix corresponding to this linear operator; it was useless as well. – Swistack Jul 01 '15 at 11:00
  • Your issue is linked to "Kronecker sum" of matrices and "Sylvester equation". See the reference given in the answer to (https://mathoverflow.net/questions/219471/additive-version-of-kronecker-product) – Jean Marie Jan 11 '18 at 06:58

3 Answers

6

Let $e_i$ be the eigenvectors of $B$, i.e. $$Be_i=\lambda_ie_i$$ and let $E_{ij}=e_i e_j^T$ be the elementary matrices in this basis, i.e. $$E_{ij}e_k=\delta_{jk}e_i.$$ As it turns out, $F$ is already diagonal in the $E_{ij}$-basis: \begin{align} &BE_{ij}e_k=\delta_{jk}Be_i=\lambda_i\delta_{jk}e_i=\lambda_iE_{ij}e_k\\ \Rightarrow&F(E_{ij})e_k=BE_{ij}e_k+E_{ij}Be_k=(\lambda_i+\lambda_j)E_{ij}e_k \end{align} The eigenvalues are therefore $\lambda_i+\lambda_j$. (Some of them may coincide; for example, swapping $i\leftrightarrow j$ yields the same eigenvalue.)
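A quick numerical sanity check of this answer (a sketch using NumPy; the size `n` and the random `B` are arbitrary choices, not from the original post): representing $F$ on $\operatorname{vec}(X)$ via Kronecker products, every eigenvalue of $F$ turns out to be a sum $\lambda_i+\lambda_j$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))  # random B: eigenvalues almost surely pairwise distinct

# With column-stacking vec, vec(BX + XB) = (I ⊗ B + Bᵀ ⊗ I) vec(X),
# so this n² x n² matrix represents the operator F.
F = np.kron(np.eye(n), B) + np.kron(B.T, np.eye(n))

lam = np.linalg.eigvals(B)
pair_sums = lam[:, None] + lam[None, :]       # all sums λ_i + λ_j

# Every eigenvalue of F is (numerically) one of the pairwise sums.
for mu in np.linalg.eigvals(F):
    assert np.min(np.abs(mu - pair_sums)) < 1e-6
```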

user251257
  • 9,417
himbrom
  • 61
6

In this solution, we only assume that $B$ is diagonalizable (i.e., the eigenvalues $\lambda_i$ need not be distinct). Let $v_1,v_2,\ldots,v_n$ be the (right) eigenvectors of $B$ and $w_1,w_2,\ldots,w_n$ the left eigenvectors of $B$, where $Bv_i=\lambda_iv_i$ and $w_i^\top B=\lambda_i w_i^\top$ for $i=1,2,\ldots,n$. Then, $$ \begin{align} F\left(v_i w_j^\top\right)&= \left(v_iw_j^\top\right)B+B\left(v_i w_j^\top\right)=v_i\left(w_j^\top B\right)+\left(Bv_i\right)w_j^\top \\ &=v_i\left(\lambda_j w_j^\top\right)+\left(\lambda_iv_i\right)w_j^\top=\left(\lambda_i+\lambda_j\right)v_iw_j^\top\,. \end{align}$$ Since the $n^2$ matrices $v_iw_j^\top$, where $i,j=1,2,\ldots,n$, are linearly independent, we have found all eigenvectors of $F$.

EDIT (Due to Request):

We shall prove that the matrices $v_iw_j^\top$, for $i,j=1,2,\ldots,n$, are linearly independent. Let $K$ be the base field. Suppose that there exist $\kappa_{i,j}\in K$ for $i,j=1,2,\ldots,n$ such that $\sum_{i=1}^n\sum_{j=1}^n\kappa_{i,j}v_iw_j^\top=\boldsymbol{0}_{n\times n}$. Write $w_j=\left(w_j^1,w_j^2,\ldots,w_j^n\right)$ for $j=1,2,\ldots,n$. Hence, $v_iw_j^\top=\begin{bmatrix} w_j^1v_i&w_j^2v_i&\cdots&w_j^nv_i\end{bmatrix}$. Therefore, $\sum_{i=1}^n\sum_{j=1}^n\kappa_{i,j}v_iw_j^\top=\boldsymbol{0}_{n\times n}$ implies that $$\begin{bmatrix}\displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^1\right)v_i & \displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^2\right)v_i & \cdots & \displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^n\right)v_i \end{bmatrix}=\boldsymbol{0}_{n\times n}\,.$$ Consequently, for $i=1,2,\ldots,n$ and $k=1,2,\ldots,n$, we must have $\sum_{j=1}^n\kappa_{i,j}w_j^k=0$, since the $v_i$'s are linearly independent. That is, $\sum_{j=1}^n\kappa_{i,j}w_j=\boldsymbol{0}_{n\times 1}$ for $i=1,2,\ldots,n$. As the vectors $w_j$ are linearly independent, $\kappa_{i,j}=0$ for all $i,j=1,2,\ldots,n$, and the result follows immediately.
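Both the eigenvector property and the linear independence can be checked numerically (a NumPy sketch; the random `B` and the indices `i, j` are arbitrary choices). The right eigenvectors are taken as the columns of $V$ from `np.linalg.eig`, and the left eigenvectors as the rows of $V^{-1}$, since $V^{-1}B = DV^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n))

lam, V = np.linalg.eig(B)     # right eigenvectors: columns of V
W = np.linalg.inv(V)          # left eigenvectors: rows of V^{-1}

# Each v_i w_j^T is an eigenvector of F: F(E) = (λ_i + λ_j) E.
i, j = 0, 2
E = np.outer(V[:, i], W[j])
assert np.allclose(E @ B + B @ E, (lam[i] + lam[j]) * E)

# Stacking all n² matrices v_i w_j^T as rows gives a full-rank
# n² x n² matrix, i.e. they are linearly independent.
M = np.array([np.outer(V[:, a], W[b]).ravel()
              for a in range(n) for b in range(n)])
assert np.linalg.matrix_rank(M) == n * n
```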

P.S.:

(1) I think my solution is identical to himbrom's.

(2) This solution works similarly if $F$ is defined via $F(A)=\alpha AB+\beta BA$ for every matrix $A\in\text{Mat}_{n\times n}(K)$, where $\alpha,\beta \in K$ are nonzero. For $i,j=1,2,\ldots,n$, the matrix $v_iw_j^\top$ is still an eigenvector of $F$, but with the eigenvalue $\alpha \lambda_j+\beta \lambda_i$.
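The generalized claim in (2) can be verified the same way (a sketch; the values of $\alpha$, $\beta$, and the random `B` are arbitrary choices): each $v_iw_j^\top$ satisfies $F(v_iw_j^\top)=(\alpha\lambda_j+\beta\lambda_i)\,v_iw_j^\top$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
alpha, beta = 2.0, -3.0
B = rng.standard_normal((n, n))

lam, V = np.linalg.eig(B)     # right eigenvectors: columns of V
W = np.linalg.inv(V)          # left eigenvectors: rows of V^{-1}

# v_i w_j^T is an eigenvector of F(A) = α A B + β B A
# with eigenvalue α λ_j + β λ_i.
for i in range(n):
    for j in range(n):
        E = np.outer(V[:, i], W[j])
        assert np.allclose(alpha * E @ B + beta * B @ E,
                           (alpha * lam[j] + beta * lam[i]) * E)
```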

(3) It would be an interesting problem to see whether the converse holds. Suppose, for fixed $\alpha,\beta \in K \setminus\{0\}$ and a fixed matrix $B\in\text{Mat}_{n\times n}(K)$, that $F(A)=\alpha AB+\beta BA$ for every matrix $A\in\text{Mat}_{n\times n}(K)$. If $F$ is a diagonalizable linear operator, does it follow that $B$ is a diagonalizable matrix? Does the answer depend on $K$, $\alpha$, and/or $\beta$? For example, in the case where $K$ is algebraically closed of characteristic $0$, $\alpha=1$, and $\beta=-1$, diagonalizability of $F$ is equivalent to that of $B$ (a well-known result in Lie algebra theory). If exactly one of $\alpha$ and $\beta$ is zero, then $F$ is diagonalizable if and only if $B$ is.

Batominovski
  • 50,341
  • Thanks! Could you explain please, why are all $n^2$ matricies $v_iw_j^T$ linearly independent? – Swistack Jul 01 '15 at 12:29
  • a somewhat easier way to see the linear independence: write $B=VDV^{-1}$; then right eigenvectors are (chosen to be) columns of $V$ and left eigenvectors are (chosen to be) rows of $V^{-1}$, since $V^{-1}B=DV^{-1}$. Then $T:M_n(\mathbb F)\rightarrow M_n(\mathbb F)$ given by $T(X)=V^{-1}XV$ satisfies $T\big(\big\{v_i w_j^T\big\}\big)=\big\{e_i e_j^T\big\}$, a complete set of standard basis vectors for $M_n(\mathbb F)$. – user8675309 Jan 05 '25 at 18:31
1

Consider a more general case. Let $P\in\mathbb F^{r\times r}$ and $Q\in\mathbb F^{s\times s}$ be two matrices with full spectra in $\mathbb F$. By the Jordan–Chevalley decomposition, we may write $P=D_1+N_1$ and $Q=D_2+N_2$ where $D_1,D_2$ are diagonalisable, $N_1,N_2$ are nilpotent, $D_1$ commutes with $N_1$, and $D_2$ commutes with $N_2$. It follows that the four linear maps $$ d_1(X)=D_1X,\quad n_1(X)=N_1X,\quad d_2(X)=XD_2,\quad n_2(X)=XN_2 $$ commute. In turn, $d:=d_1+d_2$ commutes with $n:=n_1+n_2$.

Note that $d(X)=D_1X+XD_2$ is diagonalisable. Since $D_1$ and $D_2$ are diagonalisable, $D_1$ has a right-eigenbasis $\{u_1,\ldots,u_r\}$ such that $D_1u_i=\lambda_iu_i$ for each $i$, and $D_2$ has a left-eigenbasis $\{v_1,\ldots,v_s\}$ such that $v_j^TD_2=\mu_j v_j^T$ for each $j$. The rank-one matrices $u_iv_j^T$ thus form an eigenbasis of $d$, with $$ d(u_iv_j^T)=(\lambda_i+\mu_j)u_iv_j^T. $$

The linear map $n$ is nilpotent: since $N_1$ and $N_2$ are nilpotent, so are $n_1$ and $n_2$. As $n_1$ and $n_2$ commute, $n=n_1+n_2$ is nilpotent.

It follows that $d+n$ is a Jordan–Chevalley decomposition of $f(X)=PX+XQ$. The eigenvalues of $f$ are therefore those of $d$, i.e. $\lambda_i+\mu_j$ for $(i,j)\in\{1,2,\ldots,r\}\times\{1,2,\ldots,s\}$.
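A numerical illustration of this general statement (a sketch; the matrices `P` and `Q` below are hypothetical examples chosen so that `P` is not diagonalizable and `Q` has a repeated eigenvalue): the eigenvalues of $f(X)=PX+XQ$ are still exactly the sums $\lambda_i+\mu_j$, even without diagonalizability.

```python
import numpy as np

# P is a single 2x2 Jordan block (not diagonalizable), eigenvalue 2;
# Q is upper triangular with eigenvalues 5, 5, 3.
P = np.array([[2.0, 1.0],
              [0.0, 2.0]])
Q = np.array([[5.0, 1.0, 0.0],
              [0.0, 5.0, 1.0],
              [0.0, 0.0, 3.0]])
r, s = P.shape[0], Q.shape[0]

# With column-stacking vec, vec(PX + XQ) = (I_s ⊗ P + Qᵀ ⊗ I_r) vec(X),
# so this rs x rs matrix represents f.
f = np.kron(np.eye(s), P) + np.kron(Q.T, np.eye(r))

eig_f = np.sort(np.linalg.eigvals(f).real)
expected = np.sort([li + mj for li in (2.0, 2.0) for mj in (5.0, 5.0, 3.0)])
assert np.allclose(eig_f, expected)
```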

user1551
  • 149,263