
I need to prove that the only linear transformations $f\colon M_{n\times n}\rightarrow M_{n\times n}$ that solve the functional equation $$f(XY)=f(X)Y+Xf(Y)$$ in $\mathcal{L}(M_{n\times n})$ are the commutators. That is, I have to show that for every linear solution $f$ there exists a matrix $A$ such that $f(X)=[A,X]$ for all $X$.

Bearing in mind that every commutator satisfies the equation, I have tried to solve this by studying the sets $V_A=\{X\in M_{n \times n}\colon f(X)=[A,X]\}$, where $A\in \ker f$; these turn out to be subalgebras that are stable under $f$ and closed under taking inverses. Specifically, I have been looking for a maximality argument that would let me conclude that one of these $V_A$ must be all of $M_{n\times n}$. So far, I have not been able to exploit the finite dimensionality of $M_{n\times n}$.

This problem appears as an exercise at the end of an introductory chapter on matrices and finite-dimensional vector spaces in Katsumi Nomizu's Fundamentals of Linear Algebra, so little beyond the rank-nullity theorem is assumed.
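For reference, the one-line check that every commutator is indeed a solution is the expansion
$$[A,XY]=AXY-XYA=(AX-XA)Y+X(AY-YA)=[A,X]Y+X[A,Y].$$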

  • So $f$ is a derivation of the ring $M_n(K)$. Because the ring is simple, $f$ is an inner derivation, i.e., of the form $f(X)=[A,X]$. – Dietrich Burde Jun 23 '22 at 19:08
  • @DietrichBurde This is clearly a very useful link, but given the context in the question (specifically "little else beyond the rank-nullity theorem is assumed"), a more elementary solution would be appropriate. I don't think this is a duplicate of the linked question. – Theo Bendit Jun 23 '22 at 22:55
  • @TheoBendit I am not sure that such an elementary solution, using only a little more than rank-nullity, would be so helpful. It could be very long. I think the natural (and standard) solution is to use either Lie algebra theory or ring theory. In this sense, this is a duplicate for me. But I admit, one may have a different opinion here (or come up with a short and very elementary solution avoiding Lie algebras or simple rings). – Dietrich Burde Jun 24 '22 at 07:53

1 Answer


Here is an elementary proof. Denote the underlying field by $\mathbb F$. Pick two vectors $u,v\in\mathbb F^n$ such that $u^Tv=I_1$, the $1\times1$ identity matrix. Since $f$ is linear, so are the mappings $x\mapsto f(xu^T)v\in\mathbb F^n$ and $y\mapsto u^Tf(vy^T)\in\mathbb F^{1\times n}$. Hence there exist two matrices $A$ and $B$ such that $f(xu^T)v=Ax$ and $u^Tf(vy^T)=y^TB$ for all $x,y\in\mathbb F^n$. It follows that
$$\begin{aligned} f(xy^T) &=f\left(x(u^Tv)y^T\right)\\ &=f\left((xu^T)(vy^T)\right)\\ &=f(xu^T)\,vy^T+xu^T\,f(vy^T)\\ &=Axy^T+xy^TB. \end{aligned}$$
Since $M_n(\mathbb F)$ is spanned by the set of all rank-one matrices and $f$ is linear, the above gives $f(X)=AX+XB$ for all matrices $X$. However, as $f(I)=f(I^2)=f(I)I+If(I)=2f(I)$, we must have $f(I)=0$. Applying the formula to $X=I$ then gives $A+B=f(I)=0$, so $B=-A$ and $f(X)=AX+XB=AX-XA=[A,X]$.
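Not part of the proof, but the recovery of $A$ is easy to sanity-check numerically. The sketch below (assuming NumPy; the test matrix `C`, the dimension, and the choice $u=v=e_1$ are arbitrary illustration choices) starts from a known inner derivation $f(X)=[C,X]$, rebuilds $A$ column by column via $Ae_j=f(e_ju^T)v$, and verifies that $f(X)=[A,X]$, even though the recovered $A$ need only agree with $C$ up to a multiple of $I$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
C = rng.standard_normal((n, n))

def f(X):
    """A known inner derivation, used only as test input: f(X) = [C, X]."""
    return C @ X - X @ C

# u = v = e_1, so that u^T v = 1 (the 1x1 identity).
u = np.zeros((n, 1)); u[0, 0] = 1.0
v = u.copy()

# Recover A column by column: the j-th column of A is A e_j = f(e_j u^T) v.
E = np.eye(n)
A = np.column_stack([(f(E[:, [j]] @ u.T) @ v).ravel() for j in range(n)])

# Check that f(X) = [A, X] on a few random matrices X.
for _ in range(5):
    X = rng.standard_normal((n, n))
    assert np.allclose(f(X), A @ X - X @ A)

# A itself is only determined up to adding a multiple of the identity:
print(np.allclose(A - C, (A - C)[0, 0] * np.eye(n)))  # True
```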

user1551
  • Formally, $u^T v$ is not a scalar, so the equality $f(xy^T)=u^T vf(xy^T)$ is problematic, though not fatal. Anyway, $f(xy^T)=f(x(u^T v)y^T)$ is absolutely transparent. – SEBASTIAN VARGAS LOAIZA Jun 28 '22 at 17:20