
Consider two (Hermitian) matrices $A$ and $B$. Is there a nice expression for the following?

$$ \boxed{ \frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0} = \; ? }$$

Of course, if $A$ and $B$ commute, this is simply $B \exp{(A)}$.

One thing I tried was the Suzuki-Trotter formula: \begin{align} \boxed{\frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0}} &= \frac{\mathrm d}{\mathrm d x} \left. \left( \lim_{N \to \infty} \left[ \exp\left( \frac{A}{N} \right) \exp \left( x \frac{B}{N} \right) \right]^N \right) \right|_{x=0} \\ &= \lim_{N\to \infty} \sum_{n=1}^N \exp\left( \frac{n}{N} A \right) \frac{B}{N} \exp\left( \frac{N-n}{N} A \right) \\ &= \left( \lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^N e^{\frac{n}{N}A }B\; e^{-\frac{n}{N}A } \right) e^A \\ &= \boxed{ \int_0^1 e^{t A} B \;e^{(1-t)A} \; \mathrm d t } \; . \end{align} Is this as close as it gets to a closed form?
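As a sanity check, here is a minimal numerical sketch of the boxed integral formula (using NumPy and SciPy's `expm`; a midpoint rule stands in for the exact $t$-integral, and the derivative is approximated by a central finite difference):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4
H = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (H + H.conj().T) / 2                    # random Hermitian A
K = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (K + K.conj().T) / 2                    # random Hermitian B

# central finite difference for d/dx exp(A + xB) at x = 0
h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

# midpoint-rule approximation of \int_0^1 e^{tA} B e^{(1-t)A} dt
N = 1000
ts = (np.arange(N) + 0.5) / N
quad = sum(expm(t * A) @ B @ expm((1 - t) * A) for t in ts) / N

print(np.max(np.abs(fd - quad)))            # small (~1e-5, dominated by the quadrature error)
```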

One thing we can do is go to the eigenbasis of $A$, where the integration over $t$ can be performed explicitly. If we index the eigenvectors of $A$ by $i$, with corresponding eigenvalues $\lambda_i$, then we can express the answer in this basis: \begin{equation} \boxed{ \left( \frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0} \right)_{ij} = \frac{e^{\lambda_i}-e^{\lambda_j}}{\lambda_i-\lambda_j} B_{ij}} \;, \end{equation} where $(\cdot)_{ij}$ are the entries of a matrix in the eigenbasis of $A$. (Note that if $\lambda_i = \lambda_j$, we replace $\frac{e^{\lambda_i}-e^{\lambda_j}}{\lambda_i-\lambda_j} \to e^{\lambda_i}$, which is consistent with l'Hôpital's rule.)
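For concreteness, a minimal implementation sketch of this entrywise formula (assuming NumPy/SciPy; `eigh` provides the eigenbasis of the Hermitian $A$, and the $\lambda_i = \lambda_j$ entries fall back to $e^{\lambda_i}$):

```python
import numpy as np
from scipy.linalg import expm

def dexp(A, B):
    """d/dx exp(A + xB) at x = 0, for Hermitian A, via the eigenbasis formula."""
    lam, U = np.linalg.eigh(A)                        # A = U diag(lam) U^dagger
    den = lam[:, None] - lam[None, :]                 # lambda_i - lambda_j
    num = np.exp(lam)[:, None] - np.exp(lam)[None, :]
    same = np.isclose(den, 0.0)
    phi = np.where(same, np.exp(lam)[:, None],        # e^{lambda_i} when lambda_i = lambda_j
                   num / np.where(same, 1.0, den))
    # entrywise (Hadamard) product in the eigenbasis, then back to the original basis
    return U @ (phi * (U.conj().T @ B @ U)) @ U.conj().T

# sanity check against a finite difference
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4)); A = (H + H.T) / 2
K = rng.standard_normal((4, 4)); B = (K + K.T) / 2
h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)
print(np.max(np.abs(fd - dexp(A, B))))                # ~1e-9
```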

  • While I don't currently have anything to show for it, I've been playing around with using the fact that $\frac{d}{dx}(A+xB)^n=\sum_{i+j=n-1}A^iBA^j$. I feel like there is something useful lurking there, if only one can free it. – Aaron Jan 30 '20 at 04:57
  • In general, $\frac{d}{dt}\big|_{t=0}f(A+tB)=\tilde f(L_A,R_A)B$, where $\tilde f(\lambda,\mu)=\frac{f(\lambda)-f(\mu)}{\lambda-\mu}$ for $\lambda\neq \mu$ and $\tilde f(\lambda,\lambda)=f'(\lambda)$. So the question of an "explicit representation" boils down to finding representations of $\tilde f$ in the form $\tilde f(\lambda,\mu)=\sum_k \phi_k(\lambda)\psi_k(\mu)$, where the sum may be infinite (or even an integral). – MaoWao Jan 30 '20 at 10:53
  • In this special case, I don't think you will get anything substantially nicer than $\frac{e^{\lambda}-e^{\mu}}{\lambda-\mu}=\int_0^1 e^{t\lambda}e^{(1-t)\mu}\,dt$, which yields the formula from the OP. – MaoWao Jan 30 '20 at 10:55
  • One more remark: The matrix $\tilde f(L_A,R_A)B$ is exactly the Hadamard product (entrywise product) of $B$ and the matrix with entries $\tilde f(\lambda_i,\lambda_j)$, where $\lambda_i$ are the eigenvalues of $A$, counted with multiplicity. – MaoWao Jan 30 '20 at 11:02
  • The last expression that you derived is basically the Daleckii-Krein theorem. – greg Jul 05 '23 at 18:30

2 Answers


Given the $\lambda$-parameterized matrix definitions $$\eqalign{ C &= C(\lambda) &= A+\lambda B \\ E &= E(\lambda) &= \exp(C) \\ E'&= E'(\lambda)&= \frac{dE}{d\lambda} \\ }$$ an effective method to calculate the value of $\,E'(0)$ is the block-triangular method: $$\eqalign{ &F = \exp\Bigg(\begin{bmatrix}A&B\\0&A\end{bmatrix}\Bigg) = \begin{bmatrix}E(0)&E'(0) \\0&E(0)\end{bmatrix} \\ &E'(0) = \begin{bmatrix}I&0\end{bmatrix}\,F\,\begin{bmatrix}0\\I\end{bmatrix} \\ }$$ This method is quite general and works for any analytic function, not just the exponential.
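For concreteness, a minimal sketch of this trick in NumPy/SciPy (a check rather than a reference implementation; note that nothing here requires $A$ or $B$ to be Hermitian):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# F = exp([[A, B], [0, A]]); its top-right block is E'(0)
F = expm(np.block([[A, B], [np.zeros((n, n)), A]]))
dE = F[:n, n:]

# compare against a central finite difference of exp(A + xB) at x = 0
h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)
print(np.max(np.abs(dE - fd)))   # ~1e-9
```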

greg

Not an answer, but here is an alternative, algebraic, elementary approach to the final formula you have. It rests on the observation that if $T:V\to W$ is a linear transformation, $(v_{j})$ a basis for $V$, $(w_{i})$ a basis for $W$, and $(w^*_{i})$ the corresponding dual basis, then the matrix $[T]$ of $T$ with respect to these bases has entries $[T]_{ij}=w_i^*Tv_j$.

Using the product rule for matrix-valued functions, $\frac{d}{dx}(M(x)N(x))=M'(x)N(x)+M(x)N'(x)$, together with induction, we get the formula

$$\frac{d}{dx}(M(x)^n)=\sum_{\substack{0\leq i,j \\i+j=n-1}}M(x)^iM'(x)M(x)^j.$$

Therefore $$\left.\frac{d}{dx}(A+Bx)^n\right\rvert_{x=0}=\sum_{i+j=n-1}A^iBA^j,$$ and so $$C:=\left.\frac{d}{dx}e^{A+Bx}\right\rvert_{x=0}=\sum_n\sum_{i+j=n-1}\frac{A^iBA^j}{n!}.$$
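A quick truncated-series check of this expression for $C$ (a sketch; twenty terms is plenty for small matrices):

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# C = sum_{n >= 1} sum_{i+j=n-1} A^i B A^j / n!, truncated at n = 20
C = np.zeros((3, 3))
for m in range(1, 21):
    for i in range(m):                     # i + j = m - 1
        C += (np.linalg.matrix_power(A, i) @ B
              @ np.linalg.matrix_power(A, m - 1 - i)) / factorial(m)

h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)
print(np.max(np.abs(C - fd)))              # agrees to finite-difference accuracy
```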

Let $u,v$ be left and right eigenvectors of $A$, so that $uA=\mu u$ and $Av=\lambda v$. (The distinction between left and right eigenvectors isn't necessary when working with symmetric matrices over $\mathbb R$, but I want to leave open the possibility of working with not necessarily symmetric matrices.) Then

$$uCv=(uBv)\sum_n\sum_{i+j=n-1}\frac{\mu^i\lambda^j}{n!}.$$

By the identity $a^n-b^n=(a-b)\displaystyle \sum_{i+j=n-1}a^ib^j$, we have

$$(\mu-\lambda)uCv=(uBv)\sum_n\frac{\mu^n-\lambda^n}{n!}=(uBv)(e^{\mu}-e^{\lambda}).$$ For $\mu\neq\lambda$, dividing by $\mu-\lambda$ recovers the entrywise formula from the question.
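This relation is easy to confirm numerically, even for non-symmetric $A$ (a sketch; SciPy's `eig` returns left eigenvectors as columns of `VL`, whose conjugates satisfy $uA = \mu u$):

```python
import numpy as np
from scipy.linalg import expm, eig

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))            # not symmetric
B = rng.standard_normal((4, 4))

# C = d/dx exp(A + xB) at x = 0, via a central finite difference
h = 1e-6
C = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

w, VL, VR = eig(A, left=True, right=True)
u = VL[:, 0].conj()                        # left eigenvector:  u A = mu u
v = VR[:, 1]                               # right eigenvector: A v = lam v
mu, lam = w[0], w[1]

lhs = (mu - lam) * (u @ C @ v)
rhs = (u @ B @ v) * (np.exp(mu) - np.exp(lam))
print(abs(lhs - rhs))                      # ~1e-8
```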


Actually, here is an extension of this idea which sort of gives a formula. Given $A\in \operatorname{End}(V)$, define $L_A,R_A\in\operatorname{End}(\operatorname{End}(V))$ by $L_A(B)=AB$ and $R_A(B)=BA$. Then $L_A$ and $R_A$ commute. We can then write

$$C=\sum_n\sum_{i+j=n-1}\frac{A^iBA^j}{n!}=\left(\sum_n\sum_{i+j=n-1}\frac{L_A^iR_A^j}{n!}\right)B.$$

If we apply $\operatorname{ad}_A=L_A-R_A$ to both sides, the same algebra as above yields $$[A,C]=e^{L_A}(B)-e^{R_A}(B)=e^AB-Be^A=[e^A,B].$$
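The commutator identity is likewise easy to check numerically (a sketch; $C$ again obtained by finite differences):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

h = 1e-6
C = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

lhs = A @ C - C @ A                        # [A, C]
rhs = expm(A) @ B - B @ expm(A)            # [e^A, B]
print(np.max(np.abs(lhs - rhs)))           # ~1e-9
```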

This formula only determines $C$ up to a map commuting with $A$, but there may be another way to make use of this to find a formula for $C$ itself.

Aaron