2

Let $A,B$ be $N \times N$ matrices. Define $ e^A:= I+A+\frac{A^2}{2!}+\cdots = \lim_{n \to\infty} \sum_{k=0}^n\frac{A^k}{k!}.$

Prove, using limits, that if $AB = BA$ then $ e^{A+B} = e^Ae^B.$ I have seen several answers, but they are too quick and I couldn't understand them.

My first try was to prove $ \sum_{k=0}^{2n}\frac{(A+B)^k}{k!} -\sum_{k=0}^n\frac{A^k}{k!} \sum_{k=0}^n\frac{B^k}{k!} \to 0,$ and then push $n$ to infinity to obtain the promised result.
However, I got stuck bounding the coefficient of $A^{p}B^{q}$, so I'm not sure whether this is a good approach.

Edit: In this post, I'm looking for a simple proof using limiting arguments, not ODE proofs or anything like that.
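Out of curiosity, I also checked the identity numerically. This is of course not a proof, and the matrices below are just an arbitrary commuting pair I made up, but truncating the defining series far enough, $e^{A+B}$ and $e^Ae^B$ agree to floating-point accuracy:

```python
# Numerical sanity check (not a proof): compare truncated series for
# exp(A+B) and exp(A)exp(B) on a commuting pair of 2x2 matrices.
# The helper names and the matrix M below are ad hoc choices for illustration.

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def expm(X, terms=40):
    # truncated series I + X + X^2/2! + ... + X^terms/terms!
    S = [[1.0, 0.0], [0.0, 1.0]]
    P = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for k in range(1, terms + 1):
        P = mat_mul(P, X)
        fact *= k
        S = mat_add(S, [[P[i][j] / fact for j in range(2)] for i in range(2)])
    return S

M = [[0.3, 0.7], [0.2, 0.5]]
A = M
B = mat_add(mat_mul(M, M), M)   # B = M^2 + M is a polynomial in M, so AB = BA

lhs = expm(mat_add(A, B))
rhs = mat_mul(expm(A), expm(B))
err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
print(err)   # down at floating-point noise level
```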

mathnoob
  • 351
  • 3
    Have you seen it proved for real $a,b?$ It is really the same proof, which is why the proofs are brief. – Thomas Andrews Jul 07 '24 at 21:10
  • Welcome to MSE. Do you know anything about the Cauchy product for series of matrices? That could be one way to prove your statement. Otherwise, try to prove that $$\sum_{k=0}^{n}\frac{(A+B)^k}{k!}-\sum_{k=0}^n\frac{A^k}{k!}\sum_{k=0}^{n}\frac{B^k}{k!}\to 0$$ as $n$ goes to $+\infty$. Note that the upper limit of the first sum is $n$, not the $2n$ you wrote. – Kolakoski54 Jul 07 '24 at 21:12
  • 2
    If $A,B$ are diagonalizable, they are simultaneously diagonalizable since they commute. It is easily verified that $e^{A+B}=e^{A}e^{B}$ when $A,B$ are diagonal, and from this it follows that $e^{A+B}=e^A e^B$ holds when they are diagonalizable. The general case follows from a density argument. – Levent Jul 07 '24 at 21:14
  • @ThomasAndrews In fact, I didn't see anything like that... I will try Kolakoski54's hint! – mathnoob Jul 07 '24 at 21:22
  • @Kolakoski54 I got something like $$ \sum_{m=0}^n\sum_{k=n-m+1}^n \frac{A^k}{k!} \frac{B^m}{m!} $$ and don't know if that helps.... – mathnoob Jul 07 '24 at 21:53
  • Using the Cauchy product: $\sum \frac{A^k}{k!}\sum \frac{B^\ell}{\ell !}=\sum C_n$ where $$C_n:=\sum_{k=0}^n \frac{A^k}{k!}\frac{B^{n-k}}{(n-k)!}$$ Maybe you can recognize something that looks like a binomial coefficient... Do you remember any formula for $(A+B)^n$ (provided that $AB=BA$)? Do not forget to justify the fact that you can apply the Cauchy product result (both series must converge, and at least one of them must converge absolutely). – Kolakoski54 Jul 07 '24 at 23:18
  • @Kolakoski54, I don't want to touch the limit at infinity immediately. What I tried in my comment above was to show that the difference of the two tends to $0$: $\sum_{m=0}^n \sum_{k=n-m+1}^n \frac{A^k}{k!} \frac{B^m}{m!} \rightarrow 0$, but I don't know how to bound this term... Ah I see, you are referring to the Cauchy product theorem for a direct application... – mathnoob Jul 07 '24 at 23:43
  • Since you are looking to justify the convergence, you need to specify a distance function, or perhaps you intend the solver to pick a convenient one. – Steen82 Jul 08 '24 at 01:56

2 Answers

7

Here is a fun way to do it.

Lemma 1: $\frac{d}{dt} e^{tX} = X e^{tX}$.

This is a slightly annoying calculation with the definition of the exponential you've chosen (it's cleaner to take the above as the definition) since we need to exchange a derivative and an infinite sum, but not so bad.
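For completeness, here is a sketch of the term-by-term computation the lemma rests on (the interchange of derivative and sum is justified because the differentiated series converges uniformly in $t$ on compact sets):

$$\frac{d}{dt} e^{tX} = \frac{d}{dt} \sum_{k=0}^{\infty} \frac{t^k X^k}{k!} = \sum_{k=1}^{\infty} \frac{t^{k-1} X^k}{(k-1)!} = X \sum_{k=1}^{\infty} \frac{t^{k-1} X^{k-1}}{(k-1)!} = X e^{tX}.$$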

Lemma 2: If two matrices $X, Y$ commute, then $X$ and $e^{tY}$ also commute.

This can be done directly using the power series definition, or indirectly by arguing that $F(t) = Xe^{tY} - e^{tY}X$ has derivative $0$ so must be a constant, and since $F(0) = 0$ it must be identically zero.

Lemma 3: $e^{-tA} = (e^{tA})^{-1}$.

This is done by using Lemma 1 to calculate

$$\begin{align} \frac{d}{dt} e^{tA} e^{-tA} &= A e^{tA} e^{-tA} - e^{tA} A e^{-tA} \\ &= A e^{tA} e^{-tA} - A e^{tA} e^{-tA} \\ &= 0 \end{align}$$

where in the second line we use that $A$ commutes with $e^{tA}$, by Lemma 2. So $e^{tA} e^{-tA}$ must be a constant. Since it is equal to the identity $I$ when $t = 0$, it must be equal to $I$ identically. This gives $e^{tA} e^{-tA} = I$ as desired.

Now we can prove the desired result with another derivative calculation using Lemma 1, namely

$$\begin{align} \frac{d}{dt} e^{t(A+B)} e^{-tB} e^{-tA} &= (A + B) e^{t(A+B)} e^{-tB} e^{-tA} - e^{t(A+B)} B e^{-tB} e^{-tA} - e^{t(A+B)} e^{-tB} A e^{-tA} \\ &= (A + B) e^{t(A+B)} e^{-tB} e^{-tA} - B e^{t(A+B)} e^{-tB} e^{-tA} - A e^{t(A+B)} e^{-tB} e^{-tA} \\ &= 0\end{align}$$

where in the second line we used that $B$ commutes with $e^{t(A+B)}$ and $A$ commutes with $e^{-tB}$ and $e^{t(A+B)}$, again by Lemma 2. It follows as before that $e^{t(A+B)} e^{-tB} e^{-tA}$ is equal to $I$ when $t = 0$ so must be equal to $I$ identically. This gives

$$e^{t(A+B)} e^{-tB} e^{-tA} = I$$

and applying Lemma 3 twice gives $e^{t(A+B)} = e^{tA} e^{tB}$ as desired.

It is completely unnecessary to mess around with power series in this approach, or rather you only need to mess around with power series long enough to establish the key facts about $e^{tX}$ in terms of derivatives, and then the rest of the argument is just some nice derivative calculations. These even work in the $1$-dimensional case and can be used to prove the basic properties of the exponential there too, as well as over the complex numbers.
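For what it's worth, here is a quick numerical illustration of the key computation (a sketch, not part of the proof; the matrices and helper names below are arbitrary choices): the product $e^{t(A+B)}e^{-tB}e^{-tA}$ stays numerically equal to $I$ as $t$ varies, for a commuting pair $A, B$.

```python
# Numerical illustration (not a proof): F(t) = exp(t(A+B)) exp(-tB) exp(-tA)
# stays equal to the identity for several t, for a commuting pair A, B.
# The matrices below are arbitrary commuting examples (both of the form aI + bN
# with N the same nilpotent matrix, so they commute).

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def scale(c, X):
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

def expm(X, terms=50):
    # truncated series I + X + X^2/2! + ...
    S = [[1.0, 0.0], [0.0, 1.0]]
    P = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for k in range(1, terms + 1):
        P = mul(P, X)
        fact *= k
        S = [[S[i][j] + P[i][j] / fact for j in range(2)] for i in range(2)]
    return S

A = [[0.4, 1.0], [0.0, 0.4]]
B = [[0.1, 0.5], [0.0, 0.1]]   # same triangular shape as A, so AB = BA
AB = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

worst = 0.0
for t in (0.0, 0.5, 1.0, 2.0):
    F = mul(mul(expm(scale(t, AB)), expm(scale(-t, B))), expm(scale(-t, A)))
    err = max(abs(F[i][j] - (1.0 if i == j else 0.0))
              for i in range(2) for j in range(2))
    worst = max(worst, err)
print(worst)   # floating-point noise level
```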

Qiaochu Yuan
  • 468,795
  • 1
    I guess we also need the product rule for matrix-valued functions which is $\frac{d}{dt} X(t) Y(t) = X'(t) Y(t) + X(t) Y'(t)$, which is used several times above and the proof is basically the same as the ordinary product rule. The only thing to notice is that it's crucial to write the product in this order because of noncommutativity. – Qiaochu Yuan Jul 07 '24 at 21:35
  • 2
    Please avoid answering duplicates. See: https://math.stackexchange.com/questions/1558667/a-proof-of-eab-eaeb-using-odes?noredirect=1 – Kavi Rama Murthy Jul 07 '24 at 23:17
4

We begin with a few observations:

$${\left( A + B \right)}^{2} = {A}^{2} + A B + B A + {B}^{2},$$

$$\begin{align} {\left( A + B \right)}^{3} = {A}^{3} & + {A}^{2} B + A B A + B {A}^{2} \\ & + {B}^{2} A + B A B + A {B}^{2} + {B}^{3}. \end{align}$$

$$\begin{align} {\left( A + B \right)}^{4} = {A}^{4} & + {A}^{3} B + {A}^{2} B A + A B {A}^{2} + B {A}^{3} \\ & + {A}^{2} {B}^{2} + A B A B + B {A}^{2} B \\ & + A {B}^{2} A + B A B A + {B}^{2} {A}^{2} \\ & + {B}^{3} A + {B}^{2} A B + B A {B}^{2} + A {B}^{3} + {B}^{4}. \end{align}$$

If $A B = B A$, then

$${\left( A + B \right)}^{2} = {A}^{2} + 2 A B + {B}^{2},$$

$${\left( A + B \right)}^{3} = {A}^{3} + 3 {A}^{2} B + 3 A {B}^{2} + {B}^{3},$$

$${\left( A + B \right)}^{4} = {A}^{4} + 4 {A}^{3} B + 6 {A}^{2} {B}^{2} + 4 A {B}^{3} + {B}^{4}.$$

It seems that, whenever $AB = BA$, the scalar binomial theorem carries over; that is,

$$\begin{align} {\left( A + B \right)}^{n} & = \sum_{k = 0}^{n} \binom {n}{k} {A}^{n - k} {B}^{k} \\ & = {A}^{n} + n {A}^{n - 1} {B} + \frac {n \left( n - 1 \right)}{2} {A}^{n - 2} {B}^{2} + \frac {n \left( n - 1 \right) \left( n - 2 \right)}{6} {A}^{n - 3} {B}^{3} + \cdots + {B}^{n}. \end{align}$$
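To justify the pattern (a step the observations above only suggest), one can induct on $n$: assuming the formula for $n$ and using $AB = BA$ to move each $B$ past the powers of $A$,

$$(A+B)^{n+1} = (A+B) \sum_{k=0}^{n} \binom{n}{k} A^{n-k} B^{k} = \sum_{k=0}^{n} \binom{n}{k} A^{n+1-k} B^{k} + \sum_{k=1}^{n+1} \binom{n}{k-1} A^{n+1-k} B^{k} = \sum_{k=0}^{n+1} \binom{n+1}{k} A^{n+1-k} B^{k},$$

where the last equality is Pascal's rule $\binom{n}{k} + \binom{n}{k-1} = \binom{n+1}{k}$.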

With this in mind, we return to the question. We begin by showing that

$$\exp \left( A \right) \exp \left( B \right) = \bigg( \sum_{i \ge 0} \frac {{A}^{i}}{i!} \bigg) \bigg( \sum_{j \ge 0} \frac {{B}^{j}}{j!} \bigg) = \sum_{i \ge 0} \sum_{j \ge 0} \frac {{A}^{i} {B}^{j}}{i! \, j!} = \sum_{i \ge 0} \sum_{j \ge 0} \frac {1}{\left( i + j \right)!} \binom {i + j}{j} {A}^{i} {B}^{j}.$$

Next, we perform the key substitution: let $i + j = k$, so $$\sum_{i \ge 0} \sum_{j \ge 0} \frac {1}{\left( i + j \right)!} \binom {i + j}{j} {A}^{i} {B}^{j} = \sum_{k \ge 0} \frac {1}{k!} \sum_{j = 0}^{k} \binom {k}{j} {A}^{k - j} {B}^{j} = \sum_{k \ge 0} \frac {{\left( A + B \right)}^{k}}{k!} = \exp \left( A + B \right),$$

as claimed. $\blacksquare$
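For readers who, like the OP, want to see the limiting behaviour concretely, here is a small numerical companion (a sketch, not part of the argument; the matrices and helper names are arbitrary choices): the degree-$n$ partial sum of $\exp(A+B)$ and the product of the degree-$n$ partial sums of $\exp(A)$ and $\exp(B)$ drift together as $n$ grows.

```python
# Numerical companion (not part of the proof): the difference between the
# degree-n partial sum of exp(A+B) and the product of the degree-n partial
# sums of exp(A) and exp(B) shrinks as n grows. A and B below are arbitrary
# commuting 2x2 matrices (both of the form aI + bN with N nilpotent).

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def partial_exp(X, n):
    # sum_{k=0}^{n} X^k / k!
    S = [[1.0, 0.0], [0.0, 1.0]]
    P = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for k in range(1, n + 1):
        P = mul(P, X)
        fact *= k
        S = [[S[i][j] + P[i][j] / fact for j in range(2)] for i in range(2)]
    return S

A = [[0.3, 1.0], [0.0, 0.3]]
B = [[0.2, 0.4], [0.0, 0.2]]   # same triangular shape as A, so AB = BA
AB = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

errs = []
for n in (4, 8, 12, 20):
    lhs = partial_exp(AB, n)
    rhs = mul(partial_exp(A, n), partial_exp(B, n))
    errs.append(max(abs(lhs[i][j] - rhs[i][j])
                    for i in range(2) for j in range(2)))
print(errs)   # shrinking toward 0
```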

Simon
  • 1,486
  • 2
    Strictly speaking there is a small convergence argument that needs to be made here. Everything converges absolutely so it's not much of a problem but it does need to be mentioned. – Qiaochu Yuan Jul 07 '24 at 23:44
  • @Simon I have seen this argument, but what I really care about is all the justifications involving the limits... – mathnoob Jul 07 '24 at 23:51