
Background

I am solving an assignment on the fundamentals of Quantum Computing which mostly has questions on Linear Algebra. All matrices and vectors are over complex domains.

Question

Define the exponential of a matrix as $$e^{\lambda A} = \sum_{i = 0}^{\infty}\frac{\lambda^i}{i!}A^i$$ where $A^0 = \mathbb{I}$, the identity operator.

  1. Given a matrix $B$, the matrix logarithm of $B$ is defined as the matrix $A$ such that $e^A = B$. Prove that $\log$ is a non-unique function (i.e. for every $B$, there are several $A$ that satisfy the matrix formula given).
  2. Prove that for two matrices, if $AB = BA$, then $\log(AB) = \log(A) + \log(B)$.

My approach

  1. For this part, I am firstly not sure about the existence of an $A$: it isn't obvious to me why such an $A$ should always exist, or how to compute it. Nonetheless, assuming we have a solution $A = \log(B)$, I try to derive a family of solutions from it (this is my intuition): $$B = \mathbb{I} + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \ldots$$ Now take $A' = A + 2\pi i \mathbb{I}$, so that $$ e^{A'} = \mathbb{I} + (A + 2\pi i\mathbb{I}) + \frac{({A + 2\pi i\mathbb{I}})^2}{2!} + \frac{({A + 2\pi i\mathbb{I}})^3}{3!} + \ldots $$ Since $A$ and $\mathbb{I}$ always commute, we can factor the sum as (the proof is just algebraic manipulation and an application of the binomial theorem, so I've skipped it) $$e^{A'} = \left(\mathbb{I} + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \ldots\right)\left(\mathbb{I} + \frac{2\pi i}{1!}\mathbb{I} + \frac{(2\pi i)^2}{2!}\mathbb{I} + \frac{(2 \pi i)^3}{3!} \mathbb{I} + \ldots \right)$$ $$\therefore e^{A'} = e^A \mathbb{I} \left(1 + \frac{2\pi i}{1!}+ \frac{(2\pi i)^2}{2!} + \frac{(2 \pi i)^3}{3!} + \ldots \right)$$ $$\therefore e^{A'} = e^A \mathbb{I}\, e^{2 \pi i} = e^A = B$$ since $e^{2\pi i} = 1$. Similarly, $\exp{(A + 2n \pi i \mathbb{I})} = B$ for every $n \in \mathbb{Z}$, which finishes the proof.
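As a numerical sanity check of this part (not part of the proof itself), a sketch in NumPy: the helper `expm_series` below just sums the defining power series, and we verify that shifting $A$ by $2n\pi i\,\mathbb{I}$ leaves the exponential unchanged.

```python
import numpy as np

def expm_series(M, terms=100):
    """Matrix exponential e^M via its power series, truncated after `terms` terms."""
    out = np.eye(M.shape[0], dtype=complex)
    term = np.eye(M.shape[0], dtype=complex)
    for i in range(1, terms):
        term = term @ M / i      # term is now M^i / i!
        out = out + term
    return out

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = expm_series(A)

# exp(A + 2*pi*i*n*I) equals exp(A) for every integer n:
for n in (1, -1, 2):
    A_shifted = A + 2j * np.pi * n * np.eye(3)
    assert np.allclose(expm_series(A_shifted), B)
```

The truncation at 100 terms is ample here because the factorial in the denominator eventually dominates any fixed matrix norm.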
  2. Assume $e^X = AB$, $e^Y = A$, $e^Z = B$. We want to show that $X = Y + Z$ works. Writing out the expansions of $A$ and $B$, multiplying, and then using the fact that $AB = BA$: $$AB = \mathbb{I} + (Y + Z) + \frac{Y^2 + 2YZ + Z^2}{2!} + \frac{Y^3 + 3Y^2Z + 3YZ^2 + Z^3}{3!} + \ldots$$ $$BA = \mathbb{I} + (Y + Z) + \frac{Y^2 + 2ZY + Z^2}{2!} + \frac{Y^3 + 3ZY^2 + 3Z^2Y + Z^3}{3!} + \ldots$$ Setting $AB - BA = \mathbb{O}$ gives $$(YZ - ZY) + \frac{1}{2!} (Y^2Z - ZY^2) + \frac{1}{2!}(YZ^2 - Z^2Y) + \ldots = 0$$

This is where I am stuck. Obviously, if $Y$ and $Z$ commute, then the above expression equates to $0$, and then the expansions of $AB$ and $BA$ give $AB = BA = e^{Y+Z} = e^X$, so the proof would be complete.

The question is: how do I prove that this is the only case where the summation is $0$? If $Y$ and $Z$ don't commute, is it still possible to proceed with the proof? Are there any alternate approaches?
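For intuition, here is a small NumPy experiment (my own illustration, not from the problem statement): for a commuting pair, $e^Y e^Z = e^{Y+Z}$ holds numerically, while for the standard non-commuting pair of nilpotent $2\times 2$ matrices it visibly fails.

```python
import numpy as np

def expm_series(M, terms=100):
    """Matrix exponential e^M via its truncated power series."""
    out = np.eye(M.shape[0], dtype=complex)
    term = np.eye(M.shape[0], dtype=complex)
    for i in range(1, terms):
        term = term @ M / i
        out = out + term
    return out

# Commuting pair: two polynomials in the same matrix always commute.
M = np.array([[0.0, 1.0], [-1.0, 0.5]])
Y, Z = 0.3 * M, 0.2 * M @ M
assert np.allclose(Y @ Z, Z @ Y)
assert np.allclose(expm_series(Y) @ expm_series(Z), expm_series(Y + Z))

# Non-commuting pair: e^Y e^Z and e^{Y+Z} differ.
Y = np.array([[0.0, 1.0], [0.0, 0.0]])
Z = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(Y @ Z, Z @ Y)
assert not np.allclose(expm_series(Y) @ expm_series(Z), expm_series(Y + Z))
```

This doesn't settle the uniqueness question, but it shows the identity genuinely depends on $[Y, Z] = 0$ rather than merely on $[A, B] = 0$.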

  • Do you have conditions on the matrix $A=e^X$? The matrix logarithm doesn’t exist if $A$ is singular. – Semiclassical Jan 22 '24 at 13:40
  • Unfortunately, this is all the information that's given in the problem statement. I am not aware of this property for singular matrices, so that could be an interesting side question! – kaddy Jan 22 '24 at 13:44
  • If $[A,B]=[e^Y,e^Z]=0$ then it is not necessarily the case that $[Y,Z]=0$, cf. https://math.stackexchange.com/questions/349180/if-ea-and-eb-commute-do-a-and-b-commute-for-finite-dimensional-matric – Giulio R Jan 22 '24 at 13:49
  • This doesn’t sound right if you pick a fixed (say, principal) branch of complex logarithm. E.g. $\log\big((-1)(-1)\big)=\log1=0\ne2i\pi=\log(-1)+\log(-1)$. – user1551 Jan 22 '24 at 14:47
  • $e^X$ is always nonsingular, so it makes sense to restrict yourself to the non-singular case for $A=e^X$. – CyclotomicField Jan 23 '24 at 01:49

1 Answer


As one of the comments on your question alludes to, you can only take the logarithm of invertible matrices, because $e^{A}$ is necessarily invertible: $e^{A}e^{-A}=I$.


As for the second part of your question, as you've said, it's just about showing that $e^C e^D=e^{C+D}$ whenever $C$ and $D$ commute.

By the way, the power series on each side converges absolutely, which means the terms may be summed in any order. That is the formal justification for rearranging the terms until they match the standard proof from basic analysis that $e^{x+y}=e^x e^y$ for all complex numbers $x$ and $y$.