2

Is there an alternative definition of a matrix exponential so I can use it to prove that $$e^{A}=\sum_{m=0}^{\infty} \frac{1}{m!}(A)^m \;?$$

Thanks a lot in advance!

gcolucci
  • This definition is used not only because it is related to the real exponential, but also because it allows the definition of the exponential in any Banach algebra (in such a space, absolute convergence implies convergence). – Gabriel Romon Mar 22 '14 at 17:55

6 Answers

7

It's been some time since I've looked at it, but I believe I have occasionally seen the definition

$$ e^A = \lim_{n \to \infty} \left(I + \frac{1}{n} A \right)^n $$

If you're being asked to prove the Taylor series formula, you really ought to consult your textbook to see how it defines the exponential, and which other formulas for it it has already proven.
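In any case, here is a quick numerical sanity check (just a sketch; the test matrix and the truncation depth are arbitrary choices, not taken from the question) that the limit definition above and the power series agree:

    import numpy as np

    # Hypothetical 2x2 test matrix -- any square matrix would do here.
    A = np.array([[0.0, 1.0],
                  [-2.0, 3.0]])
    I = np.eye(2)

    # Limit definition, evaluated at a large but finite n.
    n = 100_000
    limit_approx = np.linalg.matrix_power(I + A / n, n)

    # Power series, truncated once the terms are negligible.
    series_approx = np.zeros_like(A)
    term = np.eye(2)
    for m in range(30):
        series_approx += term       # add A^m / m!
        term = term @ A / (m + 1)   # next term: A^{m+1} / (m+1)!

    print(np.max(np.abs(limit_approx - series_approx)))  # small, and shrinks as n grows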

  • Hi Hurkyl, thanks for your response. Actually, this exercise is from a list meant to be a source to review some Algebra concepts, but not related to a textbook. So I have really no extra input, except for the question itself. Do you think this limit-based definition may lead to the expansion formula? – gcolucci Mar 22 '14 at 16:52
  • @user137227 Chapter 3 (page 33) of Brian C. Hall's notes on matrix Lie groups might be interesting reading: a proof of the equivalence of the series definition and (a generalization of) Hurkyl's limit version is given in Proposition 3.8 on page 39. There might be a more direct route to the equivalence, but there you will at least find one. – Squid Mar 22 '14 at 17:09
  • @Squid thanks a lot! It definitely seems very thoughtful, but I still couldn't picture how to go from a definition to the other =/ – gcolucci Mar 22 '14 at 18:29
6

No. This is how matrix exponentials are defined. In fact, this is how we usually define exponentiation; for example, in $\mathbb{C}$ it is also defined in terms of $e$. Why do you want an alternative, anyway?

  • Thanks for the answer, Stella. I've come across this question as an exercise (to prove this formula using a Taylor expansion), and I predictably couldn't figure out a solution to it. – gcolucci Mar 22 '14 at 16:42
3

$$\begin{align}&\lim_{n \to \infty} \left(I + \frac{1}{n} A \right)^n = \\ &\lim_{n \to \infty} \left(I^n+(n)I^{n-1}\frac{A}{n}+\frac{(n)(n-1)}{2}I^{n-2}\frac{A^2}{n^2} + \frac{(n)(n-1)(n-2)}{6}I^{n-3}\frac{A^3}{n^3} +\;...\right) = \\ &\lim_{n \to \infty}\left(I + A+\frac{(n)(n-1)}{2n^2}A^2 + \frac{(n)(n-1)(n-2)}{6n^3}A^3 ...\right)= \\ &\lim_{n \to \infty} I + \lim_{n \to \infty}A + \lim_{n \to \infty}\frac{(n)(n-1)}{2n^2}A^2 + \lim_{n \to \infty}\frac{(n)(n-1)(n-2)}{6n^3}A^3 +\; ...\; = \\ &I + \frac{A}{1!} + \frac{A^2}{2!} + \frac{A^3}{3!} +\; ... \;= \\ &\sum_{m=0}^\infty\frac{A^m}{m!} = e^A\end{align}$$

Pretty sure the binomial expansion for matrices is valid when the two matrices commute, so this is valid here because $IA = AI.$
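To spell out the step from the binomial coefficients to the factorials (this computation is not in the original answer, but it is the limit the last two lines rely on): for each fixed $k$, the coefficient of $A^k$ above is $\binom{n}{k}\frac{1}{n^k}$, and

$$\lim_{n\to\infty}\binom{n}{k}\frac{1}{n^k}=\lim_{n\to\infty}\frac{n(n-1)\cdots(n-k+1)}{k!\,n^k}=\lim_{n\to\infty}\frac{1}{k!}\left(1-\frac{1}{n}\right)\left(1-\frac{2}{n}\right)\cdots\left(1-\frac{k-1}{n}\right)=\frac{1}{k!}.$$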

Brad
  • That was definitely the most straightforward approach. I'm not 100% sure that the binomial expansion is valid, though (although it seems to make sense to me). Thanks a lot! – gcolucci Mar 24 '14 at 06:46
  • The binomial expansion is valid when the matrices commute, so that is OK. The implication from line 3 to line 4 needs more work: Brad is taking a limit of a sum of $n$ terms, but $n \to \infty.$ The limit is actually correct, but it needs a bit more justification than this (even for real numbers). – Geoff Robinson Mar 24 '14 at 09:01
  • To elaborate on Geoff's comment, this proof is incomplete (at best). The number of terms in the parentheses in lines 1-3 is $n$, while from line 4 we have an infinite number of addends. It is also silently assuming that all the terms of the form $o(1/n)A^k$ add up to something that vanishes as $n\to\infty$. This is not obvious. The sum on line 3 can be rearranged as $I+A+A^2/2+\cdots+A^n/n! + R_n$, where the remainder is $R_n=c_2A^2+c_3A^3+\cdots+c_nA^n$, where $c_i=o(1/n)\ \forall i$ ($o$ is the little-o Landau symbol). One needs to show that $R_n$ vanishes as $n\to\infty$. – bartgol Apr 17 '18 at 19:03
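One way to supply that missing step, sketched here (this is not from the original answer), uses any submultiplicative matrix norm together with the fact that $\binom{n}{k}\frac{1}{n^k}\le\frac{1}{k!}$ for $0\le k\le n$:

$$\|R_n\|\le\sum_{k=2}^{n}\left(\frac{1}{k!}-\binom{n}{k}\frac{1}{n^k}\right)\|A\|^{k}\le\sum_{k=0}^{\infty}\frac{\|A\|^{k}}{k!}-\sum_{k=0}^{n}\binom{n}{k}\frac{\|A\|^{k}}{n^{k}}=e^{\|A\|}-\left(1+\frac{\|A\|}{n}\right)^{n}\longrightarrow 0.$$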
2

As Stella said, there is no other way it is defined. That said, there are different ways to calculate it other than the infinite sum.

For instance the exponential of a diagonal matrix can be found by simply exponentiating every diagonal entry.

Going further, for any diagonalizable matrix $A$ you want to find the exponential of, the best way is to write it as $$A = P^{-1}DP$$ where $D$ is a diagonal matrix found using eigenvectors/eigenvalues. Then $$e^A = P^{-1}e^{D}P,$$ where the exponential of $D$ is just $D$ with each diagonal entry exponentiated, so it is easy to calculate.
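Here is a minimal sketch of that recipe in Python/NumPy (the example matrix is made up; note that np.linalg.eig returns $A = PDP^{-1}$ with eigenvectors as the columns of $P$, so the roles of $P$ and $P^{-1}$ are swapped relative to the notation above), checked against scipy.linalg.expm:

    import numpy as np
    from scipy.linalg import expm

    # Hypothetical diagonalizable matrix (eigenvalues 2 and 5).
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    eigvals, P = np.linalg.eig(A)           # A = P D P^{-1}, columns of P are eigenvectors
    e_D = np.diag(np.exp(eigvals))          # exponentiate the diagonal entries of D
    e_A = P @ e_D @ np.linalg.inv(P)

    print(np.allclose(e_A, expm(A)))        # True, up to floating-point error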

There are many other ways including using Jordan forms to calculate exponentials of a matrix.

user88595
1

Following on from the remark of user88595, it is useful to know how to compute the exponential of a nilpotent matrix $M$ which has a single Jordan block. So suppose that $M$ is an $n \times n$ matrix with $M^{n} = 0 \neq M^{n-1}.$ Then $\{I,M,M^{2},\ldots , M^{n-1} \}$ is linearly independent and $e^{M} = I + M + \frac{M^{2}}{2!} + \ldots + \frac{M^{n-1}}{(n-1)!}.$ More generally, it is easy to see that $e^{(\lambda I + M)} = e^{\lambda}(I + M + \frac{M^{2}}{2!} + \ldots + \frac{M^{n-1}}{(n-1)!}).$ This enables us to compute $e^{J_{r}(\lambda)},$ where $J_{r}(\lambda)$ is a Jordan block of size $r$ with unique eigenvalue $\lambda.$ Hence we can calculate $e^{X}$ for any matrix $X$ in Jordan Normal Form.
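A small numerical sketch of this (the eigenvalue and block size below are arbitrary choices), checked against scipy.linalg.expm:

    import numpy as np
    from math import factorial
    from scipy.linalg import expm

    lam, r = 2.0, 4                                     # hypothetical eigenvalue and block size
    J = lam * np.eye(r) + np.diag(np.ones(r - 1), 1)    # Jordan block J_r(lambda)
    M = J - lam * np.eye(r)                             # nilpotent part, M^r = 0

    # e^{lambda I + M} = e^lambda * (I + M + M^2/2! + ... + M^{r-1}/(r-1)!)
    e_M = sum(np.linalg.matrix_power(M, k) / factorial(k) for k in range(r))
    e_J = np.exp(lam) * e_M

    print(np.allclose(e_J, expm(J)))                    # True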

1

You could define the matrix exponential as the solution of a differential equation, just like you can define the (real or complex) number $e^a$ as $f(1)/f(0)$ where $f$ is any nonzero function defined on an open interval$~I$ containing $[0,1]$ and satisfying the differential equation $f'(x)=af(x)$.

Similarly, for any vector $x_0\in V$ there is a unique vector-valued function $f:I\to V$ with $f(0)=x_0$ satisfying the linear differential equation $f'(x)=Af(x)$; putting $x_1=f(1)$ for this function, the map $V\to V$ sending $x_0\mapsto x_1$ is linear and equal to $\exp(A)$.
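Here is a sketch of this definition carried out numerically (the matrix and the solver tolerances are arbitrary choices): each column of $\exp(A)$ is obtained by integrating $f'=Af$ from $t=0$ to $t=1$ with $f(0)$ a standard basis vector.

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.linalg import expm

    # Hypothetical matrix; exp(A) is then a rotation by one radian.
    A = np.array([[0.0, 1.0],
                  [-1.0, 0.0]])

    # Solve f'(t) = A f(t) on [0, 1] once per standard basis vector f(0) = e_i;
    # the resulting values f(1) are the columns of exp(A).
    cols = []
    for e_i in np.eye(len(A)):
        sol = solve_ivp(lambda t, f: A @ f, (0.0, 1.0), e_i, rtol=1e-10, atol=1e-12)
        cols.append(sol.y[:, -1])

    exp_A = np.column_stack(cols)
    print(np.allclose(exp_A, expm(A), atol=1e-6))       # True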