
Let $A\in\mathbb{C}^{n\times n}$. What does it mean to make a Taylor expansion of $A$ around a point? I assume that the induced linear map is what should be expanded, but I have never seen a Taylor expansion for vector valued functions.

Furthermore, I have seen expressions like $$A(t)=A + A'(t)t+ \mathcal{O}({t^2}),\ t\in [a,b]$$

I encountered it in numerical analysis, where some kind of continuation of a matrix was needed. What is the rigorous background to this?

EpsilonDelta
  • 2,169
  • You have to assume that $A\colon [a, b]\to \mathbb C^{n\times n}$; it does not make sense to Taylor-expand a single matrix. – Giuseppe Negro May 13 '20 at 12:31

1 Answer


Taylor's theorem holds in very general circumstances; you can take the domain and target space to be any Banach space. For this, of course, you need to understand how differentiation works in Banach spaces. Take a look at this answer for a definition of differentiability, and this answer for a statement (and outline of a proof) of Taylor's theorem.
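
(To make this concrete for the case at hand: when the domain is just an interval $I \subset \Bbb{R}$, the derivative of a curve $A: I \to V$ into a Banach space $V$ can be defined exactly as in single-variable calculus, with the limit taken in the norm of $V$: \begin{align} A'(t) := \lim_{h \to 0} \frac{A(t+h) - A(t)}{h}, \end{align} equivalently, $\|A(t+h) - A(t) - A'(t)h\| = o(|h|)$ as $h \to 0$.)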

If $V$ is a Banach space (for example, the space of matrices), $I \subset \Bbb{R}$ is an open interval around the origin, and $A: I \to V$ is $k$-times differentiable at the origin, then by Taylor's theorem, we have \begin{align} A(t) &= A(0) + A'(0)t + \dots + \dfrac{A^{(k)}(0)}{k!}t^k+ o(|t|^k) \qquad \text{as $t \to 0$}. \end{align} In particular, if you choose $k=1$, then \begin{align} A(t) &= A(0) + A'(0)t + o(|t|). \end{align} (Note that the formula you wrote isn't quite right.) By the way, you should note that the case $k=1$ is exactly the definition of $A$ being a differentiable function at the origin; there's no need to invoke Taylor's theorem at all.
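
For concreteness, here is a minimal worked example (the matrices $A_0, A_1, A_2 \in \Bbb{C}^{n \times n}$ are just an illustrative choice): if $A(t) = A_0 + t A_1 + t^2 A_2$, then directly from the difference quotient $\frac{A(h)-A(0)}{h} = A_1 + h A_2 \to A_1$, we get $A(0) = A_0$ and $A'(0) = A_1$, so the $k=1$ expansion reads \begin{align} A(t) &= A_0 + A_1 t + o(|t|), \end{align} where the remainder is in fact $t^2 A_2 = O(|t|^2)$.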


Suppose we specialize this further, very explicitly, to the case where $V = \Bbb{C}^{n\times n}$ is the space of matrices. Then one can prove that if $A(t) = (a_{ij}(t))$, then $A'(t) = (a_{ij}'(t))$. In other words, the derivatives are easy to calculate; just do it component by component. Hence, \begin{align} (a_{ij}(t)) &= (a_{ij}(0)) + (a_{ij}'(0)) \cdot t + o(|t|). \end{align} So, in essence, if the target space is a (finite-dimensional normed) vector space, you just perform a Taylor expansion of the given function about the given point (in my answer I chose $t=0$) "component by component".
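
If it helps to see this numerically, here is a minimal sketch (the particular matrix, and the use of NumPy, are my own illustrative choices): the component-wise first-order expansion $A(0) + A'(0)t$ of an entrywise-smooth matrix function has a remainder that shrinks like $t^2$ in the Frobenius norm.

```python
import numpy as np

# Illustrative matrix-valued function A(t) and its entrywise derivative A'(t).
def A(t):
    return np.array([[np.cos(t), t],
                     [t**2,      np.exp(t)]])

def A_prime(t):
    return np.array([[-np.sin(t), 1.0],
                     [2*t,        np.exp(t)]])

# Compare A(t) with its first-order Taylor approximation about t = 0.
for t in [1e-1, 1e-2, 1e-3]:
    remainder = np.linalg.norm(A(t) - (A(0) + A_prime(0) * t))  # Frobenius norm
    print(f"t = {t:.0e}, ||A(t) - A(0) - A'(0) t|| = {remainder:.2e}")
```

The printed remainder should drop by roughly a factor of $100$ each time $t$ shrinks by a factor of $10$, consistent with the $o(|t|)$ (here in fact $O(|t|^2)$) remainder.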

peek-a-boo
  • 65,833
  • This is true, but in the special case of this question everything is much easier. The matrix $A'(t)$ reduces to the matrix of derivatives $(a_{ij}')$, where $a_{ij}$ are the entries of $A$. – Giuseppe Negro May 13 '20 at 12:32
  • @GiuseppeNegro Sure, I just referenced the general case directly because the OP said they've never seen a Taylor expansion for vector-valued functions, so I wanted to indicate the fact that the theorem holds under very general circumstances. But now, I've also elaborated slightly on the special case which OP is interested in. – peek-a-boo May 13 '20 at 12:43
  • This is a very good answer! Can you suggest any good books on the topic of derivatives on Banach spaces which also treat Taylor approximation? – EpsilonDelta May 13 '20 at 12:56
  • @EpsilonDelta: I like Lang's "Undergraduate Analysis"; it has a chapter on "Derivatives on vector spaces". – Giuseppe Negro May 13 '20 at 12:58
  • @EpsilonDelta I really liked Henri Cartan's Differential Calculus (the entire book is great), and also Loomis and Sternberg's Advanced Calculus (Chapter 3 is on differential calculus, and this book is available freely online; just google it). – peek-a-boo May 13 '20 at 13:01
  • Do you by chance know Amann–Escher's Analysis I–III? I have these books but don't know if they are any good. – EpsilonDelta May 13 '20 at 13:02
  • @EpsilonDelta yes, I'm working through volume III right now, and I like it very much; they're very detailed and treat everything in the general case as much as possible. I've glossed over volumes I and II, and they're definitely very thorough. Chapter 7 of volume II deals with multivariable differential calculus in the context of Banach spaces. But be warned that the books are not easy; they require a lot of effort. Also, it is almost impossible to read them page by page; at certain points, you will definitely get stuck, and there are some topics which you have to skip on a first reading. – peek-a-boo May 13 '20 at 13:10
  • But depending on your background, they may or may not be suitable. If, however, you have a firm background in linear algebra and single-variable calculus (for example, at the level of Spivak's Calculus), then Volume II will be accessible. – peek-a-boo May 13 '20 at 13:14