
I was doing an exercise in Apostol's Calculus (volume 1, exercise 31 on page 304, https://www.stumblingrobot.com/2016/01/13/prove-some-properties-of-the-function-e-1x2/) that asked me to find the $n$th derivative of $f$ at $0$, where $f(x) = e^{-1/x^2}$ for $x \neq 0$ and $f(0) = 0$. I believe this function and all of its derivatives are continuous, since $\lim_{x \to 0} e^{-1/x^2} = 0$ and $\lim_{x \to 0} f^{(n)}(x) = 0$ for all $n$. But this would imply that the Taylor series of the function is just $g(x) = 0$, which doesn't make sense to me: the Taylor series clearly isn't approaching the function, yet the error formula seems to give zero error ($E_n(x) = \frac{1}{n!}\int_{0}^{x} (x-t)^n f^{(n+1)}(t)\, dt = \frac{1}{n!}\int_{0}^{x} (x-t)^n (0)\, dt = 0$).

Could someone explain what I'm overlooking here please? Thank you!
    You're not overlooking anything. That's exactly the purpose of this problem: to show you that you can have smooth functions whose Taylor series centered at the origin has infinite radius of convergence but the Taylor series is not equal to the original function. That's why we have another class of functions called 'analytic functions'. Real analysis is full of such pathologies :) – peek-a-boo Feb 20 '23 at 00:37
  • @peek-a-boo But not complex analysis, right? – Parcly Taxel Feb 20 '23 at 00:40
  • @ParclyTaxel well the part of complex analysis dealing with holomorphic functions only doesn't (and there are so many fun rigidity theorems there). – peek-a-boo Feb 20 '23 at 00:41
  • @peek-a-boo All right, thanks! Would it be fair to say then that the error given in the book (the formula above, and the other ones it implies) only works for estimating/expressing the difference between a Taylor series and the original function if the Taylor series eventually converges to the original function (which it doesn't have to, as I think you said this exercise is meant to show)? – Willcarbog Feb 20 '23 at 00:44
  • This function is one you’ll want to keep in mind during your studies and after. Occasionally useful for developing counterexamples. – A rural reader Feb 20 '23 at 00:46
  • The error formula involving the integral is always true. Look at the statement of Taylor's theorem. But why did you say it is equal to zero? It is not. Note that $f^{(n+1)}$ only vanishes at the origin, so the integral is not zero. Here is just one of many answers regarding similar notions. – peek-a-boo Feb 20 '23 at 00:46
  • I see – I made a mistake with the error formula, but I think I understand what you mean now (the error formula is always correct, but the error doesn't have to tend to 0 as you keep expanding the Taylor series, and a Taylor series with infinitely many terms isn't always the same as the original function). Thanks again! – Willcarbog Feb 20 '23 at 00:54
  • Echoing @peek-a-boo's comment: the finite Taylor-Maclaurin expansion with remainder is valid for all smooth functions, without assuming analyticity (that is, without assuming that the implied power series converges to the original function (which it may not), and without assuming that the implied power series converges at all). :) :) :) Crazy, I think! :) – paul garrett Feb 20 '23 at 00:55
  • You need to think carefully what is the difference between $f(x)=f(0)+\sum_{n=1}^{k} a_n x^n+o(x^k)$ and $f(x)=f(0)+\sum_{n=1}^{\infty} a_n x^n$. They are obviously not the same. – Asigan Feb 20 '23 at 07:57
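A quick sanity check of the discussion above (a SymPy sketch, not part of the original thread): every derivative of $e^{-1/x^2}$ is a rational function times $e^{-1/x^2}$, and the exponential decay beats any polynomial blow-up, so all derivatives have limit $0$ at the origin. Every Maclaurin coefficient therefore vanishes, even though $f$ itself is nonzero for $x \neq 0$.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)

# Each derivative is (rational function in x) * exp(-1/x^2); the
# exponential factor forces the limit at 0 to be 0 for every order,
# so f^(n)(0) = 0 and the Maclaurin series is identically zero.
for n in range(1, 4):
    assert sp.limit(sp.diff(f, x, n), x, 0) == 0

# Yet f is not the zero function away from the origin:
print(sp.N(f.subs(x, sp.Rational(1, 2))))  # e^{-4} ≈ 0.0183
```

This is exactly the gap between smoothness and analyticity that the comments describe: the Taylor series converges everywhere, but only agrees with $f$ at the single point $x = 0$.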

0 Answers