
From what I can understand, the remainder measures how much the function itself differs from the polynomial approximation. The radius of convergence, on the other hand, is about the series representation of the polynomial approximation, and its convergence can be tested by, e.g., the ratio test.

But what's the difference between them when they seem to tell you the same thing?

For example, since $\sin(x)$ is $$ \sum_{n=0}^\infty(-1)^n \frac{x^{2n+1}}{(2n+1)!}, $$ it seems that we can find that it converges for all values of $x$ by either

  1. showing that the remainder becomes zero (by having it approach $0$ as $n$ approaches $\infty$), or
  2. showing that the series passes the ratio test.

My other example would be $e^x$, though there the ratio test seems easier, since the $f^{(n+1)}(c)$ term isn't bounded.

It seems that doing the ratio test for $\sin(x)$ gives me the same result as showing the remainder goes to $0$, and showing that the remainder of $e^x$ goes to $0$ gives the same result as the ratio test.
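To spell out what I mean (my own working, so the details may be off): for the terms $a_n = (-1)^n \frac{x^{2n+1}}{(2n+1)!}$, the ratio test gives $$ \left|\frac{a_{n+1}}{a_n}\right| = \frac{x^2}{(2n+2)(2n+3)} \to 0 < 1 \quad \text{for every fixed } x, $$ while the Lagrange remainder, using $|f^{(n+1)}(c)| \le 1$ for sine and cosine, gives $$ |R_n(x)| \le \frac{|x|^{n+1}}{(n+1)!} \to 0 \quad \text{as } n \to \infty. $$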

  • The convergence and remainder are completely different ideas: given that the series converges, the Lagrange remainder tells you how fast the series converges. However, the ratio test is used to test whether the series converges at all. – K.defaoite Jul 30 '20 at 17:16
  • You've correctly pointed out that these two different mathematical objects can be used to accomplish the same goal. But that doesn't at all imply that there's no difference between the objects (nor that they have no other individual uses), any more than the observation that both shoes and newspapers can be used to swat flies :) – Greg Martin Jul 30 '20 at 17:27

2 Answers


Note that "Taylor series converges for all $x$" is a completely different statement than "Taylor series equals the original function" (or more commonly phrased as "Taylor series converges to the original function"), and it is this difference which I think you haven't understood

Let $f:\Bbb{R} \to \Bbb{R}$ be a given infinitely differentiable function, and let $a\in \Bbb{R}$ be given. Then, we can consider three different functions:

  • For each integer $n\geq 0$, we can consider the $n^{th}$ Taylor polynomial for $f$ about the point $a$, $T_{n,a,f}:\Bbb{R} \to \Bbb{R}$ defined by \begin{align} T_{n,a,f}(x) := \sum_{k=0}^n \dfrac{f^{(k)}(a)}{k!}(x-a)^k \end{align}
  • Accordingly, we can consider the $n^{th}$ order remainder function for $f$ about the point $a$, $R_{n,a,f}:\Bbb{R} \to \Bbb{R}$ and this is defined by $R_{n,a,f}:= f- T_{n,a,f}$.
  • Finally, we can consider the Taylor Series of $f$ about the point $a$. To define this, we first consider the formal power series $S(X) := \sum\limits_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}X^k$. This has a certain radius of convergence $0 \leq \rho \leq \infty$ (the Cauchy-Hadamard formula gives an explicit formula for $\rho$ in terms of the coefficients of the series). Now, we define the Taylor series $S_{a,f}$, of the function $f$ about the point $a$, as follows: if $\rho = 0$, we define $S_{a,f}: \{a\} \to \Bbb{R}$ by $S_{a,f}(a) := f(a)$. If $\rho >0$, then we define $S_{a,f}: (a-\rho,a+\rho) \to \Bbb{R}$ by \begin{align} S_{a,f}(x) := \sum_{k=0}^{\infty}\dfrac{f^{(k)}(a)}{k!}(x-a)^k = \lim_{n\to \infty}T_{n,a,f}(x) \end{align} (with the understanding that if $\rho = \infty$, then the domain is $\Bbb{R}$). A concrete instance of all three objects follows below.
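For a concrete instance of these three objects (an illustration I'm adding, not part of the definitions), take $f = \sin$ and $a = 0$: \begin{align} T_{3,0,\sin}(x) = x - \frac{x^3}{6}, \qquad R_{3,0,\sin}(x) = \sin(x) - x + \frac{x^3}{6}, \end{align} and the formal power series is $S(X) = \sum_{k=0}^{\infty} \frac{(-1)^k}{(2k+1)!}X^{2k+1}$, whose radius of convergence is $\rho = \infty$, so $S_{0,\sin}$ has domain $\Bbb{R}$.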

You seem to be interested in the case where $\rho = \infty$, so that $S_{a,f}$ has its domain equal to all of $\Bbb{R}$, so let's focus on this case. Now, there is a very natural question to ask, namely: does the function equal its Taylor series? I.e., is it true that $f = S_{a,f}$ (or more explicitly, is it true that for every $x\in \Bbb{R}$, $f(x) = S_{a,f}(x)$)?

The answer is NOT NECESSARILY, even if we assume $\rho = \infty$. The typical counter-example is given by $f:\Bbb{R}\to \Bbb{R}$ defined as \begin{align} f(x) &:= \begin{cases} e^{-\frac{1}{x^2}} & \text{if $x\neq 0$} \\ 0 & \text{if $x=0$} \end{cases} \end{align} Then, you can check that $f$ is infinitely-differentiable, and that for every $k$, $f^{(k)}(0) = 0$. So, the radius of convergence is $\rho = \infty$, and the Taylor series of $f$ about the origin is $S_{0,f}:\Bbb{R} \to \Bbb{R}$, $S_{0,f}(x) = 0$ for all $x$. Now, clearly $f$ is not the constant zero-function, so $f\neq S_{0,f}$.
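Here is a minimal numerical sketch of this gap (my addition; plain Python, nothing beyond the standard library): the series value $S_{0,f}(x)$ is $0$ for every $x$, yet $f(x) > 0$ whenever $x \neq 0$.

```python
import math

def f(x):
    # The classic flat function: every derivative at 0 vanishes,
    # so its Taylor series about 0 is identically zero.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

for x in (0.5, 1.0, 2.0):
    # The Taylor series converges (to 0) at every x, but f(x) != 0 here.
    print(f"x = {x}: f(x) = {f(x):.6g}, S_0f(x) = 0")
```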

Given this result, the next natural question to ask is "under what conditions (if any) is the function equal to its Taylor series?" The answer to this is pretty simple. Well, fix an $x \in \Bbb{R}$. Then, by definition of Taylor polynomial and remainder, we have for every integer $n\geq 0$: \begin{align} f(x) &= T_{n,a,f}(x) + R_{n,a,f}(x) \end{align} Since this is true for all $n\geq 0$, we can also take the limit as $n \to \infty$ on both sides to get: \begin{align} f(x) &= \lim_{n\to \infty} \bigg(T_{n,a,f}(x) + R_{n,a,f}(x)\bigg) \\ &= S_{a,f}(x) + \lim_{n\to \infty} R_{n,a,f}(x) \end{align} Therefore, $f(x) = S_{a,f}(x)$ if and only if $\lim\limits_{n\to \infty}R_{n,a,f}(x) = 0$.
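For instance (my illustration of how this criterion handles the worry about $e^x$ from the question): although $f^{(n+1)}(c) = e^c$ is not bounded on all of $\Bbb{R}$, the Lagrange form only requires $c$ between $0$ and the fixed point $x$, so \begin{align} |R_{n,0,\exp}(x)| = \frac{e^{c}}{(n+1)!}|x|^{n+1} \le \frac{e^{|x|}}{(n+1)!}|x|^{n+1} \xrightarrow{n\to\infty} 0, \end{align} and hence $e^x$ equals its Taylor series everywhere.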

With the counter-example and the result above in mind, we can understand the difference between radius of convergence and remainder:

  • The radius of convergence of the Taylor series is simply a number $\rho$. All it tells you is for which values of $x$ the series even converges (because recall that the Taylor series is defined as the limit $\lim_{n\to \infty}T_{n,a,f}(x)$ provided the limit exists, so we are asking when this limit exists in $\Bbb{R}$). Things like the ratio test, root test, alternating series test, or any other "series test" you may have learnt are merely techniques/tools to help you find out what the radius of convergence $\rho$ is (sure, there is an explicit formula given by the Cauchy-Hadamard formula, but sometimes that's very difficult to calculate with, so we try to look for simpler alternatives). BUT, the radius of convergence tells you NOTHING about whether or not (within the interval of convergence) the Taylor series $S_{a,f}$ is actually equal to the function $f$.

  • The remainder $R_{n,a,f}$ is by definition the difference between $f$ (the actual function) and $T_{n,a,f}$ (the approximation). It gives a quantitative measure of how good your approximation is. Also, if the Taylor series converges at a point $x$, then the limit $\lim_{n\to \infty}R_{n,a,f}(x)$ will exist. This limit may or may not be zero, and as shown above, we have $f(x) = S_{a,f}(x)$ if and only if this limit is $0$. So, the (limit of the) remainder allows you to answer the question "is my function equal to its Taylor series everywhere?" (see the numerical sketch below).
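To see both bullets numerically (a sketch I'm adding; standard-library Python), the partial sums of the sine series have remainders shrinking to $0$ at a fixed $x$:

```python
import math

def sin_taylor(x, n):
    # Partial sum of the Maclaurin series for sin up to the x^(2n+1) term,
    # i.e. the Taylor polynomial T_{2n+1, 0, sin}(x).
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n + 1))

x = 2.0
for n in (1, 3, 5, 8):
    # |sin(x) - T_n(x)| tends to 0, so sin equals its series at this x.
    print(n, abs(math.sin(x) - sin_taylor(x, n)))
```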

peek-a-boo
  • That's a very good and detailed answer! Thanks for having taken the time to write it; I'm very sure it'll help many students! – Anthony Jul 30 '20 at 18:50
  • Obviously you took way more time than I did to give a thorough and detailed answer to the question. I only added one counter example to the list. Thanks for that! – Benjamin Jul 30 '20 at 19:09
  • Thanks @peek-a-boo, I think I see the difference now. A somewhat silly example: although the Taylor series for $e^x$ might converge, the remainder isn't zero if we were trying to approximate $\sin(x)$. So the remainder doesn't have to be zero if the series converges. But the opposite seems true: that is, if the remainder is zero then the series would need to converge. –  Jul 31 '20 at 01:14
  • @Sat I'm not sure I see your point. Of course, the remainder for any particular $n$ could be non-zero, but for functions like $\exp$ and $\sin$ (and $\cos, \cosh, \sinh$ etc) the limit as $n \to \infty$ for the remainder is zero. – peek-a-boo Jul 31 '20 at 01:18
  • It's probably because the point I'm trying to make is something obvious.

    To give specific examples: (1) I could find out that the series for $e^x$ converges for all $x$, but the remainder is non-zero if I was originally trying to find an approximation for $\sin(x)$. I understand now that Taylor series convergence doesn't necessarily mean it converges to the original function. (2) But it seems that if a series approximates a function so that there is no remainder, then that series would have to converge (or at least converge within a certain radius)

    –  Jul 31 '20 at 01:36

A standard counter-example for the problem you are stating is the following: $$ f(x)=\begin{cases}e^{-1/x} & \mbox{for}\ x>0 \\ 0 & \mbox{for}\ x\le 0.\end{cases} $$ It is a $C^\infty$ function with derivatives of the form $$ f^{(n)}(x)=\begin{cases}\frac{p_n(x)}{x^{2n}}e^{-1/x} & \mbox{for}\ x>0 \\ 0 & \mbox{for}\ x\le 0,\end{cases} $$ where $p_n$ is a polynomial of degree at most $n$. The Taylor expansion of $f$ at zero is therefore $T_f(x)=0$, and its radius of convergence is $\infty$. The remainder, on the contrary, obviously doesn't converge to zero for any $x>0$.
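As a quick symbolic sanity check of the derivative formula above (my own sketch; assumes SymPy is available):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1/x)

for n in range(1, 5):
    d = sp.diff(f, x, n)
    # Each derivative is a rational function of x times exp(-1/x); the
    # exponential dominates, so the right-hand limit at 0 is 0, matching
    # f^(n)(0) = 0 and hence a zero Taylor expansion at the origin.
    print(n, sp.simplify(d), sp.limit(d, x, 0, '+'))
```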

Benjamin