
Are all ODE solutions analytic and representable by Taylor series?

I'm exploring how Taylor expansions relate to the solutions of ordinary differential equations (ODEs). Starting from the basic definition of the derivative, I constructed the following reasoning:

Let

$$ f'(a) = \lim_{\epsilon \to 0} \frac{f(a + \epsilon) - f(a)}{\epsilon} $$

This gives the first-order approximation (dropping the $O(\epsilon^2)$ remainder):

$$ f(a + \epsilon) = f(a) + \epsilon f'(a) \tag{1} $$

Define a discrete sequence of points:

$$ f[n] = f(a + n\epsilon) $$

So:

$$ f[0] = f(a), \quad f[1] = f(a + \epsilon), \quad f[2] = f(a + 2\epsilon), \ldots $$

We can expand this step-by-step:

  1. First-order step: $$ f[1] = f[0] + \epsilon f'[0] \tag{2} $$

  2. Next step: $$ f[2] = f[1] + \epsilon f'[1] = f[0] + \epsilon f'[0] + \epsilon f'[1] \tag{3} $$

Using: $$ f'[1] = f'[0] + \epsilon f''[0] \tag{4} $$

Substitute (4) into (3): $$ f[2] = f[0] + 2\epsilon f'[0] + \epsilon^2 f''[0] \tag{5} $$

Continuing this pattern, we obtain: $$ f[n] = \sum_{k=0}^{\infty} \binom{n}{k} \epsilon^k f^{(k)}(a) $$ (the binomial coefficient vanishes for $k > n$, so the sum is effectively finite).

By letting $ x = a + n\epsilon$, this resembles the Taylor series: $$ f(x) = \sum_{k=0}^{\infty} \frac{(x - a)^k}{k!} f^{(k)}(a) $$

So this derivation intuitively shows that derivatives encode local information, and Taylor series propagate this locally to estimate global behavior. This makes me think:

  • As long as the function is infinitely differentiable, Taylor expansion should represent the ODE solution.

However, we know not all smooth functions are analytic. A famous counterexample is:

$$ f(x) = \begin{cases} e^{-\frac{1}{x}} & \text{if } x > 0, \\ 0 & \text{if } x \leq 0 \end{cases} $$

This function is smooth everywhere, but its Taylor series at $x = 0$ is identically zero and does not match the function for $x > 0$.
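A quick numerical sketch (the step size `h` and the names are my choices) shows why the Taylor series at $0$ sees nothing even though the function is non-zero for $x > 0$:

```python
import math

def f(x):
    """Smooth but non-analytic at x = 0: exp(-1/x) for x > 0, else 0."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

# Symmetric difference quotient at 0: already astronomically small for h = 0.01,
# consistent with every derivative of f vanishing at 0.
h = 1e-2
first = (f(h) - f(-h)) / (2 * h)

print(first)   # on the order of 1e-42: the Taylor series at 0 is identically zero
print(f(0.5))  # exp(-2) ~ 0.135: yet the function is clearly non-zero for x > 0
```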

But from a dynamical systems perspective, such a function seems unphysical: if all derivatives at a point (position, velocity, acceleration, etc.) are zero, then the object shouldn't move. Yet this function “spontaneously” becomes non-zero.


So here’s my question:

Can we conclude that any solution to a physical ODE (such as from a dynamical system) must be analytic?

Or are there meaningful physical systems that admit non-analytic (but smooth) solutions?


This is the full version of my proof.

Now extend this to the next point:

$$ f[3] = f[2] + \epsilon f'[2] \tag{6} $$

Using the same logic recursively:

$$ f'[2] = f'[1] + \epsilon f''[1] = f'[0] + \epsilon f''[0] + \epsilon(f''[0] + \epsilon f^{(3)}[0]) = f'[0] + 2\epsilon f''[0] + \epsilon^2 f^{(3)}[0] \tag{7} $$

Substitute this into (6):

$$ f[3] = f[2] + \epsilon \left( f'[0] + 2\epsilon f''[0] + \epsilon^2 f^{(3)}[0] \right) \tag{8} $$

Now, using equation (5) for $f[2]$, we get:

$$ f[3] = f[0] + 2\epsilon f'[0] + \epsilon^2 f''[0] + \epsilon f'[0] + 2\epsilon^2 f''[0] + \epsilon^3 f^{(3)}[0] $$

Combine terms:

$$ f[3] = f[0] + 3\epsilon f'[0] + 3\epsilon^2 f''[0] + \epsilon^3 f^{(3)}[0] \tag{9} $$

Now, I notice that the coefficients of each term form Pascal’s triangle, so we can express it as:

$$ f[n] = \sum_{k=0}^{\infty} \binom{n}{k} \epsilon^k f^{(k)}(a) $$

Or equivalently:

$$ f[n] = \sum_{k=0}^{\infty} \frac{n!}{k!(n - k)!} \, \epsilon^k \, f^{(k)}(a) $$

Now, if we want to find $f(a + x)$, we can set $N = \frac{x}{\epsilon}$; as $\epsilon$ approaches $0$, $N$ approaches infinity, so:

$$ f(a+x) = \lim_{N \to \infty} f[N] $$

Now we get:

$$ f(a+x) = \lim_{N \to \infty} \sum_{k=0}^{\infty} \frac{N!}{k!(N - k)!} \, \epsilon^k \, f^{(k)}(a) $$

As $ N \gg k $, we approximate $ \frac{N!}{(N - k)!} \approx N^k $, so:

$$ f(a+x) = \lim_{N \to \infty} \sum_{k=0}^{\infty} \frac{(N\epsilon)^k}{k!} \, f^{(k)}(a) $$

Since $N = \frac{x}{\epsilon} $, we have $ N\epsilon = x $, so:

$$ f(a+x) = \sum_{k=0}^{\infty} \frac{x^k}{k!} \, f^{(k)}(a) $$
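To sanity-check the binomial formula numerically, here is a small sketch (function name mine) specialized to $f = \exp$, where every derivative at $a$ equals $e^a$; the finite sum then collapses to $e^a (1 + x/N)^N$, which converges to $e^{a+x}$ as $N \to \infty$:

```python
import math

def binomial_step_sum(N, x, a):
    """f[N] = sum_k C(N, k) eps^k f^{(k)}(a), specialized to f = exp,
    where every derivative f^{(k)}(a) equals e^a.  Since C(N, k) = 0
    for k > N, the series is a finite sum."""
    eps = x / N
    return sum(math.comb(N, k) * eps**k * math.exp(a) for k in range(N + 1))

a, x = 0.0, 1.0
for N in (10, 100, 1000):
    print(N, binomial_step_sum(N, x, a), math.exp(a + x))
```

For $f = \exp$ the sum is exactly $e^a (1 + x/N)^N$, so the printed values approach $e^{a+x}$ as $N$ grows.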

Update

I've considered $y' = \sqrt{1 - y^2}$; here is the result I got from RK45 (plot of the trajectories omitted).

However, if we do a Taylor expansion within the interval $[-1, 1]$, the successive derivatives of $y(x)$ alternate with the order, matching the Taylor expansion of the sine function.

From the plot, I saw the trajectory follow the sine function until it reaches $y = 1$, while the one initialized at $y = -1$ stays constant at $-1$.

For me, the numerical result is the intuitive one, but the Taylor expansion predicts otherwise. This counterexample clearly shows that a Taylor expansion can approximate an ODE solution at least locally, but the solution is not always analytic, and some solutions are not representable by a Taylor series.
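The experiment can be reproduced without any library by hand-rolling a classical fixed-step RK4 integrator (a sketch; the step size, the clamping of roundoff above $y = 1$, and the function names are my choices, not the original RK45 setup):

```python
import math

def rhs(y):
    """y' = sqrt(1 - y^2); clamp to guard against roundoff pushing y above 1."""
    return math.sqrt(max(0.0, 1.0 - y * y))

def rk4(y0, h, steps):
    """Classical fixed-step RK4 for the autonomous ODE y' = rhs(y)."""
    y, ys = y0, [y0]
    for _ in range(steps):
        k1 = rhs(y)
        k2 = rhs(y + 0.5 * h * k1)
        k3 = rhs(y + 0.5 * h * k2)
        k4 = rhs(y + h * k3)
        y += (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        ys.append(y)
    return ys

h = 0.01
ys = rk4(0.0, h, 400)          # integrate y(0) = 0 out to x = 4
print(ys[100], math.sin(1.0))  # tracks sin(x) while y < 1
print(ys[400])                 # saturates near y = 1 past x = pi/2

flat = rk4(-1.0, h, 400)       # initialized exactly at y = -1
print(flat[400])               # rhs(-1) = 0, so the trajectory never moves
```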

M lab
  • $y'(x) = \sqrt{x}$. – A rural reader Apr 13 '25 at 17:49
  • @Aruralreader could you elaborate more on this? – M lab Apr 13 '25 at 17:59
  • It is not, as in @Aruralreader's example... I had the same doubts and used it as an example in this question, whose solution is defined piecewise, since a power series cannot become identically zero after some finite extinction time without violating the identity theorem. Truncated series expansions also have consequences, as you can see here. – Joako Apr 13 '25 at 22:39
  • In my current understanding, it is impossible to define an IVP whose solution is non-analytic, and therefore numerical methods based on Taylor expansion are always valid. If that's wrong, could you provide a counterexample? – M lab Apr 14 '25 at 11:27
  • What's the Taylor/power series expansion of $y$ satisfying $y'(x) = \sqrt{x}$ on $0 < x < \epsilon$ subject to $y(0) = 0$? – A rural reader Apr 14 '25 at 21:56

1 Answer


Try $$y' = \sqrt{1-y^2} $$ Obviously $y(x) = \sin(x - x_0)$, $y = 1$, and $y = -1$ are solutions, and any curve pieced together from a constant $-1$ part, a sine arc between $-1$ and $1$, and a continuation by the constant $+1$ is a continuously differentiable solution for any start value $y(x_s) = y_s \in (-1, 1)$.

The reason for local existence with non-uniqueness at the special values $y = \pm 1$ is that the square root has no Taylor expansion at $y = \pm 1$, and the condition for a unique solution passing through $(x, \pm 1)$ is violated: the Lipschitz condition.
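A quick numerical sketch (names mine) of the Lipschitz failure: the difference quotient of $F(y) = \sqrt{1 - y^2}$ blows up like $\sqrt{2/(1 - y)}$ as $y \to 1$:

```python
import math

def F(y):
    """Right-hand side F(y) = sqrt(1 - y^2) of the ODE y' = F(y)."""
    return math.sqrt(1.0 - y * y)

# The difference quotient |F(1) - F(y)| / |1 - y| behaves like sqrt(2 / (1 - y)),
# which diverges as y -> 1: F is not Lipschitz there, so Picard-Lindelof does not
# guarantee a unique solution through (x, 1).
for d in (1e-2, 1e-4, 1e-6):
    y = 1.0 - d
    print(d, abs(F(1.0) - F(y)) / d)
```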

Besides, for first-order systems, analyticity in many variables is not as easy to define as in one complex dimension.

And an ODE of order $n$ for one dependent variable is equivalent to a first-order system of $n$ dependent variables by treating the first $n-1$ derivatives of $y$ as independent variables $y^{(k)} = y_k$. Independence here means that each $y^{(k)} = y_k$ needs its own start value, so that the solution finally has the form $$y\left(x,\left(y(x_0), y'(x_0), \dots, y^{(n-1)}(x_0)\right)\right)$$
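As a concrete sketch of this reduction (my own example, integrated with forward Euler and a small step), $y'' = -y$ with the two independent start values $y(0) = 0$ and $y'(0) = 1$ becomes a two-component first-order system whose exact solution is $y(x) = \sin x$:

```python
import math

def euler_system(y0, y1, h, steps):
    """Forward Euler for the system y0' = y1, y1' = -y0,
    which is y'' = -y rewritten with y0 = y and y1 = y'."""
    for _ in range(steps):
        y0, y1 = y0 + h * y1, y1 - h * y0
    return y0, y1

h = 1e-4
steps = int(round((math.pi / 2) / h))
y, dy = euler_system(0.0, 1.0, h, steps)
print(y, dy)  # close to (sin(pi/2), cos(pi/2)) = (1, 0)
```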

Roland F