
It is known that the only solutions to the ODE

$$f'(x)=1+\left[f(x)\right]^2$$

are of the form $f(x)=\tan(c+x)$ (this is also easy to verify by hand). It follows that the differential equation has no solution defined on all of $\mathbb{R}$, because $\tan(c+x)$ is undefined whenever $c+x=\frac{\pi}{2}+\pi n$ for integer $n$.

But what if I didn't know this? What if I didn't know that

$$f'(x)=1+\left[f(x)\right]^2\iff f(x)=\tan(c+x)$$

Heck, what if I've never even heard of the tangent function or any other trigonometric function? Presumably, I could prove that there can't be a solution over $\mathbb{R}$ from the ODE alone, but how would I go about doing it?

For the record, I have no idea how to approach this. From the assumption that $f$ is differentiable everywhere, nothing from the equation seems to "break": you get two everywhere-continuous functions, $f'$ and $1+f^2$, and they are equal to each other.

Alann Rosas
  • To get a feel for what the solutions looked like, I would start by drawing a gradient field for $\frac{dy}{dx} = 1 + y^2$ and trying to plot some solutions, either by hand (if I were on a desert island with some grid paper) or more likely by computer. – Joppy Oct 07 '20 at 07:07
  • You need to specify if you only want to avoid to explicitly solve this specific equation or any differential equation at all. The standard Riccati problems $y'(x)=x+y(x)^2$ or $y'(x)=x^2+y(x)^2$ https://math.stackexchange.com/questions/2348022/riccati-d-e-vertical-asymptotes are perhaps better for that, as they actually do not have a symbolic solution (without invoking special functions). Or something like $y'(x)=e^{-x^2}+y(x)^3$, https://math.stackexchange.com/questions/2622702/the-ivp. – Lutz Lehmann Oct 07 '20 at 08:17
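Following Joppy's suggestion, here is a minimal slope-field-style experiment (plain Python, forward Euler; the step size and blow-up cap are arbitrary choices of mine, not anything prescribed in the thread). Every trajectory of $y'=1+y^2$ escapes to infinity after a bounded horizontal run, no matter where it starts:

```python
def blowup_x(y0, x0=0.0, h=1e-4, y_cap=1e6):
    """Follow y' = 1 + y^2 by forward Euler from y(x0) = y0 and
    return the x at which |y| first exceeds y_cap (numerical blow-up)."""
    x, y = x0, y0
    while abs(y) < y_cap:
        y += h * (1.0 + y * y)  # Euler step along the slope field
        x += h
    return x

# Trajectories from several starting heights all explode after
# a bounded horizontal run:
for y0 in (-2.0, 0.0, 2.0):
    print(f"y(0) = {y0:+.1f}  blows up near x = {blowup_x(y0):.3f}")
```

Starting lower only postpones the blow-up; it never prevents it, which is exactly what the picture of the slope field suggests.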

4 Answers


One can often estimate the existence interval by solving a simpler differential equation and obtaining a lower bound for the solution.

In your case: Assume that $f$ solves the differential equation on the interval $[a, b]$ with $f(a) > 0$ (the case $f(a) < 0$ can be handled similarly). Then $f$ is strictly positive on the interval, and $$ f'(x) = 1 + f(x)^2 > f(x)^2 $$ which implies that $$ b - a < \int_a^b \frac{f'(x)}{f(x)^2} \, dx = \frac{1}{f(a)} - \frac{1}{f(b)} < \frac{1}{f(a)} $$ and shows that $b$ cannot be arbitrarily large.

More concretely: Let $f$ be the solution with $f(0) = 0$. Then $f'(x) \ge 1$ for $x \ge 0$ so that $f(1) \ge 1$. Applying the above with $a=1$ shows that $$ b < 1 + \frac{1}{f(1)} \le 2 \, , $$ i.e. no solution exists on the interval $[0, 2]$.
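A quick numerical sanity check of this bound (plain Python, classical RK4; the step size and blow-up cap are arbitrary choices, not part of the argument): starting from $f(0)=0$, the computed solution indeed has $f(1)\ge 1$ and fails to reach $x=2$.

```python
def rk4_step(y, h):
    """One classical RK4 step for the autonomous ODE y' = 1 + y^2."""
    f = lambda v: 1.0 + v * v
    k1 = f(y)
    k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2)
    k4 = f(y + h * k3)
    return y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def solve_until(x_end, y0=0.0, h=1e-4, y_cap=1e8):
    """Integrate from x = 0; return (x, y), with y = None if the
    solution blew up (exceeded y_cap) before reaching x_end."""
    x, y = 0.0, y0
    while x < x_end - 0.5 * h:
        y = rk4_step(y, h)
        x += h
        if abs(y) > y_cap:
            return x, None   # numerical blow-up before x_end
    return x, y

_, f1 = solve_until(1.0)
print("f(1) ≈", f1)                               # comfortably >= 1
print("reaches x = 2?", solve_until(2.0)[1] is not None)
```

This matches the estimate: once $f(1)\ge 1$, the solution cannot survive past $x = 1 + 1/f(1) \le 2$.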

Martin R
  • Wait, I just thought of something. What if $f(a)<0$? Clearly $f$ is strictly increasing, so how can we be sure that $f$ does not have a root in $[a,b]$? $f(b)$ could be positive or zero (I think), leaving us with an improper integral. – Alann Rosas Oct 07 '20 at 08:05
  • In the case $f(a) < 0$ you consider an interval $[c, a]$ to the left of $a$, where $f$ is strictly negative. Or you consider $g(x) = -f(-x)$ instead of $f$. – Martin R Oct 07 '20 at 08:06
  • Ahh, that makes much more sense. But then again, what if $f(a)=0$? I imagine that's a possibility, so we might still be left with an improper integral. Am I overlooking something? (P.S. I apologize if I come across as a challenger. This is not my intention.) – Alann Rosas Oct 07 '20 at 08:13
  • @A.E.Rosas: No problem. $f(x) \equiv 0$ is not a solution of that ODE, so $f(a)$ must be non-zero for some $a$. Starting at that point you show that the solution does not extend infinitely to the right or to the left (depending on the sign of $f(a)$). – Martin R Oct 07 '20 at 08:16
  • That's such a clever observation! I can't believe that flew over my head! :P I got it now. Thank you so much! – Alann Rosas Oct 07 '20 at 08:27

Suppose $y(x)$ is an everywhere differentiable function which satisfies the differential equation $y'=1+y^2$.

Our goal is to derive a contradiction (without explicitly solving the ODE).

From $y'=1+y^2$, it follows that $y'(x)\ge 1$ for all $x\in\mathbb{R}$, hence $y$ is strictly increasing.

Moreover, by the mean value theorem, $y(x)\ge y(0)+x$ for all $x\ge 0$, so $y$ is not bounded above.

Similarly, $y(x)\le y(0)+x$ for all $x\le 0$, so $y$ is not bounded below.

Since $y$ is continuous, increasing, and unbounded in both directions, its range is all of $\mathbb{R}$.

Let $a\in\mathbb{R}$ be such that $y(a)=1$ and let $b > a$. \begin{align*} \text{Then}\;\;& y'=1+y^2 \\[4pt] \implies\;& \frac{1}{1+y^2}\,dy=dx \\[4pt] \implies\;& \int_{y(a)}^{y(b)}\frac{1}{1+y^2}\,dy=\int_a^b 1\,dx \\[4pt] \implies\;& \int_1^{y(b)}\frac{1}{1+y^2}\,dy=b-a \\[4pt] \implies\;& \lim_{b\to\infty}\left(\int_1^{y(b)}\frac{1}{1+y^2}\,dy\right)=\infty \\[4pt] \implies\;& \int_1^\infty \frac{1}{1+y^2}\,dy=\infty \\[4pt] \end{align*} (using in the last step that $y(b)\to\infty$ as $b\to\infty$, since $y$ is increasing and unbounded above). This is a contradiction, since $$ \int_1^\infty \frac{1}{1+y^2}\,dy < \int_1^\infty \frac{1}{y^2}\,dy = 1 $$
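The final comparison can also be checked numerically, without ever naming the arctangent. A small sketch (plain Python midpoint rule; the cutoffs and panel count are arbitrary choices): the partial integrals $\int_1^N \frac{dy}{1+y^2}$ increase with $N$ yet plateau well below $1$.

```python
def tail_integral(N, panels=200_000):
    """Midpoint-rule estimate of the integral of 1/(1+y^2) over [1, N]."""
    h = (N - 1.0) / panels
    return sum(h / (1.0 + (1.0 + (k + 0.5) * h) ** 2) for k in range(panels))

# Increasing in N, yet bounded below 1 -- the tail integral converges:
for N in (10, 100, 1000):
    print(N, tail_integral(N))
```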

quasi

Let $f$ be a solution with maximal interval of definition. Notice that $f'(x)\ge1$ for all $x$ in the domain of $f$, so $f$ is a strictly increasing (hence injective) $C^1$ function defined on an interval. From $\frac{f'(x)}{1+(f(x))^2}=1$, we obtain, for some fixed $t_0$ in the domain of $f$ and for all $x$ in the domain of $f$, $$\int_{t_0}^x \frac{f'(t)}{1+(f(t))^2}\,dt=x-t_0\\ G(f(x))-G(f(t_0))=x-t_0,$$

where $G(x)=\int_0^x \frac1{1+t^2}\,dt$. Now, thanks to your favourite estimate (for instance, comparison with $\int_1^\infty t^{-2}\,dt$ gives $|G(x)|\le 1+1=2$), we know that $G$ is bounded, and therefore so is the quantity $x=G(f(x))-G(f(t_0))+t_0$. This bounds the domain of $f$.
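The boundedness of $G$ is easy to check numerically as well. A small sketch (plain Python midpoint rule; the cutoffs are arbitrary choices): $G$ increases but saturates, so the relation $x - t_0 = G(f(x)) - G(f(t_0))$ caps the length of the domain by $2\sup|G|$.

```python
def G(x, panels=200_000):
    """Midpoint-rule estimate of G(x) = integral of 1/(1+t^2) over [0, x]."""
    h = x / panels
    return sum(h / (1.0 + ((k + 0.5) * h) ** 2) for k in range(panels))

# G increases but saturates below the comparison bound
#   |G(x)| <= int_0^1 1 dt + int_1^inf dt/t^2 = 2,
# so |x - t0| = |G(f(x)) - G(f(t0))| can never exceed 2 * 2 = 4.
for N in (1, 10, 100, 10_000):
    print(N, G(N))
```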


You could try to find your solution in the form of a fraction $f=\frac{p}{q}$, aiming for nice non-singular functions $p$ and $q$. Then the roots of $q$ (at which $p$ is non-zero) are poles of the solution, thus limiting its domain.

Inserting $f=\frac{p}{q}$ into the differential equation results in $$ p'q-q'p=q^2+p^2\iff (p'-q)q=p(p+q'). $$ The freedom to impose one additional relation between $p$ and $q$ allows us to extract a nice linear system with globally non-singular solutions \begin{align} p'&=q,\\ q'&=-p. \end{align} One could know, or easily show, that this system describes uniform circular motion, starting from the observation that $p^2+q^2$ is constant. It follows that $q$ indeed has periodic roots that are not roots of $p$, so any solution is only defined on a finite interval.
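This circular motion can be sketched numerically (plain Python; semi-implicit Euler is my choice here because it nearly preserves $p^2+q^2$, and the initial data $p(0)=0$, $q(0)=1$, step size, and horizon are arbitrary assumptions, not part of the answer): the invariant stays essentially constant and $q$ does change sign, cutting off the domain of $f=p/q$.

```python
def first_root_of_q(p0=0.0, q0=1.0, h=1e-5, x_max=5.0):
    """Integrate p' = q, q' = -p with semi-implicit Euler (which nearly
    preserves p^2 + q^2); return (first sign change of q, final p^2 + q^2)."""
    p, q, x = p0, q0, 0.0
    root = None
    while x < x_max:
        p += h * q        # p' = q
        q -= h * p        # q' = -p, using the updated p (semi-implicit)
        x += h
        if root is None and q <= 0.0:
            root = x      # q vanishes here: a pole of f = p/q
    return root, p * p + q * q

root, invariant = first_root_of_q()
print("q first vanishes near x =", root)   # the solution f = p/q ends here
print("p^2 + q^2 stayed near", invariant)
```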

Lutz Lehmann
  • This is a beautiful solution Lutz! Does it remain to be shown that every differentiable function can be expressed in the form $\frac{p}{q}$ for differentiable functions $p$ and $q$? – Alann Rosas Oct 07 '20 at 17:40
  • You could always take $q=1$. Essentially, this is an elementary derivation of the parametrization $f=-\frac{u'}{u}$, i.e. $q=u=\exp\left(-\int f(x)\,dx\right)$, so that $p=-u'$. – Lutz Lehmann Oct 07 '20 at 17:56