22

Is the Lipschitz condition a necessary condition or a sufficient condition for the existence of a unique solution of an Initial Value Problem?


I saw in a book that it is a sufficient condition. But I want an example showing that it is only sufficient, not necessary. That is, I want an example of an I.V.P. of the form $$\frac{dy}{dx}=f(x,y)\text{ , with initial condition } y(x_0)=y_0$$ in which $f(x,y)$ does not satisfy the Lipschitz condition although the I.V.P. has a unique solution.

Also, I saw in Wikipedia that the I.V.P. $\frac{dy}{dx}=y^{1/3}$, with initial condition $y(0)=0$, has three solutions. But how do we get three solutions?

When I solve the equation with the initial condition, I get $y=\left(\frac{2}{3}x\right)^{3/2}$.
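
(Perhaps the three solutions Wikipedia means are the following: besides $y=\left(\frac{2}{3}x\right)^{3/2}$, the constant function $y\equiv 0$ clearly satisfies $y'=y^{1/3}$ with $y(0)=0$, and so does $$y(x)=-\left(\tfrac{2}{3}x\right)^{3/2},\qquad x\ge 0,$$ since then $y'=-\left(\tfrac{2}{3}x\right)^{1/2}=y^{1/3}$. But I am not sure whether these are the only ones.)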

According to uranix's comment, when an I.V.P. has a non-unique solution, the solutions can be put in the form given by uranix. So I think a non-unique solution implies infinitely many solutions. So where does the question of the existence of $2$ or $3$ or $4$ solutions arise from?

I asked about the problem $\frac{dy}{dx}=3y^{2/3}$ with $y(0)=0$ here, and the answer to that question says that there are infinitely many solutions.


Now, the following three questions arise in my mind:

$(1)$ Is there an example of an I.V.P. in which $f(x,y)$ does not satisfy the Lipschitz condition but the I.V.P. has a unique solution?

$(2)$ If an I.V.P. has a non-unique solution, can we say that the I.V.P. has infinitely many solutions?

$(3)$ If the answer to $(2)$ is negative, then how many solutions exist and how do we find them?

Can anyone help me to understand these properly?

Empty
  • 13,252
  • I think what you mean is that you want to prove it's strictly sufficient, but a more adequate term would be "not necessary". – GPerez May 09 '15 at 11:45
  • 2
    There exists no necessary condition for the uniqueness. Look up Osgood's theorem. – Artem May 09 '15 at 11:47
  • @S.Panja-1729 Actually, there are infinitely many solutions to that equation. At least each function of the form $y(x) = \pm \left(\frac{2(x-C)}{3} \right)^{3/2}, x > C$, and zero elsewhere, is a solution. – uranix May 09 '15 at 11:51
  • Yes... My opinion is the same as yours. Then why does Wikipedia say $3$ solutions? – Empty May 09 '15 at 11:53
  • 2
    It doesn't say that it has only three solutions – uranix May 09 '15 at 11:56
  • @Artem I can't find that result under the name of Osgood's theorem... can you specify? – GPerez May 09 '15 at 12:14
  • @uranix: Read this carefully. – Empty May 09 '15 at 13:37
  • @S.Panja-1729 See here. – Artem May 09 '15 at 15:21
  • @Artem: It is not my answer... See my updated question, please. – Empty May 09 '15 at 16:58
  • @S.Panja-1729 This theorem exactly answers your question (1). For an answer to questions (2) and (3) you should pick Hartman, ODE. – Artem May 09 '15 at 19:33
  • How does it answer the 1st question? It gives only the Lipschitz condition, but does not give any example. I want an example... @Artem – Empty May 10 '15 at 00:20
  • I think that what they meant was "three types of solutions", such as those written by @uranix. But then you can "glue" such solutions. The key point here is that after performing such a gluing (in this particular example) you obtain a differentiable function, which in general is not true. – truebaran May 12 '15 at 18:41
  • How do you conclude that there are three solutions? – Empty May 12 '15 at 18:43

4 Answers

21

Answer to Question 1. Lipschitz is sufficient but not necessary.

Fact I. The Lipschitz condition is sufficient for uniqueness. This is a consequence of the Picard–Lindelöf Theorem.

In particular, if $\boldsymbol{f}:D\to\mathbb R^n$ is continuous, where $D\subset \mathbb R^{n+1}$ is open and $\boldsymbol{f}=\boldsymbol{f}(t,\boldsymbol{x})$ with $t\in \mathbb R$ and $\boldsymbol{x}\in \mathbb R^n$, and $\boldsymbol{f}$ is locally Lipschitz with respect to $\boldsymbol{x}$, i.e., for every compact $K\subset D$ there exists an $L_K>0$ such that, for every $(t,\boldsymbol{x}_1),(t,\boldsymbol{x}_2)\in K$, $$ \lvert\,\boldsymbol{f}(t,\boldsymbol{x}_1)-\boldsymbol{f}(t,\boldsymbol{x}_2)\rvert \le L_K\lvert\boldsymbol{x}_1-\boldsymbol{x}_2\rvert, $$ then the IVP $$\boldsymbol{x}'=\boldsymbol{f}(t,\boldsymbol{x}), \quad\boldsymbol{x}(\tau)=\boldsymbol{\xi},$$ possesses a unique solution for every $(\tau,\boldsymbol{\xi})\in D$.
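
For instance, to see how the local Lipschitz condition is checked in practice, take $n=1$ and $f(t,x)=x^2$. Any compact $K\subset\mathbb R^2$ is contained in a strip $\{\lvert x\rvert\le R\}$ for some $R>0$, and there $$ \lvert f(t,x_1)-f(t,x_2)\rvert=\lvert x_1+x_2\rvert\,\lvert x_1-x_2\rvert\le 2R\,\lvert x_1-x_2\rvert, $$ so $f$ is locally Lipschitz with $L_K=2R$, and each IVP $x'=x^2$, $x(\tau)=\xi$ therefore has a unique (local) solution.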

Fact II. The Lipschitz condition is not necessary for uniqueness.

Take for example the IVP $$ x'=f(x), \quad x(\tau)=\xi,\tag{1} $$ where $f:\mathbb R\to\mathbb R$ is just continuous and positive, i.e., $f(x)>0$ for all $x\in\mathbb R$. Then $(1)$ enjoys uniqueness for all $(\tau,\xi)\in\mathbb R^2$. To see this define $$ F(x)=\tau+\int_\xi^x\frac{ds}{f(s)}. $$ Then $F:\mathbb R\to (A_-,A_+)$ is one-to-one and onto, where $A_\pm=\lim_{x\to\pm\infty}F(x)$. Also $F$ is continuously differentiable and strictly increasing, as $F'(x)=1/f(x)>0$, and hence $F$ possesses a continuously differentiable inverse $\varphi : (A_-,A_+)\to\mathbb R$. Clearly, $\varphi(\tau)=F^{-1}(\tau)=\xi$ and $$ \varphi'(t)=\frac{1}{F'\big(F^{-1}(t)\big)}=f\big(F^{-1}(t)\big)=f\big(\varphi(t)\big). $$ Hence $\varphi$ is a solution of $(1)$. Let $\psi: I\to\mathbb R$ be another solution of $(1)$, where $I$ is an open interval containing $\tau$. Then $$ 1=\frac{\psi'(t)}{f\big(\psi(t)\big)}=\Big(F\big(\psi(t)\big)\Big)^{\!\prime}, \quad \text{for all $t\in I$}. $$ Thus $$ t+c=F\big(\psi(t)\big), $$ and for $t=\tau$, $$ \tau+c=F\big(\psi(\tau)\big)=F(\xi)=\tau. $$ Hence $c=0$ and thus $$ t=F\big(\psi(t)\big)=F\big(\varphi(t)\big), $$ and as $F$ is one-to-one, $\psi\equiv\varphi$ on $I$.

So, uniqueness is obtainable even without assuming Lipschitz condition!
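
Concretely, take $f(x)=1+\lvert x\rvert^{1/2}$ and the IVP $x'=1+\lvert x\rvert^{1/2}$, $x(0)=0$. Here $f$ is continuous and positive, but it is not Lipschitz near $x=0$, since $$ \frac{\lvert f(x)-f(0)\rvert}{\lvert x-0\rvert}=\frac{\lvert x\rvert^{1/2}}{\lvert x\rvert}=\lvert x\rvert^{-1/2}\longrightarrow\infty \quad\text{as } x\to 0. $$ Nevertheless, by the argument above this IVP possesses a unique solution.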

Answer to Question 2. If uniqueness is violated, then there are infinitely many solutions. In fact, a continuum of solutions. This is a consequence of Helmut Kneser's Theorem, which can be found in Hartman's ODEs book, page 15.

A simple proof of the 1-dimensional case can be found here.
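
For an explicit illustration of such a continuum, take the equation $y'=3y^{2/3}$, $y(0)=0$, mentioned in the question: for every $c\ge 0$ the function $$ y_c(x)=\begin{cases}0, & x\le c,\\ (x-c)^3, & x> c,\end{cases} $$ is a solution (one checks that $y_c$ is differentiable at $x=c$ with $y_c'(c)=0=3\,y_c(c)^{2/3}$), so the solutions are parametrized by the continuum $c\in[0,\infty)$.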

  • Ok... It is a sufficient condition. But when do we get finitely many solutions ($>1$)? – Empty May 17 '15 at 15:37
  • @S.Panja-1729 See my updated answer. We get infinitely many solutions, if uniqueness is violated! – Yiorgos S. Smyrlis May 17 '15 at 15:59
  • So, in your opinion, if an I.V.P. has a non-unique solution then the I.V.P. has infinitely many solutions?? Is that so?? My confusion is there... Someone says that there may or may not be infinitely many solutions; there may be two or three solutions. – Empty May 17 '15 at 16:08
  • Please see the answer of 'Beni Bogosel'. He says that if the I.V.P. has two solutions and $f$ is continuous then there are infinitely many solutions. – Empty May 17 '15 at 16:13
  • @S.Panja-1729: Beni Bogosel's assertion is correct. That's what I am saying. He is proving a special case of "H. Kneser's Theorem". – Yiorgos S. Smyrlis May 17 '15 at 16:30
  • So, there do not exist finitely many solutions ($>1$)? – Empty May 17 '15 at 16:34
  • @S.Panja-1729: No, there are infinitely many solutions. This is a famous theorem from 1924! – Yiorgos S. Smyrlis May 17 '15 at 16:42
  • Ok... Now it is clear to me... Can you please give an example of an I.V.P. in which $f$ does not satisfy the Lipschitz condition but the I.V.P. has a unique solution? – Empty May 17 '15 at 16:49
  • Example: $x'=1+\lvert x\rvert^{1/2}$, $x(0)=0$. Here $f(x)=1+\lvert x\rvert^{1/2}$ does not satisfy Lipschitz at $x=0$, but the IVP possesses a unique solution. The proof is in my answer, under the Answer to Question 1. – Yiorgos S. Smyrlis May 17 '15 at 16:55
  • Is the proof in Fact II Grönwall's inequality theorem? – Tutusaus Nov 04 '22 at 12:43
2

Take

$$F(x,y) = \begin{cases} -y\log y &\text{for $y\in (0,1)$}\\ 0&\text{for $y=0$} \end{cases}$$

in any rectangle where it makes sense.

Then for $c\in (0,1)$ the problem $$\begin{cases} y' = F(x,y)\\ y(c) = 0\end{cases}$$ has exactly one solution. But $F$ is not Lipschitz.
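
A quick way to see the uniqueness (a sketch of the usual Osgood-type argument, not spelled out above): if a solution with $y(c)=0$ reached a value $\eta\in(0,1)$ at some later time $t_1$, then separating variables on the last interval where $y$ climbs from $0$ to $\eta$ would require at least $$ \int_0^\eta\frac{dy}{-y\log y}=\Big[-\log(-\log y)\Big]_{0}^{\eta}=+\infty $$ units of time, which is impossible. Hence $y\equiv 0$ is the only solution, even though $F$ is not Lipschitz at $y=0$ (its $y$-derivative $-\log y-1$ blows up there).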

The answer to your other two questions is: it depends on the IVP.

That means you can find problems with an infinite number of solutions, and there are problems with a finite number of solutions, such as

$$\begin{cases} (f'(x))^2 = x\\ f(0) = 0 \end{cases}$$ say $x\in [0,1]$.
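
To see why this particular problem has only finitely many solutions (a sketch): $(f'(x))^2=x$ forces $f'(x)=\pm\sqrt{x}$ at each point, and since a derivative has the intermediate value property, $f'$ cannot change sign on $(0,1]$. So the only solutions on $[0,1]$ are $$ f(x)=\tfrac{2}{3}x^{3/2} \quad\text{and}\quad f(x)=-\tfrac{2}{3}x^{3/2}, $$ exactly two of them. Note also that this problem is not of the form $y'=f(x,y)$, so Kneser's theorem does not apply to it.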

You can find sufficient conditions of the kind "under these circumstances, there are $n$ solutions", but I doubt something general can be said.

leo
  • 10,769
  • Thanks for your attention to my question... But how do we check whether there are infinitely many solutions or finitely many ($>1$) solutions? – Empty May 10 '15 at 17:34
  • 1
    Another question :) How can you say that the problem $$\begin{cases} (f'(x))^2 = x\\ f(0) = 0 \end{cases}$$ for $x\in [0,1]$ has a finite number of solutions? – Empty May 10 '15 at 18:55
  • Another question :) In your example $$\frac{dy}{dx}=-y\log y$$ with initial condition $y(0)=0$, how do you argue that it has exactly one solution? I solved the problem and got $\log y=Ce^{-x}$. Then with the initial condition I cannot eliminate the constant $C$. So I think there are infinitely many solutions. Where is my mistake?? – Empty May 11 '15 at 03:31
  • To your first question: for $x\in[0,1]$ that problem is equivalent to two IVPs. To your second question: there's a typo I'm about to correct. – leo May 12 '15 at 00:55
1

Answer for $2$: If an IVP has two solutions and $f$ is continuous (not necessarily Lipschitz), then it has infinitely many solutions. I present a proof below, but the rough idea is as follows:

  • Suppose there are at least two solutions; then there is a point $a\neq x_0$ and two solutions $y_1,y_2$ such that $y_1(a)\neq y_2(a)$.

  • Take any point $(a,b)$ on the segment joining $(a,y_1(a))$ and $(a,y_2(a))$, and consider the IVP with initial condition $y(a)=b$.

  • If this new solution does not intersect the other solutions, then we are done. If this solution intersects another one, then note that at the intersection point we can choose whichever branch we want, and it will still be a solution (mean value theorem).

  • Thus, for every point on the considered segment we can find a solution through it that also satisfies the original initial condition, and each of these can be extended to its whole maximal interval of existence.

Details follow.

Problem. Suppose $ f:\mathbb{R}^2 \to \Bbb{R}$ is continuous and $ t_0, x_0 \in \Bbb{R}$. Prove that if the Cauchy Problem $ \begin{cases} \dot{x}=f(t,x) \\ x(t_0)=x_0 \end{cases} $ has two distinct solutions then it has infinitely many solutions.

Proof: Without loss of generality we may assume that $x_0=t_0=0$. Then there exist two solutions $x_1,x_2$ of the Cauchy problem such that they differ at a point $a$, which we may assume is greater than $0$. Therefore assume $x_1(a) < x_2(a)$ and denote by $P(a,h)$ a point on the segment $\{a\} \times (x_1(a),x_2(a))$. From Cauchy's existence theorem, there exists a solution $x$ around $P$ of the problem $\begin{cases} \dot{x}=f(t,x) \\ x(a)=h \end{cases}$, and we can extend $x$ towards $0$. Denote by $K$ the compact set determined by $x_1,x_2$ and the line $t=a$. Since $x$ stays in that compact set in a left neighborhood of $a$, by the compact extension theorem it can be extended until it reaches the boundary of $K$. From the intersection point of the graph of $x$ with the boundary of $K$ we can go along the graph of $x_1$ or $x_2$ until we reach $(0,0)$, and by the corollary of the mean value theorem the graph we choose is the graph of a solution to the initial differential equation.

Thus, we can find a solution $x_h$ for every $h \in (x_1(a),x_2(a))$, and therefore the initial equation has uncountably many solutions.
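
For instance, in the notation of the question, take $y'=y^{1/3}$, $y(0)=0$, with the two solutions $y_1\equiv 0$ and $y_2(x)=\left(\frac{2}{3}x\right)^{3/2}$, fix $a>0$, and pick any $b$ with $0<b<\left(\frac{2}{3}a\right)^{3/2}$. The solution through $(a,b)$ produced by the gluing above is $$ y(x)=\begin{cases}0, & x\le c,\\ \left(\frac{2}{3}(x-c)\right)^{3/2}, & x> c,\end{cases} \qquad c=a-\frac{3}{2}\,b^{2/3}\in(0,a), $$ which still satisfies $y(0)=0$; varying $b$ over the segment gives the promised continuum of solutions.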

Beni Bogosel
  • 23,891
  • Ok.. You stated & proved that if an I.V.P. has two solutions and $f$ is continuous then the I.V.P. has uncountably many solutions. Here, does "two solutions" mean two non-trivial solutions? Or may one be the trivial solution $y=0$ and the other non-trivial? – Empty May 16 '15 at 08:35
  • 1
    Two solutions means exactly that. One may be trivial, there's no problem. – Beni Bogosel May 16 '15 at 11:18
  • 1
    So, for the problem $\frac{dy}{dx}=y^{1/3}$ with initial condition $y(0)=0$ there are two solutions: $y=0$ and $y=\left(\frac{2}{3}x\right)^{3/2}$. Also the function $f(x,y)=y^{1/3}$ is continuous. So this problem has infinitely many solutions. So, is Wikipedia wrong? – Empty May 16 '15 at 12:41
  • Another question :) Is your condition only sufficient? – Empty May 16 '15 at 12:45
  • @S.Panja-1729: Wikipedia is only as right as the people contributing to it. So the question "is wikipedia wrong?" can well be answered "Yes" in plenty of situations. You can see in the comments to your questions that the equation you mention has infinitely many solutions. – Beni Bogosel May 16 '15 at 13:31
  • I do not get the question regarding the sufficiency. Can you please detail what you mean? – Beni Bogosel May 16 '15 at 13:33
  • Is the continuity condition sufficient? – Empty May 17 '15 at 15:29
  • Continuity alone is not sufficient. Continuity plus two different solutions gives an infinity of solutions. Recall that $f$ Lipschitz (which implies continuous) gives a unique solution. – Beni Bogosel May 17 '15 at 16:06
1

Answer to the first part of your question regarding $\frac{dy}{dx}=y^{1/3}$:
The answer given by uranix is indeed correct. The solutions of the ODE are $$ y(x)=\begin{cases} \left(\dfrac{2(x-c)}{3}\right)^{3/2}, & x\geq c,\\ 0, & x< c. \end{cases} $$ As the value of $c$ changes, you get infinitely many solutions. Note that there are not $2$ or $3$ or $4$ solutions, but infinitely many.
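
One point worth checking (as noted in the comments under the question, it is the key point): the glued function really is differentiable at $x=c$, since the right-hand derivative is $$ \lim_{x\to c^+}\left(\frac{2(x-c)}{3}\right)^{1/2}=0, $$ which matches the left-hand derivative $0$ of the zero piece, and both equal $y(c)^{1/3}=0$; so each glued $y$ is a genuine solution on all of $\mathbb R$.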

Answer to your subsequent questions:
$(1)$ The Lipschitz condition is not necessary for a unique solution; it is a sufficient one. An example is the IVP $$\frac{dy}{dx}=\frac{1}{y^2}\quad,\quad y(x_0)=0.$$ This does not satisfy the Lipschitz condition, but a unique solution, $y(x)=(3(x-x_0))^{1/3}$, exists for every initial point $x_0$ on the $x$-axis.
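
(A quick check that this is indeed a solution: $$ y'(x)=\frac{1}{3}\,\big(3(x-x_0)\big)^{-2/3}\cdot 3=\big(3(x-x_0)\big)^{-2/3}=\frac{1}{y(x)^2}, $$ and $y(x_0)=0$. Here $f(x,y)=1/y^2$ is not even continuous at $y=0$, let alone Lipschitz, yet the IVP has exactly one solution.)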

$(2)$ The obvious answer is yes. Any IVP with non-unique solutions has infinitely many solutions. If an IVP with $f$ continuous (not necessarily Lipschitz) has more than one solution, then it has uncountably many solutions. This is known as Kneser's Theorem.

Hyperion
  • 75
  • 6