7

Consider the system $$ \begin{aligned} \dot{x}_1 &= -x_1+x_2,\\ \dot{x}_2 &= -x_2^3. \end{aligned} $$ The origin is obviously globally asymptotically stable. Indeed, $x_2\to 0$, and the $x_1$-dynamics form a stable linear system driven by the converging input $x_2$, so $x_1\to 0$ as well.
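For what it's worth, a quick simulation is consistent with this (a minimal forward-Euler sketch in Python; the step size, horizon, and initial condition are arbitrary choices):

```python
import math

# Forward-Euler integration of x1' = -x1 + x2, x2' = -x2^3
# (step size, horizon, and initial condition chosen arbitrarily)
x1, x2 = 2.0, -1.0
dt = 0.01
for _ in range(20000):  # integrate up to t = 200
    x1, x2 = x1 + dt * (-x1 + x2), x2 + dt * (-x2**3)

print(math.hypot(x1, x2))  # the state has decayed close to the origin
```

Note the convergence of $x_2$ is only polynomial in time ($x_2(t)\sim t^{-1/2}$), which is why the horizon is taken fairly long.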

However, I got stuck trying to find a Lyapunov function for this system. Could you please help?

Arastas
  • 2,443

3 Answers

5

Define $v=x_1-x_2$. Then $\dot{v} = -x_1+x_2+x_2^3 = -v+x_2^3$. Consider the system $$ \begin{aligned} \dot{v} &= -v + x_2^3, \\ \dot{x}_2 &= -x_2^3. \end{aligned} $$ Choose $V = v^2+\frac{1}{2}x_2^4$. Then $$\dot{V} = -2v^2 + 2vx_2^3-2x_2^6.$$ Since $2vx_2^3 \le v^2+x_2^6$ we have $$\dot{V}\le -v^2-x_2^6.$$ Thus the origin $(v,x_2)=(0,0)$ is globally asymptotically stable, and the same holds for $(x_1,x_2)$.
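As a sanity check, the bound can be verified numerically at random points (a small Python sketch; the sampling ranges are arbitrary):

```python
import random

random.seed(0)
worst = float("-inf")
for _ in range(10000):
    v = random.uniform(-10.0, 10.0)
    x2 = random.uniform(-3.0, 3.0)
    Vdot = -2 * v**2 + 2 * v * x2**3 - 2 * x2**6
    # Young's inequality 2*v*x2^3 <= v^2 + x2^6 gives Vdot <= -v^2 - x2^6;
    # track the worst slack: Vdot + v^2 + x2^6 = -(v - x2^3)^2 <= 0
    worst = max(worst, Vdot + v**2 + x2**6)

print(worst)  # <= 0 up to floating-point rounding
```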

Arastas
  • 2,443
1

The best I could find uses SOSTOOLS (see also the manual, especially chapter 4.2), but it's not a "nice" solution:

$$ V(x) = v(x)^TP v(x) $$

$$ v(x) = \begin{pmatrix} x_1 \\ x_2 \\ x_1^2 \\ x_1 x_2 \\ x_2^2 \end{pmatrix} $$

$$ P = \begin{pmatrix} 0.2270 & -0.2270 & 0.0000 & 0.0000 & -0.0000 \\ -0.2270 & 0.9522 & -0.0000 & -0.0000 &0.0000 \\ 0.0000 & -0.0000 & 0.3215 & -0.0000 & -0.0278 \\ 0.0000 & -0.0000 & -0.0000 & 0.5520 & 0.0293 \\ -0.0000 & 0.0000 & -0.0278 & 0.0293 & 0.3014 \end{pmatrix} $$

At least $P > 0$ and $\dot{V}(x) \leq 0$, so it should be a Lyapunov function, but unfortunately not enough for asymptotic stability...
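That $P > 0$ can be checked directly, e.g. via a Cholesky factorization in plain Python (the entries are copied from the printed matrix above, with the signed zeros dropped):

```python
import math

# P as printed above (rounded to four decimals, signed zeros dropped)
P = [
    [ 0.2270, -0.2270,  0.0000,  0.0000,  0.0000],
    [-0.2270,  0.9522,  0.0000,  0.0000,  0.0000],
    [ 0.0000,  0.0000,  0.3215,  0.0000, -0.0278],
    [ 0.0000,  0.0000,  0.0000,  0.5520,  0.0293],
    [ 0.0000,  0.0000, -0.0278,  0.0293,  0.3014],
]

def cholesky_pivots(A):
    """Return the diagonal pivots of a Cholesky factorization; they are
    all positive (and the factorization succeeds) iff A is symmetric
    positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)  # raises if not PD
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return [L[i][i] for i in range(n)]

pivots = cholesky_pivots(P)
print(pivots)  # all positive, so the printed P is indeed > 0
```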

EDIT: If you do not take the full set of monomials up to degree four but only

$$ m(x) = \begin{pmatrix} x_1^2 \\ x_2^2 \\ x_1 x_2 \\ x_2^4 \end{pmatrix} $$

for the optimization then SOSTOOLS computes

$$ \begin{align} V(x) &= (0.51201 x_1 - 0.21772 x_2)^2 + (0.21772 x_1 - 0.90981 x_2)^2 + 0.41052 x_2^4 \\ &= 0.30955 x_1^2 - 0.61911 x_1 x_2 + 0.41052 x_2^4 + 0.87515 x_2^2 \end{align} $$

and

$$ \begin{align} \dot{V}(x) = &-(-1.268 x_2^3 + 0.00000073 x_1 x_2^2 - 0.1309082 x_2 + 0.130908 x_1)^2 \\ &-(0.1309082 x_2^3 + 0.0000034 x_1 x_2^2 + 0.5486238 x_2 - 0.5486187 x_1)^2 \\ &-(0.130908 x_2^3 + 0.0000036 x_1 x_2^2 + 0.5486187 x_2 - 0.5486244 x_1)^2 \\ &- (1.063507 x_2^2 + 0.00006736922 x_1 x_2)^2 \\ &- (0.00000073 x_2^3 - 0.0000019 x_1 x_2^2 - 0.0000034 x_2 + 0.0000036 x_1)^2 \\ &- (0.00006736922 x_2^2 + 0.002748198 x_1 x_2)^2 \end{align} $$

If you scale $V$ this is just:

$$ \frac{V(x)}{0.30955} = x_1^2 - 2 x_1 x_2 + 1.326 x_2^4 + 2.827 x_2^2 $$

Your solution is:

$$ V(x) = x_1^2 - 2 x_1 x_2 + \frac{1}{2} x_2^4 + x_2^2 $$

So it finds a somewhat similar solution but the $\dot{V}$ is of course quite horrendous.
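Incidentally, for your $V$ one can write $-\dot{V}$ as an explicit sum of squares by hand, namely $-\dot{V} = (x_1 - x_2)^2 + x_2^6 + (x_1 - x_2 - x_2^3)^2$ (my own algebra, not SOSTOOLS output); a quick numerical spot check in Python:

```python
import random

random.seed(1)

def rhs(x1, x2):
    """Vector field: x1' = -x1 + x2, x2' = -x2^3."""
    return -x1 + x2, -x2**3

def Vdot(x1, x2):
    """Time derivative of V = x1^2 - 2 x1 x2 + x2^4/2 + x2^2 along the flow."""
    dx1, dx2 = rhs(x1, x2)
    return (2*x1 - 2*x2) * dx1 + (-2*x1 + 2*x2**3 + 2*x2) * dx2

worst = 0.0
for _ in range(10000):
    x1 = random.uniform(-5.0, 5.0)
    x2 = random.uniform(-2.0, 2.0)
    # candidate sum-of-squares decomposition of -Vdot
    sos = (x1 - x2)**2 + x2**6 + (x1 - x2 - x2**3)**2
    worst = max(worst, abs(Vdot(x1, x2) + sos))

print(worst)  # ~0 up to rounding, so -Vdot matches the sum of squares
```

This decomposition vanishes only at the origin, so your $V$ has a negative definite $\dot{V}$.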

SampleTime
  • 3,651
  • Sorry, what is SOS-tool? How do you check that $\dot{V}\le 0$? And, more precisely, what is the set $\Omega=\{x:\dot{V}(x)=0\}$? – Arastas Jan 08 '21 at 23:36
  • 1
    @Arastas I updated the answer with a link. $\dot{V} \leq 0$ because $-\dot{V}$ is a sum of squares. The set $\Omega$ you mention is exactly the problem (this is why I wrote it is "not enough for asymptotic stability") because this set seems to be quite complicated (i.e. too complicated to use invariance arguments). – SampleTime Jan 09 '21 at 14:01
  • @Arastas Especially the system on the bottom of page 36 of the linked manual is very similar to your system. But for some reason it doesn't seem to find your much easier solution for your system, which is interesting. – SampleTime Jan 09 '21 at 14:14
  • thanks, it looks interesting! – Arastas Jan 09 '21 at 20:21
  • 1
    Maybe, the SOS tool searches for a quadratic Lyapunov function via the constraint $V(x)-\epsilon (x_1^2+x_2^2)\ge 0$? Then it is not the case for my solution. – Arastas Jan 09 '21 at 20:25
  • 1
    @Arastas I updated the answer after some more tests. If you use a reduced set of monomials it finds a similar solution but with much more complicated $\dot{V}$. – SampleTime Jan 09 '21 at 21:38
1

When $x_2$ gets close to the origin, its derivative quickly becomes very small; meanwhile, the dynamics of $x_1$ drive $x_1$ closer to $x_2$. So $x_1$ should be driven to some manifold defined by $x_2$. In order to also utilize this behavior in the Lyapunov function, one could define it as

$$ V(x) = \frac{1}{2} (x_1 - f(x_2))^2 + \frac{1}{2} x_2^2. \tag{1} $$

Taking the time derivative of this yields

$$ \dot{V}(x) = (x_1 - f(x_2)) \left(-x_1 + x_2 + \frac{d\,f(x_2)}{d\,x_2} x_2^3\right) - x_2^4. \tag{2} $$

If there exists a function $f(x)$ such that

$$ x + \frac{d\,f(x)}{d\,x} x^3 = f(x), \tag{3} $$

then $(2)$ can also be written as

$$ \dot{V}(x) = (x_1 - f(x_2)) \left(-x_1 + f(x_2)\right) - x_2^4 = -(x_1 - f(x_2))^2 - x_2^4, \tag{4} $$

which would be negative definite.

The equation from $(3)$ is equivalent to the following differential equation in $f(x)$

$$ \frac{d\,f(x)}{d\,x} = \frac{f(x) - x}{x^3}, \tag{5} $$

which can be brought into a form that is easier to solve by using the integrating factor $e^{1/(2\,x^2)}$, yielding

\begin{align} \frac{d}{d\,x}\left(e^{\frac{1}{2\,x^2}}\,f(x)\right) &= -\frac{e^{\frac{1}{2\,x^2}}}{x^2}, \tag{6a} \\ e^{\frac{1}{2\,x^2}}\,f(x) &= -\int \frac{e^{\frac{1}{2\,x^2}}}{x^2} dx. \tag{6b} \end{align}

The integral from $(6b)$ can be shown to equal

$$ \int \frac{e^{\frac{1}{2\,x^2}}}{x^2} dx = -\sqrt{\frac{\pi}{2}}\, \text{erfi}\!\left(\frac{1}{\sqrt{2}\,x}\right) + c. \tag{7} $$

Therefore, the general solution for $f(x)$ can be expressed as

$$ f(x) = e^{\frac{-1}{2\,x^2}} \left(\sqrt{\frac{\pi}{2}}\, \text{erfi}\!\left(\frac{1}{\sqrt{2}\,x}\right) + c\right). \tag{8} $$
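As a spot check that $(8)$ satisfies $(3)$, here is a pure-Python sketch for $c=0$ (erfi is evaluated by Simpson quadrature and $f'$ by a central difference, so the tolerance is deliberately loose):

```python
import math

def erfi(z, n=2000):
    """Imaginary error function via composite Simpson quadrature (n even)."""
    h = z / n
    s = 1.0 + math.exp(z * z)
    for k in range(1, n):
        t = k * h
        s += (4 if k % 2 else 2) * math.exp(t * t)
    return (2.0 / math.sqrt(math.pi)) * (h / 3.0) * s

def f(x):
    """Solution (8) with c = 0."""
    return math.exp(-1.0 / (2.0 * x * x)) * math.sqrt(math.pi / 2.0) \
        * erfi(1.0 / (math.sqrt(2.0) * x))

worst = 0.0
for x in (0.8, 1.0, 1.5, 2.0):
    h = 1e-5
    df = (f(x + h) - f(x - h)) / (2.0 * h)     # central difference for f'
    worst = max(worst, abs(x + df * x**3 - f(x)))  # residual of (3)

print(worst)  # small, so (8) satisfies (3) at these sample points
```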


Since $(8)$ contains expressions which are not very common (mainly the imaginary error function), I also tried to approximate it with a "simpler" expression, and I got a decent approximation for $c=0$ using

$$ f(x) \approx \frac{x + 5.506\,x^3 + 1.012\,x^5}{1 + 2.175\,x^2 + 6.107\,x^4 + x^6}. \tag{9} $$

The expression from $(8)$ is not well defined at $x=0$ (although the limit does exist), while the expression from $(9)$ does not suffer from this. However, I have not yet been able to show that $\dot{V}$ is indeed negative definite when using this expression for $f(x)$.
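A quick numerical comparison of $(9)$ against $(8)$ with $c=0$ (pure Python; erfi is evaluated by Simpson quadrature, and the sample points and tolerance are my own arbitrary choices):

```python
import math

def erfi(z, n=2000):
    """Imaginary error function via composite Simpson quadrature (n even)."""
    h = z / n
    s = 1.0 + math.exp(z * z)
    for k in range(1, n):
        t = k * h
        s += (4 if k % 2 else 2) * math.exp(t * t)
    return (2.0 / math.sqrt(math.pi)) * (h / 3.0) * s

def f_exact(x):
    """Exact solution (8) with c = 0."""
    return math.exp(-1.0 / (2.0 * x * x)) * math.sqrt(math.pi / 2.0) \
        * erfi(1.0 / (math.sqrt(2.0) * x))

def f_approx(x):
    """Rational approximation (9)."""
    return (x + 5.506 * x**3 + 1.012 * x**5) / \
           (1.0 + 2.175 * x**2 + 6.107 * x**4 + x**6)

worst = max(abs(f_approx(x) - f_exact(x)) / abs(f_exact(x))
            for x in (0.5, 1.0, 2.0))
print(worst)  # worst relative error on these sample points
```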