Lyapunov Function Search
Adapted from: SOSTOOLS' SOSDEMO2 (see Section 4.2 of the SOSTOOLS User's Manual)
using DynamicPolynomials
@polyvar x[1:3]
(DynamicPolynomials.PolyVar{true}[x₁, x₂, x₃],)
Below, we define the vector field $\text{d}x/\text{d}t = f$:
f = [-x[1]^3 - x[1] * x[3]^2,
-x[2] - x[1]^2 * x[2],
-x[3] - 3x[3] / (x[3]^2 + 1) + 3x[1]^2 * x[3]]
3-element Vector{MultivariatePolynomials.RationalPoly{DynamicPolynomials.Polynomial{true, Int64}, DynamicPolynomials.Polynomial{true, Int64}}}:
(-x₁³ - x₁x₃²) / (1)
(-x₁²x₂ - x₂) / (1)
(3x₁²x₃³ + 3x₁²x₃ - x₃³ - 4x₃) / (x₃² + 1)
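Since the denominator of each entry is nonzero at the origin, the origin is an equilibrium exactly when every numerator vanishes there. As a quick sanity check (a minimal sketch using subs from MultivariatePolynomials, which DynamicPolynomials re-exports), we can substitute $x = 0$ into each numerator:
[subs(fi.num, x => [0, 0, 0]) for fi in f]  # each entry should be the zero polynomial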
We need to pick an SDP solver; see the JuMP documentation for a list of the available choices. We use SOSModel instead of Model to be able to use the >= syntax for Sum-of-Squares constraints.
using SumOfSquares
using CSDP
solver = optimizer_with_attributes(CSDP.Optimizer, MOI.Silent() => true)
model = SOSModel(solver);
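Any SDP solver with a MathOptInterface wrapper can be substituted here. As an illustration (assuming the SCS package is installed; SCS is a first-order solver, so it is typically less accurate than CSDP), the model could instead be built with:
using SCS
solver = optimizer_with_attributes(SCS.Optimizer, MOI.Silent() => true)
model = SOSModel(solver);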
We search for a Lyapunov function $V(x)$ built from the monomials $x_1^2$, $x_2^2$, and $x_3^2$. We first define this monomial basis:
monos = x.^2
3-element Vector{DynamicPolynomials.Monomial{true}}:
x₁²
x₂²
x₃²
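Here we fix the basis to the squares of the variables, which is enough for this example. More generally, one could search over a richer basis; as a sketch (this enlarges the resulting SDP), monomials from DynamicPolynomials generates all monomials of a given degree, including cross terms such as $x_1x_2$:
monos_full = monomials(x, 2)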
We now define the Lyapunov function as a polynomial decision variable with these monomials:
@variable(model, V, Poly(monos))
\[ (_[1])x_{1}^{2} + (_[2])x_{2}^{2} + (_[3])x_{3}^{2} \]
We need to make sure that the Lyapunov function is strictly positive. We can do this with the constraint $V(x) \ge \epsilon (x_1^2 + x_2^2 + x_3^2)$; let's pick $\epsilon = 1$:
@constraint(model, V >= sum(x.^2))
\[ (_[1] - 1)x_{1}^{2} + (_[2] - 1)x_{2}^{2} + (_[3] - 1)x_{3}^{2} \text{ is SOS} \]
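The >= syntax above is shorthand for membership in the SOS cone; had we not already added the constraint, an equivalent explicit formulation (shown as an alternative, not to be added on top of the one above) would be:
@constraint(model, V - sum(x.^2) in SOSCone())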
We now compute $\text{d}V/\text{d}t = \text{d}V/\text{d}x \cdot f$:
using LinearAlgebra # Needed for `dot`
dVdt = dot(differentiate(V, x), f)
\[ \frac{(-2 _[1])x_{1}^{4}x_{3}^{2} + (-2 _[2])x_{1}^{2}x_{2}^{2}x_{3}^{2} + (-2 _[1] + 6 _[3])x_{1}^{2}x_{3}^{4} + (-2 _[1])x_{1}^{4} + (-2 _[2])x_{1}^{2}x_{2}^{2} + (-2 _[1] + 6 _[3])x_{1}^{2}x_{3}^{2} + (-2 _[2])x_{2}^{2}x_{3}^{2} + (-2 _[3])x_{3}^{4} + (-2 _[2])x_{2}^{2} + (-8 _[3])x_{3}^{2}}{x_{3}^{2} + 1} \]
The denominator $x_3^2 + 1$ is strictly positive, so the sign of dVdt is the same as the sign of its numerator.
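We can confirm this by inspecting the den field of the RationalPoly directly (the counterpart of the num field used below):
dVdt.den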
P = dVdt.num
\[ (-2 _[1])x_{1}^{4}x_{3}^{2} + (-2 _[2])x_{1}^{2}x_{2}^{2}x_{3}^{2} + (-2 _[1] + 6 _[3])x_{1}^{2}x_{3}^{4} + (-2 _[1])x_{1}^{4} + (-2 _[2])x_{1}^{2}x_{2}^{2} + (-2 _[1] + 6 _[3])x_{1}^{2}x_{3}^{2} + (-2 _[2])x_{2}^{2}x_{3}^{2} + (-2 _[3])x_{3}^{4} + (-2 _[2])x_{2}^{2} + (-8 _[3])x_{3}^{2} \]
Hence, we constrain this numerator to be nonpositive:
@constraint(model, P <= 0)
\[ (2 _[1])x_{1}^{4}x_{3}^{2} + (2 _[2])x_{1}^{2}x_{2}^{2}x_{3}^{2} + (2 _[1] - 6 _[3])x_{1}^{2}x_{3}^{4} + (2 _[1])x_{1}^{4} + (2 _[2])x_{1}^{2}x_{2}^{2} + (2 _[1] - 6 _[3])x_{1}^{2}x_{3}^{2} + (2 _[2])x_{2}^{2}x_{3}^{2} + (2 _[3])x_{3}^{4} + (2 _[2])x_{2}^{2} + (8 _[3])x_{3}^{2} \text{ is SOS} \]
The model is ready to be optimized by the solver:
JuMP.optimize!(model)
We verify that the solver has found a feasible solution:
JuMP.primal_status(model)
FEASIBLE_POINT::ResultStatusCode = 1
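It is also worth checking why the solver stopped; with CSDP this should report OPTIMAL:
JuMP.termination_status(model)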
We can now obtain this feasible solution with:
value(V)
\[ 15.718362431619164x_{1}^{2} + 12.28610732110516x_{2}^{2} + 2.995845325502817x_{3}^{2} \]
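As a final sanity check (a minimal sketch; the sample point below is arbitrary, and the call syntax relies on DynamicPolynomials' substitution p(x => values)), we can plug the numeric solution back into $\text{d}V/\text{d}t$ and verify that its numerator is nonpositive at a test point:
Vsol = value(V)                            # numeric Lyapunov candidate
Psol = dot(differentiate(Vsol, x), f).num  # numerator of dV/dt for the solution
Psol(x => [0.5, -1.0, 2.0])                # should evaluate to a nonpositive number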
This page was generated using Literate.jl.