Advanced Features
DCP Errors
When you attempt to solve a problem involving an expression that is not of DCP form, an error is thrown. For example,
julia> using Convex, SCS
julia> x = Variable();
julia> y = Variable();
julia> p = minimize(log(x) + square(y), [x >= 0, y >= 0]);
julia> solve!(p, SCS.Optimizer; silent = true)
┌ Warning: Problem not DCP compliant: objective is not DCP
└ @ Convex ~/.julia/dev/Convex/src/problems.jl:73
ERROR: DCPViolationError: Expression not DCP compliant. This either means that your problem is not convex, or that we could not prove it was convex using the rules of disciplined convex programming. For a list of supported operations, see https://jump.dev/Convex.jl/stable/operations/. For help writing your problem as a disciplined convex program, please post a reproducible example on https://jump.dev/forum.
Stacktrace:
[...]
See Extended formulations and the DCP ruleset for more discussion on why these errors occur.
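To make the fix concrete, here is a hedged sketch (not from the original docs) of a DCP-compliant reformulation: maximizing the concave expression log(x) - square(y) is accepted by the ruleset, whereas minimizing log(x) + square(y) is not. An upper bound on x is added so the problem is bounded.

```julia
using Convex, SCS

x = Variable()
y = Variable()

# maximize(concave) is DCP-compliant; minimize(log(x) + square(y)) is not.
# The bound x <= 3 keeps the problem from being unbounded above.
p = maximize(log(x) - square(y), [x >= 0, x <= 3])
solve!(p, SCS.Optimizer; silent = true)

# The optimum is attained at x = 3 (log is increasing) and y = 0.
evaluate(x), evaluate(y)
```

Note that this is a different (bounded, concave-maximization) problem, shown only to illustrate an objective the DCP rules accept.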
Dual Variables
Convex.jl also returns the optimal dual variables for a problem. These are stored in the dual field associated with each constraint.
julia> using Convex, SCS
julia> x = Variable();
julia> constraint = x >= 0;
julia> p = minimize(x, [constraint]);
julia> solve!(p, SCS.Optimizer; silent = true)
Problem statistics
  problem is DCP         : true
  number of variables    : 1 (1 scalar elements)
  number of constraints  : 1 (1 scalar elements)
  number of coefficients : 1
  number of atoms        : 1

Solution summary
  termination status : OPTIMAL
  primal status      : FEASIBLE_POINT
  dual status        : FEASIBLE_POINT
  objective value    : 0.0

Expression graph
  minimize
   └─ real variable (id: 334…873)
  subject to
   └─ ≥ constraint (affine)
      └─ + (affine; real)
         ├─ real variable (id: 334…873)
         └─ [0;;]
julia> constraint.dual
0.999999976532818
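As an illustrative (hypothetical) extension of this example, duals can be used to see which constraints are active at the optimum: a binding constraint gets a nonzero multiplier, while a slack one gets a dual of (numerically) zero.

```julia
using Convex, SCS

x = Variable()
binding = x >= 1    # active at the optimum x = 1
slack = x <= 10     # inactive at the optimum

p = minimize(x, [binding, slack])
solve!(p, SCS.Optimizer; silent = true)

binding.dual  # close to 1.0: this constraint holds the solution in place
slack.dual    # close to 0.0: this constraint does not bind
```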
Warmstarting
If you're solving the same problem many times with different values of a parameter, Convex.jl can initialize many solvers with the solution to the previous problem, which sometimes speeds up the solution time. This is called a warm start.
To use this feature, pass the optional argument warmstart = true to the solve! method.
using Convex, SCS
n = 1_000
y = rand(n);
x = Variable(n)
lambda = Variable(Positive())
fix!(lambda, 100)
problem = minimize(sumsquares(y - x) + lambda * sumsquares(x - 10))
@time solve!(problem, SCS.Optimizer)
# Now warmstart. If the solver takes advantage of warmstarts, this run will be
# faster
fix!(lambda, 105)
@time solve!(problem, SCS.Optimizer; warmstart = true)
Fixing and freeing variables
Convex.jl allows you to fix a variable x to a value by calling the fix! method. Fixing the variable essentially turns it into a constant. Fixed variables are sometimes also called parameters.

fix!(x, v) fixes the variable x to the value v.

fix!(x) fixes x to its current value, which might be the value obtained by solving another problem involving the variable x.

To allow the variable x to vary again, call free!(x).
Fixing and freeing variables can be particularly useful as a tool for performing alternating minimization on nonconvex problems. For example, we can find an approximate solution to a nonnegative matrix factorization problem with alternating minimization as follows. We use warmstarts to speed up the solution.
n, k = 10, 1
A = rand(n, k) * rand(k, n)
x = Variable(n, k)
y = Variable(k, n)
problem = minimize(sumsquares(A - x * y), [x >= 0, y >= 0])
# initialize value of y
set_value!(y, rand(k, n))
# we'll do 10 iterations of alternating minimization
for i in 1:10
# first solve for x. With y fixed, the problem is convex.
fix!(y)
solve!(problem, SCS.Optimizer; warmstart = i > 1)
# Now solve for y with x fixed at the previous solution.
free!(y)
fix!(x)
solve!(problem, SCS.Optimizer; warmstart = true)
free!(x)
end
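Continuing the loop above (a sketch, not from the original text; A, x, and y are as defined there), one way to check the quality of the recovered factorization is the residual norm:

```julia
using LinearAlgebra

# Frobenius-norm residual of the factorization found by the loop above.
residual = norm(A - evaluate(x) * evaluate(y))
@show residual  # should be small if the alternating minimization converged
```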
Custom Variable Types
By making subtypes of Convex.AbstractVariable that conform to the appropriate interface (see the Convex.AbstractVariable docstring for details), one can easily provide custom variable types for specific constructions. These aren't always necessary, though; for example, one can define the following function probability_vector:
using Convex
function probability_vector(d::Int)
x = Variable(d, Positive())
add_constraint!(x, sum(x) == 1)
return x
end
probability_vector (generic function with 1 method)
and then use, say, p = probability_vector(3) in any Convex.jl problem. The constraints that the entries of p are nonnegative and sum to 1 will be automatically added to any problem in which p is used.
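For instance (a hypothetical usage, with made-up data), projecting a point onto the probability simplex needs no explicit simplex constraints, because they travel with the variable returned by probability_vector:

```julia
using Convex, SCS

p = probability_vector(3)            # defined above
target = [0.7, 0.5, -0.2]            # illustrative data, not from the docs

# The nonnegativity and sum-to-one constraints are added automatically.
problem = minimize(sumsquares(p - target))
solve!(problem, SCS.Optimizer; silent = true)

evaluate(p)  # entries are nonnegative and sum to (approximately) 1
```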
Custom types are necessary when one wants to dispatch on custom variables, use them as callable types, or provide a different implementation. Continuing with the probability vector example, let's say we often use probability vectors variables in taking expectation values, and we want to use function notation for this. To do so, we define:
julia> using Convex, LinearAlgebra
julia> mutable struct ProbabilityVector <: Convex.AbstractVariable
           head::Symbol
           size::Tuple{Int,Int}
           value::Union{Convex.Value,Nothing}
           vexity::Convex.Vexity
           function ProbabilityVector(d)
               return new(:ProbabilityVector, (d, 1), nothing, Convex.AffineVexity())
           end
       end
julia> Convex.get_constraints(p::ProbabilityVector) = [ sum(p) == 1 ]
julia> Convex.sign(::ProbabilityVector) = Convex.Positive()
julia> Convex.vartype(::ProbabilityVector) = Convex.ContVar
julia> (p::ProbabilityVector)(x) = dot(p, x)
Custom variable types must be mutable; otherwise, two variables with the same size and value would be treated as the same object.
Then one can call p = ProbabilityVector(3) to construct our custom variable, which can be used in Convex.jl, already encodes the appropriate constraints (its entries are nonnegative and sum to 1), and can act on constants via p(x). For example,
julia> using SCS
julia> p = ProbabilityVector(3)
Variable
size: (3, 1)
sign: positive
vexity: affine
id: 614…455
julia> prob = minimize(p([1.0, 2.0, 3.0]))
Problem statistics
  problem is DCP         : true
  number of variables    : 1 (3 scalar elements)
  number of constraints  : 1 (1 scalar elements)
  number of coefficients : 4
  number of atoms        : 4

Solution summary
  termination status : OPTIMIZE_NOT_CALLED
  primal status      : NO_SOLUTION
  dual status        : NO_SOLUTION

Expression graph
  minimize
   └─ sum (affine; positive)
      └─ .* (affine; positive)
         ├─ 3-element positive variable (id: 614…455)
         └─ [1.0; 2.0; 3.0;;]
  subject to
   └─ == constraint (affine)
      └─ + (affine; real)
         ├─ sum (affine; positive)
         │  └─ …
         └─ [-1;;]
julia> solve!(prob, SCS.Optimizer; silent = false);
[ Info: [Convex.jl] Compilation finished: 0.16 seconds, 5.896 MiB of memory allocated
------------------------------------------------------------------
           SCS v3.2.3 - Splitting Conic Solver
    (c) Brendan O'Donoghue, Stanford University, 2012
------------------------------------------------------------------
problem:  variables n: 3, constraints m: 4
cones:    z: primal zero / dual free vars: 1
          l: linear vars: 3
settings: eps_abs: 1.0e-04, eps_rel: 1.0e-04, eps_infeas: 1.0e-07
          alpha: 1.50, scale: 1.00e-01, adaptive_scale: 1
          max_iters: 100000, normalize: 1, rho_x: 1.00e-06
          acceleration_lookback: 10, acceleration_interval: 10
lin-sys:  sparse-direct-amd-qdldl
          nnz(A): 6, nnz(P): 0
------------------------------------------------------------------
 iter | pri res | dua res |   gap   |   obj   |  scale  | time (s)
------------------------------------------------------------------
     0| 1.94e+01  3.06e+00  4.02e+01 -1.60e+01  1.00e-01  4.99e-05
    50| 1.09e-04  7.07e-05  1.94e-04  1.00e+00  1.00e-01  7.51e-05
------------------------------------------------------------------
status:  solved
timings: total: 7.58e-05s = setup: 3.83e-05s + solve: 3.75e-05s
         lin-sys: 7.83e-06s, cones: 4.36e-06s, accel: 2.22e-06s
------------------------------------------------------------------
objective = 1.000168
------------------------------------------------------------------
julia> evaluate(p)
3-element Vector{Float64}:
 0.9998127159848165
 0.00010875388968836632
 7.823604474042554e-5
Subtypes of AbstractVariable must have the fields head and size. Then they must also:

- either have a field value, or implement Convex._value and Convex.set_value!;
- either have a field vexity, or implement Convex.vexity and Convex.vexity! (though the latter is only necessary if you wish to support Convex.fix! and Convex.free!);
- have a field constraints, or implement Convex.get_constraints (optionally, implement Convex.add_constraint! to be able to add constraints to your variable after its creation);
- either have a field sign, or implement Convex.sign; and
- either have a field vartype, or implement Convex.vartype (optionally, implement Convex.vartype! to be able to change a variable's vartype after construction).
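To make the field-versus-method trade-off concrete, here is a hedged sketch (the type UnitInterval is illustrative and not part of Convex.jl) of a custom variable that satisfies the interface entirely through fields, in contrast to ProbabilityVector above, which supplies sign, vartype, and constraints through method overloads:

```julia
using Convex

# A scalar variable constrained to [0, 1]. Every interface requirement is
# met by a field: value, vexity, constraints, sign, and vartype.
mutable struct UnitInterval <: Convex.AbstractVariable
    head::Symbol
    size::Tuple{Int,Int}
    value::Union{Convex.Value,Nothing}
    vexity::Convex.Vexity
    constraints::Vector{Convex.Constraint}
    sign::Convex.Sign
    vartype::Convex.VarType
    function UnitInterval()
        u = new(:UnitInterval, (1, 1), nothing, Convex.AffineVexity(),
                Convex.Constraint[], Convex.Positive(), Convex.ContVar)
        # The lower bound comes from the Positive sign; add the upper bound.
        add_constraint!(u, u <= 1)
        return u
    end
end
```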
Printing and the tree structure
A Convex problem is structured as a tree, with the root being the problem object, with branches to the objective and the set of constraints. The objective is an AbstractExpr
which itself is a tree, with each atom being a node and having children
which are other atoms, variables, or constants. Convex provides children
methods from AbstractTrees.jl so that the tree-traversal functions of that package can be used with Convex.jl problems and structures. This is what powers the printing of problems, expressions, and constraints. The depth to which the tree corresponding to a problem, expression, or constraint is printed is controlled by the global variable Convex.MAXDEPTH, which defaults to 3. This can be changed by, for example, setting
Convex.MAXDEPTH[] = 5
Likewise, Convex.MAXWIDTH, which defaults to 15, controls the "width" of the printed tree: when printing a problem with 20 constraints, for example, only the first MAXWIDTH constraints are printed, and vertical dots, ⋮, indicate that some constraints were omitted.
A related setting is Convex.MAXDIGITS, which controls the printing of the internal IDs of atoms: if the string representation of an ID is longer than double the value of MAXDIGITS, it is shortened by printing only the first and last MAXDIGITS characters.
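Putting the three settings together (a sketch; the specific values are illustrative), each is a Ref and is assigned with []:

```julia
using Convex

Convex.MAXDEPTH[] = 5    # print expression trees to depth 5 (default 3)
Convex.MAXWIDTH[] = 20   # show up to 20 children per level (default 15)
Convex.MAXDIGITS[] = 2   # IDs longer than 2 * MAXDIGITS chars are shortened
```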
The AbstractTrees methods can also be used to analyze the structure of a Convex.jl problem. For example,
using Convex, AbstractTrees
x = Variable()
p = maximize( log(x), x >= 1, x <= 3 )
for leaf in AbstractTrees.Leaves(p)
println("Here's a leaf: $(summary(leaf))")
end
Here's a leaf: real variable (id: 125…131)
Here's a leaf: real variable (id: 125…131)
Here's a leaf: negative constant
Here's a leaf: real variable (id: 125…131)
Here's a leaf: negative constant
We can also iterate over the problem in various orders. The following descriptions are taken from the AbstractTrees.jl docstrings, which have more information.
PostOrderDFS
Iterator to visit the nodes of a tree, guaranteeing that children will be visited before their parents.
for (i, node) in enumerate(AbstractTrees.PostOrderDFS(p))
println("Here's node $i via PostOrderDFS: $(summary(node))")
end
Here's node 1 via PostOrderDFS: real variable (id: 125…131)
Here's node 2 via PostOrderDFS: log (concave; real)
Here's node 3 via PostOrderDFS: real variable (id: 125…131)
Here's node 4 via PostOrderDFS: negative constant
Here's node 5 via PostOrderDFS: + (affine; real)
Here's node 6 via PostOrderDFS: ≥ constraint (affine)
Here's node 7 via PostOrderDFS: real variable (id: 125…131)
Here's node 8 via PostOrderDFS: negative constant
Here's node 9 via PostOrderDFS: + (affine; real)
Here's node 10 via PostOrderDFS: ≤ constraint (affine)
Here's node 11 via PostOrderDFS: 2-element Vector{Constraint}
Here's node 12 via PostOrderDFS: Problem{Float64} (concave; real)
PreOrderDFS
Iterator to visit the nodes of a tree, guaranteeing that parents will be visited before their children.
for (i, node) in enumerate(AbstractTrees.PreOrderDFS(p))
println("Here's node $i via PreOrderDFS: $(summary(node))")
end
Here's node 1 via PreOrderDFS: Problem{Float64} (concave; real)
Here's node 2 via PreOrderDFS: log (concave; real)
Here's node 3 via PreOrderDFS: real variable (id: 125…131)
Here's node 4 via PreOrderDFS: 2-element Vector{Constraint}
Here's node 5 via PreOrderDFS: ≥ constraint (affine)
Here's node 6 via PreOrderDFS: + (affine; real)
Here's node 7 via PreOrderDFS: real variable (id: 125…131)
Here's node 8 via PreOrderDFS: negative constant
Here's node 9 via PreOrderDFS: ≤ constraint (affine)
Here's node 10 via PreOrderDFS: + (affine; real)
Here's node 11 via PreOrderDFS: real variable (id: 125…131)
Here's node 12 via PreOrderDFS: negative constant
StatelessBFS
Iterator to visit the nodes of a tree, guaranteeing that all nodes of a level will be visited before their children.
for (i, node) in enumerate(AbstractTrees.StatelessBFS(p))
println("Here's node $i via StatelessBFS: $(summary(node))")
end
Here's node 1 via StatelessBFS: Problem{Float64} (concave; real)
Here's node 2 via StatelessBFS: log (concave; real)
Here's node 3 via StatelessBFS: 2-element Vector{Constraint}
Here's node 4 via StatelessBFS: real variable (id: 125…131)
Here's node 5 via StatelessBFS: ≥ constraint (affine)
Here's node 6 via StatelessBFS: ≤ constraint (affine)
Here's node 7 via StatelessBFS: + (affine; real)
Here's node 8 via StatelessBFS: + (affine; real)
Here's node 9 via StatelessBFS: real variable (id: 125…131)
Here's node 10 via StatelessBFS: negative constant
Here's node 11 via StatelessBFS: real variable (id: 125…131)
Here's node 12 via StatelessBFS: negative constant