Convex Optimization in Julia

Madeleine Udell | ISMP 2015

Convex.jl team

  • Convex.jl: Madeleine Udell, Karanveer Mohan, David Zeng, Jenny Hong

Collaborators/Inspiration:

  • CVX: Michael Grant, Stephen Boyd
  • CVXPY: Steven Diamond, Eric Chu, Stephen Boyd
  • JuliaOpt: Miles Lubin, Iain Dunning, Joey Huchette

Initial package installation
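
If the packages are not already installed, they can be added with Julia's package manager (a one-time step; this assumes the registered package names Convex and SCS):

using Pkg
Pkg.add(["Convex", "SCS"])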

Make the Convex.jl module available

using Convex, SparseArrays, LinearAlgebra
using SCS # first order splitting conic solver [O'Donoghue et al., 2014]

Generate random problem data

m = 50;
n = 100;
A = randn(m, n)
x♮ = sprand(n, 1, 0.5) # true (sparse nonnegative) parameter vector
noise = 0.1 * randn(m)    # gaussian noise
b = A * x♮ + noise      # noisy linear observations
50×1 Matrix{Float64}:
 -6.021565529578327
  0.4862497893470268
  1.1660980056235264
  4.490736756364103
  5.251290917775126
 -4.5714554579907
 -3.002120988244739
 -2.2829575237564588
  3.21402710163009
  2.9665701034562697
  ⋮
  6.883109824569101
  0.5011905912894424
  1.2741470859779214
  2.4085866355099887
 -1.610486215207384
  1.7996228043647744
 -4.758465080400447
 -0.7828865793142671
 -3.0420623369536117

Create a (column vector) variable of size n.

x = Variable(n)
Variable
size: (100, 1)
sign: real
vexity: affine
id: 120…649

Nonnegative elastic net with regularization
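
In mathematical form, the problem constructed below is the nonnegative elastic net: minimize $\|Ax - b\|_2^2 + \lambda \|x\|_2^2 + \mu \|x\|_1$ subject to $x \ge 0$, over the variable $x \in \mathbf{R}^n$.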

λ = 1
μ = 1
problem = minimize(
    square(norm(A * x - b)) + λ * square(norm(x)) + μ * norm(x, 1),
    x >= 0,
)
Problem statistics
  problem is DCP         : true
  number of variables    : 1 (100 scalar elements)
  number of constraints  : 1 (100 scalar elements)
  number of coefficients : 5_155
  number of atoms        : 14

Solution summary
  termination status : OPTIMIZE_NOT_CALLED
  primal status      : NO_SOLUTION
  dual status        : NO_SOLUTION

Expression graph
  minimize
   └─ + (convex; positive)
      ├─ qol_elem (convex; positive)
      │  ├─ norm2 (convex; positive)
      │  │  └─ …
      │  └─ [1.0;;]
      ├─ * (convex; positive)
      │  ├─ [1;;]
      │  └─ qol_elem (convex; positive)
      │     ├─ …
      │     └─ …
      └─ * (convex; positive)
         ├─ [1;;]
         └─ sum (convex; positive)
            └─ …
  subject to
   └─ ≥ constraint (affine)
      └─ + (affine; real)
         ├─ 100-element real variable (id: 120…649)
         └─ Convex.NegateAtom (constant; negative)
            └─ …

Solve the problem by calling solve!

solve!(problem, SCS.Optimizer; silent = true)

println("problem status is ", problem.status)
println("optimal value is ", problem.optval)
problem status is OPTIMAL
optimal value is 25.689290176242185

Interact and Plots can be used to explore interactively how the solution changes as the regularization weights λ and μ vary:

using Interact, Plots
# Interact.WebIO.install_jupyter_nbextension() # might be helpful if you see `WebIO` warnings in Jupyter
@manipulate throttle = 0.1 for λ in 0:0.1:5, μ in 0:0.1:5
    global A
    problem = minimize(
        square(norm(A * x - b)) + λ * square(norm(x)) + μ * norm(x, 1),
        x >= 0,
    )
    solve!(problem, SCS.Optimizer; silent = true)
    histogram(evaluate(x), xlims = (0, 3.5), label = "x")
end

Quick convex prototyping

Variables

Scalar variable

x = Variable()
Variable
size: (1, 1)
sign: real
vexity: affine
id: 922…147

(Column) vector variable

y = Variable(4)
Variable
size: (4, 1)
sign: real
vexity: affine
id: 173…771

Matrix variable

Z = Variable(4, 4)
Variable
size: (4, 4)
sign: real
vexity: affine
id: 147…754
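
Convex.jl also has constructors for structured variables; for example (using the Semidefinite constructor exported by Convex.jl), a positive semidefinite matrix variable can be declared directly:

S = Semidefinite(4)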

Expressions

Convex.jl allows you to use a wide variety of functions on variables and on expressions to form new expressions.

x + 2x
+ (affine; real)
├─ real variable (id: 922…147)
└─ * (affine; real)
   ├─ [2;;]
   └─ real variable (id: 922…147)
e = y[1] + logdet(Z) + sqrt(x) + minimum(y)
+ (concave; real)
├─ index (affine; real)
│  └─ 4-element real variable (id: 173…771)
├─ logdet (concave; real)
│  └─ 4×4 real variable (id: 147…754)
├─ geomean (concave; positive)
│  ├─ real variable (id: 922…147)
│  └─ [1.0;;]
⋮

Examine the expression tree

e.children[2]
logdet (concave; real)
└─ 4×4 real variable (id: 147…754)
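
The curvature of any expression can also be queried directly (using the vexity function exported by Convex.jl):

vexity(e)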

Constraints

A constraint is convex if convex combinations of feasible points are also feasible. Equivalently, the feasible set of a convex constraint is a convex set.

In other words, convex constraints are of the form

  • convexExpr <= 0
  • concaveExpr >= 0
  • affineExpr == 0
x <= 0
≤ constraint (affine)
└─ + (affine; real)
   ├─ real variable (id: 922…147)
   └─ [0;;]
square(x) <= sum(y)
≤ constraint (convex)
└─ + (convex; real)
   ├─ qol_elem (convex; positive)
   │  ├─ real variable (id: 922…147)
   │  └─ [1.0;;]
   └─ Convex.NegateAtom (affine; real)
      └─ sum (affine; real)
         └─ …
M = Z
for i in 1:length(y)
    global M += rand(size(Z)...) * y[i]
end
M ⪰ 0
PSD constraint (convex)
└─ + (affine; real)
   ├─ 4×4 real variable (id: 147…754)
   ├─ * (affine; real)
   │  ├─ 4×4 Matrix{Float64}
   │  └─ index (affine; real)
   │     └─ …
   ├─ * (affine; real)
   │  ├─ 4×4 Matrix{Float64}
   │  └─ index (affine; real)
   │     └─ …
   ⋮

Problems

x = Variable()
y = Variable(4)
objective = 2 * x + 1 - sqrt(sum(y))
constraint = x >= maximum(y)
p = minimize(objective, constraint)
Problem statistics
  problem is DCP         : true
  number of variables    : 2 (5 scalar elements)
  number of constraints  : 1 (1 scalar elements)
  number of coefficients : 3
  number of atoms        : 8

Solution summary
  termination status : OPTIMIZE_NOT_CALLED
  primal status      : NO_SOLUTION
  dual status        : NO_SOLUTION

Expression graph
  minimize
   └─ + (convex; real)
      ├─ * (affine; real)
      │  ├─ [2;;]
      │  └─ real variable (id: 821…646)
      ├─ [1;;]
      └─ Convex.NegateAtom (convex; negative)
         └─ geomean (concave; positive)
            ├─ …
            └─ …
  subject to
   └─ ≥ constraint (convex)
      └─ + (concave; real)
         ├─ real variable (id: 821…646)
         └─ Convex.NegateAtom (concave; real)
            └─ …

Solve the problem:

solve!(p, SCS.Optimizer; silent = true)
p.status
OPTIMAL::TerminationStatusCode = 1
evaluate(x)
0.2499992372452304

You can also evaluate expressions directly:

evaluate(objective)
0.4999989328331398

Pass to solver

Call a MathOptInterface-compatible solver suited to your problem class.

To solve the problem using a different solver, just import the solver package and pass its optimizer to the solve! method:

using Mosek
solve!(p, Mosek.Optimizer)
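
For example, assuming the ECOS.jl wrapper is installed, the open-source ECOS solver can be used in exactly the same way:

using ECOS
solve!(p, ECOS.Optimizer; silent = true)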

Warmstart

Generate random problem data:

m = 50;
n = 100;
A = randn(m, n)
x♮ = sprand(n, 1, 0.5) # true (sparse nonnegative) parameter vector
noise = 0.1 * randn(m)    # gaussian noise
b = A * x♮ + noise      # noisy linear observations
50×1 Matrix{Float64}:
 -1.0813938133106962
  4.0264046756355345
 -4.039348933441501
 -5.279272338197081
 -4.616032868559761
  5.306945990702305
 -2.883570886421047
  3.475336050338825
  7.140299677062581
  2.9045225655784774
  ⋮
  2.4118139170692223
  0.7628503149946742
  2.172206721398036
 -0.06148079589661848
  2.057471254928378
 -1.5832099506496735
 -0.8160964195043132
  0.934467104884041
  6.959327482987776

Create a (column vector) variable of size n.

x = Variable(n)
Variable
size: (100, 1)
sign: real
vexity: affine
id: 178…432

Nonnegative elastic net with regularization

λ = 1
μ = 1
problem = minimize(
    square(norm(A * x - b)) + λ * square(norm(x)) + μ * norm(x, 1),
    x >= 0,
)
@time solve!(problem, SCS.Optimizer; silent = true)
λ = 1.5 # change the regularization weight
problem = minimize( # re-form the problem so the new λ takes effect
    square(norm(A * x - b)) + λ * square(norm(x)) + μ * norm(x, 1),
    x >= 0,
)
@time solve!(problem, SCS.Optimizer; silent = true)#, warmstart = true) # FIXME
Problem statistics
  problem is DCP         : true
  number of variables    : 1 (100 scalar elements)
  number of constraints  : 1 (100 scalar elements)
  number of coefficients : 5_155
  number of atoms        : 14

Solution summary
  termination status : OPTIMAL
  primal status      : FEASIBLE_POINT
  dual status        : FEASIBLE_POINT
  objective value    : 24.8818

Expression graph
  minimize
   └─ + (convex; positive)
      ├─ qol_elem (convex; positive)
      │  ├─ norm2 (convex; positive)
      │  │  └─ …
      │  └─ [1.0;;]
      ├─ * (convex; positive)
      │  ├─ [1;;]
      │  └─ qol_elem (convex; positive)
      │     ├─ …
      │     └─ …
      └─ * (convex; positive)
         ├─ [1;;]
         └─ sum (convex; positive)
            └─ …
  subject to
   └─ ≥ constraint (affine)
      └─ + (affine; real)
         ├─ 100-element real variable (id: 178…432)
         └─ Convex.NegateAtom (constant; negative)
            └─ …

DCP examples

  • affine
x = Variable(4)
y = Variable(2)
sum(x) + y[2]
+ (affine; real)
├─ sum (affine; real)
│  └─ 4-element real variable (id: 157…260)
└─ index (affine; real)
   └─ 2-element real variable (id: 511…645)
  • convex (mixing affine and convex atoms)
2 * maximum(x) + 4 * sum(y) - sqrt(y[1] + x[1]) - 7 * minimum(x[2:4])
+ (convex; real)
├─ * (convex; real)
│  ├─ [2;;]
│  └─ maximum (convex; real)
│     └─ 4-element real variable (id: 157…260)
├─ * (affine; real)
│  ├─ [4;;]
│  └─ sum (affine; real)
│     └─ 2-element real variable (id: 511…645)
├─ Convex.NegateAtom (convex; negative)
│  └─ geomean (concave; positive)
│     ├─ + (affine; real)
│     │  ├─ …
│     │  └─ …
│     └─ [1.0;;]
⋮
  • not DCP compliant: the sum of a concave term (log) and a convex term (square) has no curvature guarantee under the DCP rules
log(x) + square(x)
+ (Convex.NotDcp; real)
├─ log (concave; real)
│  └─ 4-element real variable (id: 157…260)
└─ qol_elem (convex; positive)
   ├─ 4-element real variable (id: 157…260)
   └─ 4×1 Matrix{Float64}
  • Composition $f\circ g$ where $f$ is convex increasing and $g$ is convex
square(pos(x))
qol_elem (convex; positive)
├─ max (convex; positive)
│  ├─ 4-element real variable (id: 157…260)
│  └─ [0;;]
└─ 4×1 Matrix{Float64}
  • Composition $f\circ g$ where $f$ is convex decreasing and $g$ is concave
invpos(sqrt(x))
qol_elem (convex; positive)
├─ 4×1 Matrix{Float64}
└─ geomean (concave; positive)
   ├─ 4-element real variable (id: 157…260)
   └─ 4×1 Matrix{Float64}
  • Composition $f\circ g$ where $f$ is concave increasing and $g$ is concave
sqrt(sqrt(x))
geomean (concave; positive)
├─ geomean (concave; positive)
│  ├─ 4-element real variable (id: 157…260)
│  └─ 4×1 Matrix{Float64}
└─ 4×1 Matrix{Float64}
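
These composition examples all follow the standard DCP composition rule: $f \circ g$ is convex if $f$ is convex and nondecreasing and $g$ is convex, or if $f$ is convex and nonincreasing and $g$ is concave; symmetrically, $f \circ g$ is concave if $f$ is concave and nondecreasing and $g$ is concave, or if $f$ is concave and nonincreasing and $g$ is convex.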

This page was generated using Literate.jl.