Getting started with JuMP

This tutorial is aimed at providing a quick introduction to writing and solving optimization models with JuMP.

If you're new to Julia, start by reading Getting started with Julia.

What is JuMP?

JuMP ("Julia for Mathematical Programming") is an open-source modeling language that is embedded in Julia. It allows users to formulate various classes of optimization problems (linear, mixed-integer, quadratic, conic quadratic, semidefinite, and nonlinear) with easy-to-read code. These problems can then be solved using state-of-the-art open-source and commercial solvers.

JuMP also makes advanced optimization techniques easily accessible from a high-level language.

What is a solver?

A solver is a software package that incorporates algorithms for finding solutions to one or more classes of problem.

For example, GLPK is a solver for linear programming (LP) and mixed integer programming (MIP) problems. It incorporates algorithms such as the simplex method and the interior-point method.

The Supported-solvers table lists the open-source and commercial solvers that JuMP currently supports.

What is MathOptInterface?

Each solver has its own concepts and data structures for representing optimization models and obtaining results.

MathOptInterface (MOI) is an abstraction layer that JuMP uses to convert from the problem written in JuMP to the solver-specific data structures for each solver.

MOI can be used directly, or through a higher-level modeling interface like JuMP.

Because JuMP is built on top of MOI, you'll often see the MathOptInterface. prefix displayed when JuMP types are printed. However, you'll only need to understand and interact with MOI to accomplish advanced tasks such as creating solver-independent callbacks.

Installation

JuMP is a package for Julia. From Julia, JuMP is installed by using the built-in package manager.

import Pkg
Pkg.add("JuMP")

You also need to include a Julia package which provides an appropriate solver. One such solver is GLPK.Optimizer, which is provided by the GLPK.jl package.

import Pkg
Pkg.add("GLPK")

See Installation Guide for a list of other solvers you can use.

An example

Let's solve the following linear programming problem using JuMP and GLPK. We will first look at the complete code to solve the problem and then go through it step by step.

Here's the problem:

\[\begin{aligned} & \min & 12x + 20y \\ & \;\;\text{s.t.} & 6x + 8y \geq 100 \\ & & 7x + 12y \geq 120 \\ & & x \geq 0 \\ & & y \in [0, 3] \\ \end{aligned}\]

And here's the code to solve this problem:

using JuMP
using GLPK
model = Model(GLPK.Optimizer)
@variable(model, x >= 0)
@variable(model, 0 <= y <= 3)
@objective(model, Min, 12x + 20y)
@constraint(model, c1, 6x + 8y >= 100)
@constraint(model, c2, 7x + 12y >= 120)
print(model)
optimize!(model)
@show termination_status(model)
@show primal_status(model)
@show dual_status(model)
@show objective_value(model)
@show value(x)
@show value(y)
@show shadow_price(c1)
@show shadow_price(c2)
Min 12 x + 20 y
Subject to
 c1 : 6 x + 8 y ≥ 100.0
 c2 : 7 x + 12 y ≥ 120.0
 x ≥ 0.0
 y ≥ 0.0
 y ≤ 3.0
termination_status(model) = MathOptInterface.OPTIMAL
primal_status(model) = MathOptInterface.FEASIBLE_POINT
dual_status(model) = MathOptInterface.FEASIBLE_POINT
objective_value(model) = 204.99999999999997
value(x) = 15.000000000000005
value(y) = 1.249999999999996
shadow_price(c1) = -0.24999999999999922
shadow_price(c2) = -1.5000000000000007

Step-by-step

Once JuMP is installed, to use JuMP in your programs, write:

using JuMP

We also need to include a Julia package which provides an appropriate solver. We want to use GLPK.Optimizer here, which is provided by the GLPK.jl package.

using GLPK

JuMP builds problems incrementally in a Model object. Create a model by passing an optimizer to the Model function:

model = Model(GLPK.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: GLPK

Variables are modeled using @variable:

@variable(model, x >= 0)

\[ x \]

They can have lower and upper bounds.

@variable(model, 0 <= y <= 3)

\[ y \]

The objective is set using @objective:

@objective(model, Min, 12x + 20y)

\[ 12 x + 20 y \]

Constraints are modeled using @constraint. Here, c1 and c2 are the names of our constraints.

@constraint(model, c1, 6x + 8y >= 100)

c1 : $ 6 x + 8 y \geq 100.0 $

@constraint(model, c2, 7x + 12y >= 120)

c2 : $ 7 x + 12 y \geq 120.0 $

Call print to display the model:

print(model)
Min 12 x + 20 y
Subject to
 c1 : 6 x + 8 y ≥ 100.0
 c2 : 7 x + 12 y ≥ 120.0
 x ≥ 0.0
 y ≥ 0.0
 y ≤ 3.0

To solve the optimization problem, call the optimize! function.

optimize!(model)
Info

The ! after optimize is part of the name. It's nothing special. Julia has a convention that functions which mutate their arguments should end in !. A common example is push!.
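For example, push! appends an element to a vector in place:

v = [1, 2]
push!(v, 3)  # mutates v; v is now [1, 2, 3]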

Now let's see what information we can query about the solution.

termination_status tells us why the solver stopped:

termination_status(model)
OPTIMAL::TerminationStatusCode = 1

In this case, the solver found an optimal solution.

Check primal_status to see if the solver found a primal feasible point:

primal_status(model)
FEASIBLE_POINT::ResultStatusCode = 1

and dual_status to see if the solver found a dual feasible point:

dual_status(model)
FEASIBLE_POINT::ResultStatusCode = 1

Now we know that our solver found an optimal solution, and that it has a primal and a dual solution to query.

Query the objective value using objective_value:

objective_value(model)
205.0

the primal solution using value:

value(x)
14.999999999999993
value(y)
1.2500000000000047

and the dual solution using shadow_price:

shadow_price(c1)
-0.249999999999999
shadow_price(c2)
-1.5000000000000007

That's it for our simple model. In the rest of this tutorial, we expand on some of the basic JuMP operations.

Model basics

Create a model by passing an optimizer:

model = Model(GLPK.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: GLPK

Alternatively, call set_optimizer at any point before calling optimize!:

model = Model()
set_optimizer(model, GLPK.Optimizer)

For some solvers, you can also use direct_model, which offers a more efficient connection to the underlying solver:

model = direct_model(GLPK.Optimizer())
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: DIRECT
Solver name: GLPK
Warning

Some solvers do not support direct_model!

Solver Options

Pass options to solvers with optimizer_with_attributes:

model = Model(optimizer_with_attributes(GLPK.Optimizer, "msg_lev" => 0))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: GLPK
Note

These options are solver-specific. To find out the various options available, see the GitHub README of the individual solver packages. The link to each solver's GitHub page is in the Supported solvers table.

You can also pass options with set_optimizer_attribute:

model = Model(GLPK.Optimizer)
set_optimizer_attribute(model, "msg_lev", 0)

Solution basics

We saw above how to use termination_status and primal_status to understand the solution returned by the solver.

However, only query solution attributes like value and objective_value if there is an available solution. Here's a recommended way to check:

function solve_infeasible()
    model = Model(GLPK.Optimizer)
    @variable(model, 0 <= x <= 1)
    @variable(model, 0 <= y <= 1)
    @constraint(model, x + y >= 3)
    @objective(model, Max, x + 2y)
    optimize!(model)
    if termination_status(model) != OPTIMAL
        @warn("The model was not solved correctly.")
        return nothing
    end
    return value(x), value(y)
end

solve_infeasible()
┌ Warning: The model was not solved correctly.
└ @ Main getting_started_with_JuMP.md:301

Variable basics

Let's create a new empty model to explain some of the variable syntax:

model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

Variable bounds

All of the variables we have created so far have had a bound. We can also create a free variable.

@variable(model, free_x)

\[ free\_x \]

While creating a variable, instead of using the <= and >= syntax, we can also use the lower_bound and upper_bound keyword arguments.

@variable(model, keyword_x, lower_bound = 1, upper_bound = 2)

\[ keyword\_x \]

We can query whether a variable has a bound using the has_lower_bound and has_upper_bound functions. The values of the bound can be obtained using the lower_bound and upper_bound functions.

has_upper_bound(keyword_x)
true
upper_bound(keyword_x)
2.0

Note that querying the value of a bound that does not exist will result in an error:

lower_bound(free_x)
Variable free_x does not have a lower bound.
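To avoid the error, you can guard the query with has_lower_bound; for example:

if has_lower_bound(free_x)
    lower_bound(free_x)  # only queried when the bound exists
end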

Containers

We have already seen how to add a single variable to a model using the @variable macro. Now let's look at ways to add multiple variables to a model.

JuMP provides data structures for adding collections of variables to a model. These data structures are referred to as containers and are of three types: Arrays, DenseAxisArrays, and SparseAxisArrays.

Arrays

JuMP arrays are created when you have integer indices that start at 1:

@variable(model, a[1:2, 1:2])
2×2 Matrix{VariableRef}:
 a[1,1]  a[1,2]
 a[2,1]  a[2,2]

Create an n-dimensional variable $x \in \mathbb{R}^n$ with bounds $l \le x \le u$ ($l, u \in \mathbb{R}^n$) as follows:

n = 10
l = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
u = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19]

@variable(model, l[i] <= x[i = 1:n] <= u[i])
10-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]
 x[4]
 x[5]
 x[6]
 x[7]
 x[8]
 x[9]
 x[10]

We can also create variable bounds that depend upon the indices:

@variable(model, y[i = 1:2, j = 1:2] >= 2i + j)
2×2 Matrix{VariableRef}:
 y[1,1]  y[1,2]
 y[2,1]  y[2,2]

DenseAxisArrays

DenseAxisArrays are used when the indices are not one-based integer ranges. The syntax is similar except with an arbitrary vector as an index as opposed to a one-based range:

@variable(model, z[i = 2:3, j = 1:2:3] >= 0)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
    Dimension 1, 2:3
    Dimension 2, 1:2:3
And data, a 2×2 Matrix{VariableRef}:
 z[2,1]  z[2,3]
 z[3,1]  z[3,3]

Indices do not have to be integers. They can be any Julia type:

@variable(model, w[1:5, ["red", "blue"]] <= 1)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
    Dimension 1, Base.OneTo(5)
    Dimension 2, ["red", "blue"]
And data, a 5×2 Matrix{VariableRef}:
 w[1,red]  w[1,blue]
 w[2,red]  w[2,blue]
 w[3,red]  w[3,blue]
 w[4,red]  w[4,blue]
 w[5,red]  w[5,blue]

SparseAxisArrays

SparseAxisArrays are created when the indices do not form a rectangular set. For example, this applies when indices have a dependence upon previous indices (called triangular indexing):

@variable(model, u[i = 1:2, j = i:3])
JuMP.Containers.SparseAxisArray{VariableRef, 2, Tuple{Int64, Int64}} with 5 entries:
  [1, 1]  =  u[1,1]
  [1, 2]  =  u[1,2]
  [1, 3]  =  u[1,3]
  [2, 2]  =  u[2,2]
  [2, 3]  =  u[2,3]

We can also conditionally create variables by adding a comparison check that depends upon the named indices and is separated from the indices by a semi-colon ;:

@variable(model, v[i = 1:9; mod(i, 3) == 0])
JuMP.Containers.SparseAxisArray{VariableRef, 1, Tuple{Int64}} with 3 entries:
  [3]  =  v[3]
  [6]  =  v[6]
  [9]  =  v[9]

Integrality

JuMP can create binary and integer variables. Binary variables are constrained to the set $\{0, 1\}$, and integer variables are constrained to the set $\mathbb{Z}$.

Integer variables

Create an integer variable by passing Int:

@variable(model, integer_x, Int)

\[ integer\_x \]

or setting the integer keyword to true:

@variable(model, integer_z, integer = true)

\[ integer\_z \]

Binary variables

Create a binary variable by passing Bin:

@variable(model, binary_x, Bin)

\[ binary\_x \]

or setting the binary keyword to true:

@variable(model, binary_z, binary = true)

\[ binary\_z \]

Constraint basics

We'll need a new model to explain some of the constraint basics:

model = Model()
@variable(model, x)
@variable(model, y)
@variable(model, z[1:10]);

Containers

Just as we had containers for variables, JuMP also provides Arrays, DenseAxisArrays, and SparseAxisArrays for storing collections of constraints. Examples for each container type are given below.

Arrays

@constraint(model, [i = 1:3], i * x <= i + 1)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 x ≤ 2.0
 2 x ≤ 3.0
 3 x ≤ 4.0

DenseAxisArrays

@constraint(model, [i = 1:2, j = 2:3], i * x <= j + 1)
2-dimensional DenseAxisArray{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape},2,...} with index sets:
    Dimension 1, Base.OneTo(2)
    Dimension 2, 2:3
And data, a 2×2 Matrix{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 x ≤ 3.0    x ≤ 4.0
 2 x ≤ 3.0  2 x ≤ 4.0

SparseAxisArrays

@constraint(model, [i = 1:2, j = 1:2; i != j], i * x <= j + 1)
JuMP.Containers.SparseAxisArray{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}, 2, Tuple{Int64, Int64}} with 2 entries:
  [1, 2]  =  x ≤ 3.0
  [2, 1]  =  2 x ≤ 2.0

Constraints in a loop

We can add constraints using regular Julia loops:

for i in 1:3
    @constraint(model, 6x + 4y >= 5i)
end

or use for-each loops inside the @constraint macro:

@constraint(model, [i in 1:3], 6x + 4y >= 5i)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}:
 6 x + 4 y ≥ 5.0
 6 x + 4 y ≥ 10.0
 6 x + 4 y ≥ 15.0

We can also create constraints such as $\sum _{i = 1}^{10} z_i \leq 1$:

@constraint(model, sum(z[i] for i in 1:10) <= 1)

\[ z_{1} + z_{2} + z_{3} + z_{4} + z_{5} + z_{6} + z_{7} + z_{8} + z_{9} + z_{10} \leq 1.0 \]

Objective functions

Set an objective function with @objective:

model = Model(GLPK.Optimizer)
@variable(model, x >= 0)
@variable(model, y >= 0)
@objective(model, Min, 2x + y)

\[ 2 x + y \]

Create a maximization objective using Max:

@objective(model, Max, 2x + y)

\[ 2 x + y \]

Tip

Calling @objective multiple times will overwrite the previous objective. This can be useful when you want to solve the same problem with different objectives.
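As a minimal sketch of this pattern (tip_model is a new, hypothetical one-variable model, since maximizing over the unconstrained model above would be unbounded):

tip_model = Model(GLPK.Optimizer)
@variable(tip_model, 0 <= t <= 1)
@objective(tip_model, Min, t)
optimize!(tip_model)
objective_value(tip_model)       # 0.0
@objective(tip_model, Max, t)    # overwrites the Min objective
optimize!(tip_model)
objective_value(tip_model)       # 1.0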

Vectorized syntax

We can also add constraints and an objective to JuMP using vectorized linear algebra. We'll illustrate this by solving an LP in standard form, i.e.

\[\begin{aligned} & \min & c^T x \\ & \;\;\text{s.t.} & A x = b \\ & & x \ge 0 \end{aligned}\]

vector_model = Model(GLPK.Optimizer)

A = [
    1 1 9 5
    3 5 0 8
    2 0 6 13
]

b = [7; 3; 5]

c = [1; 3; 5; 2]

@variable(vector_model, x[1:4] >= 0)
@constraint(vector_model, A * x .== b)
@objective(vector_model, Min, c' * x)

optimize!(vector_model)
objective_value(vector_model)
4.9230769230769225
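
To inspect the solution itself, broadcast value over the vector of variables:

value.(x)  # a 4-element Vector{Float64} with the optimal values of x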

Tip

This tutorial was generated using Literate.jl. View the source .jl file on GitHub.