JuMP

This page lists the public API of JuMP.

Info

This page is an unstructured list of the JuMP API. For a more structured overview, read the Manual or Tutorial parts of this documentation.

Load all of the public API into the current scope with:

using JuMP

Alternatively, load only the module with:

import JuMP

and then prefix all calls with JuMP. to create JuMP.<NAME>.
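
For example, a small model built with the prefixed form might look like:

import JuMP

model = JuMP.Model()
JuMP.@variable(model, x >= 0)
JuMP.@objective(model, Min, 2x + 1)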

@build_constraint

JuMP.@build_constraintMacro
@build_constraint(constraint_expr)

Constructs a ScalarConstraint or VectorConstraint using the same machinery as @constraint but without adding the constraint to a model.

Constraints using broadcast operators like x .<= 1 are also supported and will create arrays of ScalarConstraint or VectorConstraint.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @build_constraint(2x >= 1)
ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(2 x, MathOptInterface.GreaterThan{Float64}(1.0))
julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @build_constraint(x .>= 0)
2-element Vector{ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}}:
 ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[1], MathOptInterface.GreaterThan{Float64}(-0.0))
 ScalarConstraint{AffExpr, MathOptInterface.GreaterThan{Float64}}(x[2], MathOptInterface.GreaterThan{Float64}(-0.0))
source

@constraint

JuMP.@constraintMacro
@constraint(model, expr, args...; kwargs...)
@constraint(model, [index_sets...], expr, args...; kwargs...)
@constraint(model, name, expr, args...; kwargs...)
@constraint(model, name[index_sets...], expr, args...; kwargs...)

Add a constraint described by the expression expr.

The name argument is optional. If index sets are passed, a container is built and the constraint may depend on the indices of the index sets.

The expression expr may be one of following forms:

  • func in set, constraining the function func to belong to the set set, which is either a MOI.AbstractSet or one of the JuMP shortcuts like SecondOrderCone or PSDCone

  • a <op> b, where <op> is one of ==, ≥, >=, ≤, <=

  • l <= f <= u or u >= f >= l, constraining the expression f to lie between l and u

  • f(x) ⟂ x, which defines a complementarity constraint

  • z --> {expr}, which defines an indicator constraint that activates when z is 1

  • !z --> {expr}, which defines an indicator constraint that activates when z is 0

  • z <--> {expr}, which defines a reified constraint

  • expr := rhs, which defines a Boolean equality constraint

Broadcasted comparison operators like .== are also supported for the case when the left- and right-hand sides of the comparison operator are arrays.

JuMP extensions may additionally provide support for constraint expressions which are not listed here.

Keyword arguments

  • base_name: sets the name prefix used to generate constraint names. It corresponds to the constraint name for scalar constraints; otherwise, the constraint names are set to base_name[...] for each index ....

  • container = :Auto: force the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type which is supported by a JuMP extension.

  • set_string_name::Bool = true: control whether to set the MOI.ConstraintName attribute. Passing set_string_name = false can improve performance.

Other keyword arguments may be supported by JuMP extensions.

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @variable(model, z, Bin);

julia> @constraint(model, x in SecondOrderCone())
[x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> @constraint(model, [i in 1:3], x[i] == i)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
 x[1] = 1
 x[2] = 2
 x[3] = 3

julia> @constraint(model, x .== [1, 2, 3])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
 x[1] = 1
 x[2] = 2
 x[3] = 3

julia> @constraint(model, con_name, 1 <= x[1] + x[2] <= 3)
con_name : x[1] + x[2] ∈ [1, 3]

julia> @constraint(model, con_perp[i in 1:3], x[i] - 1 ⟂ x[i])
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorAffineFunction{Float64}, MathOptInterface.Complements}, VectorShape}}:
 con_perp[1] : [x[1] - 1, x[1]] ∈ MathOptInterface.Complements(2)
 con_perp[2] : [x[2] - 1, x[2]] ∈ MathOptInterface.Complements(2)
 con_perp[3] : [x[3] - 1, x[3]] ∈ MathOptInterface.Complements(2)

julia> @constraint(model, z --> {x[1] >= 0})
z --> {x[1] ≥ 0}

julia> @constraint(model, !z --> {2 * x[2] <= 3})
!z --> {2 x[2] ≤ 3}
source

@constraints

JuMP.@constraintsMacro
@constraints(model, args...)

Adds groups of constraints at once, in the same fashion as the @constraint macro.

The model must be the first argument, and multiple constraints can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the constraints that were defined.

Example

julia> model = Model();

julia> @variable(model, w);

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:3]);

julia> @constraints(model, begin
           x >= 1
           y - w <= 2
           sum_to_one[i=1:3], z[i] + y == 1
       end);

julia> print(model)
Feasibility
Subject to
 sum_to_one[1] : y + z[1] = 1
 sum_to_one[2] : y + z[2] = 1
 sum_to_one[3] : y + z[3] = 1
 x ≥ 1
 -w + y ≤ 2
source

@expression

JuMP.@expressionMacro
@expression(model::GenericModel, expression)
@expression(model::GenericModel, [index_sets...], expression)
@expression(model::GenericModel, name, expression)
@expression(model::GenericModel, name[index_sets...], expression)

Efficiently builds and returns an expression.

The name argument is optional. If index sets are passed, a container is built and the expression may depend on the indices of the index sets.

Keyword arguments

  • container = :Auto: force the container type by passing container = Array, container = DenseAxisArray, container = SparseAxisArray, or any other container type which is supported by a JuMP extension.

Example

julia> model = Model();

julia> @variable(model, x[1:5]);

julia> @expression(model, shared, sum(i * x[i] for i in 1:5))
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]

julia> shared
x[1] + 2 x[2] + 3 x[3] + 4 x[4] + 5 x[5]

In the same way as @variable, the second argument may define index sets, and those indices can be used in the construction of the expressions:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @expression(model, expr[i = 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
 x[1] + x[2] + x[3]
 2 x[1] + 2 x[2] + 2 x[3]
 3 x[1] + 3 x[2] + 3 x[3]

Anonymous syntax is also supported:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> expr = @expression(model, [i in 1:3], i * sum(x[j] for j in 1:3))
3-element Vector{AffExpr}:
 x[1] + x[2] + x[3]
 2 x[1] + 2 x[2] + 2 x[3]
 3 x[1] + 3 x[2] + 3 x[3]
source

@expressions

JuMP.@expressionsMacro
@expressions(model, args...)

Adds multiple expressions to model at once, in the same fashion as the @expression macro.

The model must be the first argument, and multiple expressions can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the expressions that were defined.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @expressions(model, begin
           my_expr, x^2 + y^2
           my_expr_1[i = 1:2], a[i] - z[i]
       end)
(x² + y², AffExpr[-z[1] + 4, -z[2] + 5])
source

@force_nonlinear

JuMP.@force_nonlinearMacro
@force_nonlinear(expr)

Change the parsing of expr to construct GenericNonlinearExpr instead of GenericAffExpr or GenericQuadExpr.

This macro works by walking expr and substituting all calls to +, -, *, /, and ^ in favor of ones that construct GenericNonlinearExpr.

This macro will error if the resulting expression does not produce a GenericNonlinearExpr because, for example, it is used on an expression that does not use the basic arithmetic operators.

When to use this macro

In most cases, you should not use this macro.

Use this macro only if the intended output type is a GenericNonlinearExpr and the regular macro calls destroy problem structure, or in rare cases, if the regular macro calls introduce a large number of intermediate variables, for example, because they promote types to a common quadratic expression.

Example

Use-case one: preserve problem structure.

julia> model = Model();

julia> @variable(model, x);

julia> @expression(model, (x - 0.1)^2)
x² - 0.2 x + 0.010000000000000002

julia> @expression(model, @force_nonlinear((x - 0.1)^2))
(x - 0.1) ^ 2

julia> (x - 0.1)^2
x² - 0.2 x + 0.010000000000000002

julia> @force_nonlinear((x - 0.1)^2)
(x - 0.1) ^ 2

Use-case two: reduce allocations

In this example, we know that x * 2.0 * (1 + x) * x is going to construct a nonlinear expression.

However, the default parsing first constructs an intermediate quadratic expression, 2 x² + 2 x, and only then multiplies it by x to obtain a nonlinear expression.

In contrast, the modified parsing constructs a GenericNonlinearExpr directly, without the intermediate affine and quadratic expressions.

This results in significantly fewer allocations.

julia> model = Model();

julia> @variable(model, x);

julia> @expression(model, x * 2.0 * (1 + x) * x)
(2 x² + 2 x) * x

julia> @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
x * 2.0 * (1 + x) * x

julia> @allocated @expression(model, x * 2.0 * (1 + x) * x)
2640

julia> @allocated @expression(model, @force_nonlinear(x * 2.0 * (1 + x) * x))
672
source

@objective

JuMP.@objectiveMacro
@objective(model::GenericModel, sense, func)

Set the objective sense to sense and objective function to func.

The objective sense can be either Min, Max, MOI.MIN_SENSE, MOI.MAX_SENSE or MOI.FEASIBILITY_SENSE. In order to set the sense programmatically, that is, when sense is a variable whose value is the sense, one of the three MOI.OptimizationSense values must be used.

Example

To minimize the value of the variable x, do:

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Min, x)
x

Maximize the value of the affine expression 2x - 1:

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Max, 2x - 1)
2 x - 1

Set the objective sense programmatically:

julia> model = Model();

julia> @variable(model, x)
x

julia> sense = MIN_SENSE
MIN_SENSE::OptimizationSense = 0

julia> @objective(model, sense, x^2 - 2x + 1)
x² - 2 x + 1
source

@operator

JuMP.@operatorMacro
@operator(model, operator, dim, f[, ∇f[, ∇²f]])

Add the nonlinear operator operator in model with dim arguments, and create a new NonlinearOperator object called operator in the current scope.

The function f evaluates the operator and must return a scalar.

The optional function ∇f evaluates the first derivative, and the optional function ∇²f evaluates the second derivative.

∇²f may be provided only if ∇f is also provided.

Univariate syntax

If dim == 1, then the method signatures of each function must be:

  • f(::T)::T where {T<:Real}
  • ∇f(::T)::T where {T<:Real}
  • ∇²f(::T)::T where {T<:Real}

Multivariate syntax

If dim > 1, then the method signatures of each function must be:

  • f(x::T...)::T where {T<:Real}
  • ∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}
  • ∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}

The gradient vector g and the Hessian matrix H are filled in-place. For the Hessian, you must fill in the non-zero lower-triangular entries only. Setting an off-diagonal upper-triangular element may error.
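
For illustration, here is a minimal multivariate sketch with dim == 2 using the Rosenbrock function (the names f, ∇f, and op_rosenbrock are illustrative choices, not fixed by JuMP):

f(x::T...) where {T<:Real} = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

function ∇f(g::AbstractVector{T}, x::T...) where {T<:Real}
    # Fill the gradient in-place and return nothing.
    g[1] = -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2)
    g[2] = 200 * (x[2] - x[1]^2)
    return
end

model = Model()
@variable(model, x[1:2])
@operator(model, op_rosenbrock, 2, f, ∇f)
@objective(model, Min, op_rosenbrock(x[1], x[2]))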

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)

julia> @objective(model, Min, op_f(x))
op_f(x)

julia> op_f(2.0)
4.0

julia> model[:op_f]
NonlinearOperator(f, :op_f)

julia> model[:op_f](x)
op_f(x)

Non-macro version

This macro is provided as helpful syntax that matches the style of the rest of the JuMP macros. However, you may also add operators outside the macro using add_nonlinear_operator. For example:

julia> model = Model();

julia> f(x) = x^2
f (generic function with 1 method)

julia> @operator(model, op_f, 1, f)
NonlinearOperator(f, :op_f)

is equivalent to

julia> model = Model();

julia> f(x) = x^2
f (generic function with 1 method)

julia> op_f = model[:op_f] = add_nonlinear_operator(model, 1, f; name = :op_f)
NonlinearOperator(f, :op_f)
source

@variable

JuMP.@variableMacro
@variable(model, expr, args..., kw_args...)

Add a variable to the model model described by the expression expr, the positional arguments args and the keyword arguments kw_args.

Anonymous and named variables

expr must be one of the forms:

  • Omitted, like @variable(model), which creates an anonymous variable
  • A single symbol like @variable(model, x)
  • A container expression like @variable(model, x[i=1:3])
  • An anonymous container expression like @variable(model, [i=1:3])

Bounds

In addition, the expression can have bounds, such as:

  • @variable(model, x >= 0)
  • @variable(model, x <= 0)
  • @variable(model, x == 0)
  • @variable(model, 0 <= x <= 1)

and bounds can depend on the indices of the container expressions:

  • @variable(model, -i <= x[i=1:3] <= i)

Sets

You can explicitly specify the set to which the variable belongs:

  • @variable(model, x in MOI.Interval(0.0, 1.0))

For more information on this syntax, read Variables constrained on creation.

Positional arguments

The recognized positional arguments in args are the following:

  • Bin: restricts the variable to the MOI.ZeroOne set, that is, {0, 1}. For example, @variable(model, x, Bin). Note: you cannot use @variable(model, Bin), use the binary keyword instead.
  • Int: restricts the variable to the set of integers, that is, ..., -2, -1, 0, 1, 2, ... For example, @variable(model, x, Int). Note: you cannot use @variable(model, Int), use the integer keyword instead.
  • Symmetric: Only available when creating a square matrix of variables, that is, when expr is of the form varname[1:n,1:n] or varname[i=1:n,j=1:n]. It creates a symmetric matrix of variables.
  • PSD: A restrictive extension to Symmetric which constrains a square matrix of variables to be Symmetric and also constrains it to be positive semidefinite.

Keyword arguments

Four keyword arguments are useful in all cases:

  • base_name: Sets the name prefix used to generate variable names. It corresponds to the variable name for a scalar variable; otherwise, the variable names are set to base_name[...] for each index ... of the axes.
  • start::Float64: specify the value passed to set_start_value for each variable
  • container: specify the container type. See Forcing the container type for more information.
  • set_string_name::Bool = true: control whether to set the MOI.VariableName attribute. Passing set_string_name = false can improve performance.

Other keyword arguments are needed to disambiguate situations with anonymous variables:

  • lower_bound::Float64: an alternative to x >= lb, sets the value of the variable lower bound.
  • upper_bound::Float64: an alternative to x <= ub, sets the value of the variable upper bound.
  • binary::Bool: an alternative to passing Bin, sets whether the variable is binary or not.
  • integer::Bool: an alternative to passing Int, sets whether the variable is integer or not.
  • set::MOI.AbstractSet: an alternative to using x in set
  • variable_type: used by JuMP extensions. See Extend @variable for more information.

Example

The following are equivalent ways of creating a variable x of name x with lower bound 0:

julia> model = Model();

julia> @variable(model, x >= 0)
x
julia> model = Model();

julia> @variable(model, x, lower_bound = 0)
x
julia> model = Model();

julia> x = @variable(model, base_name = "x", lower_bound = 0)
x

Other examples:

julia> model = Model();

julia> @variable(model, x[i=1:3] <= i, Int, start = sqrt(i), lower_bound = -i)
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> @variable(model, y[i=1:3], container = DenseAxisArray, set = MOI.ZeroOne())
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
    Dimension 1, Base.OneTo(3)
And data, a 3-element Vector{VariableRef}:
 y[1]
 y[2]
 y[3]

julia> @variable(model, z[i=1:3], set_string_name = false)
3-element Vector{VariableRef}:
 _[7]
 _[8]
 _[9]
source

@variables

JuMP.@variablesMacro
@variables(model, args...)

Adds multiple variables to model at once, in the same fashion as the @variable macro.

The model must be the first argument, and multiple variables can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the variables that were defined.

Example

julia> model = Model();

julia> @variables(model, begin
           x
           y[i = 1:2] >= 0, (start = i)
           z, Bin, (start = 0, base_name = "Z")
       end)
(x, VariableRef[y[1], y[2]], Z)
Note

Keyword arguments must be contained within parentheses (refer to the example above).

source

add_bridge

JuMP.add_bridgeFunction
add_bridge(
    model::GenericModel{T},
    BT::Type{<:MOI.Bridges.AbstractBridge};
    coefficient_type::Type{S} = T,
) where {T,S}

Add BT{T} to the list of bridges that can be used to transform unsupported constraints into an equivalent formulation using only constraints supported by the optimizer.

See also: remove_bridge.

Example

julia> model = Model();

julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> add_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64}
       )
source

add_constraint

JuMP.add_constraintFunction
add_constraint(
    model::GenericModel,
    con::AbstractConstraint,
    name::String = "",
)

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

source

add_nonlinear_operator

JuMP.add_nonlinear_operatorFunction
add_nonlinear_operator(
    model::Model,
    dim::Int,
    f::Function,
    [∇f::Function,]
    [∇²f::Function];
    [name::Symbol = Symbol(f),]
)

Add a new nonlinear operator with dim input arguments to model and associate it with the name name.

The function f evaluates the operator and must return a scalar.

The optional function ∇f evaluates the first derivative, and the optional function ∇²f evaluates the second derivative.

∇²f may be provided only if ∇f is also provided.

Univariate syntax

If dim == 1, then the method signatures of each function must be:

  • f(::T)::T where {T<:Real}
  • ∇f(::T)::T where {T<:Real}
  • ∇²f(::T)::T where {T<:Real}

Multivariate syntax

If dim > 1, then the method signatures of each function must be:

  • f(x::T...)::T where {T<:Real}
  • ∇f(g::AbstractVector{T}, x::T...)::Nothing where {T<:Real}
  • ∇²f(H::AbstractMatrix{T}, x::T...)::Nothing where {T<:Real}

The gradient vector g and the Hessian matrix H are filled in-place. For the Hessian, you must fill in the non-zero lower-triangular entries only. Setting an off-diagonal upper-triangular element may error.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> op_f = add_nonlinear_operator(model, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :f)

julia> @objective(model, Min, op_f(x))
f(x)

julia> op_f(2.0)
4.0
source

add_to_expression!

JuMP.add_to_expression!Function
add_to_expression!(expression, terms...)

Updates expression in-place to expression + (*)(terms...).

This is typically much more efficient than expression += (*)(terms...) because it avoids the temporary allocation of the right-hand side term.

For example, add_to_expression!(expression, a, b) produces the same result as expression += a*b, and add_to_expression!(expression, a) produces the same result as expression += a.

When to implement

Only a few methods are defined, mostly for internal use, and only for the cases when:

  1. they can be implemented efficiently
  2. expression is capable of storing the result. For example, add_to_expression!(::AffExpr, ::GenericVariableRef, ::GenericVariableRef) is not defined because a GenericAffExpr cannot store the product of two variables.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> expr = 2 + x
x + 2

julia> add_to_expression!(expr, 3, x)
4 x + 2

julia> expr
4 x + 2
julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @expression(model, ex1, sum(x))
x[1] + x[2]

julia> @expression(model, ex2, 2 * sum(x))
2 x[1] + 2 x[2]

julia> add_to_expression!(ex1, ex2)
3 x[1] + 3 x[2]

julia> ex1
3 x[1] + 3 x[2]

julia> ex2
2 x[1] + 2 x[2]
source

add_to_function_constant

JuMP.add_to_function_constantFunction
add_to_function_constant(constraint::ConstraintRef, value)

Add value to the function constant term of constraint.

Note that for scalar constraints, JuMP will aggregate all constant terms onto the right-hand side of the constraint so instead of modifying the function, the set will be translated by -value. For example, given a constraint 2x <= 3, add_to_function_constant(c, 4) will modify it to 2x <= -1.

Example

For scalar constraints, the set is translated by -value:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 0 <= 2x - 1 <= 2)
con : 2 x ∈ [1, 3]

julia> add_to_function_constant(con, 4)

julia> con
con : 2 x ∈ [-3, -1]

For vector constraints, the constant is added to the function:

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @constraint(model, con, [x + y, x, y] in SecondOrderCone())
con : [x + y, x, y] ∈ MathOptInterface.SecondOrderCone(3)

julia> add_to_function_constant(con, [1, 2, 2])

julia> con
con : [x + y + 1, x + 2, y + 2] ∈ MathOptInterface.SecondOrderCone(3)
source

add_variable

JuMP.add_variableFunction
add_variable(m::GenericModel, v::AbstractVariable, name::String = "")

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

source

all_constraints

JuMP.all_constraintsFunction
all_constraints(model::GenericModel, function_type, set_type)::Vector{<:ConstraintRef}

Return a list of all constraints currently in the model where the function has type function_type and the set has type set_type. The constraints are ordered by creation time.

See also list_of_constraint_types and num_constraints.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @constraint(model, 2x <= 1);

julia> all_constraints(model, VariableRef, MOI.GreaterThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}:
 x ≥ 0

julia> all_constraints(model, VariableRef, MOI.ZeroOne)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.ZeroOne}, ScalarShape}}:
 x binary

julia> all_constraints(model, AffExpr, MOI.LessThan{Float64})
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 2 x ≤ 1
source
all_constraints(
    model::GenericModel;
    include_variable_in_set_constraints::Bool,
)::Vector{ConstraintRef}

Return a list of all constraints in model.

If include_variable_in_set_constraints == true, then VariableRef constraints such as VariableRef-in-Integer are included. To return only the structural constraints (for example, the rows in the constraint matrix of a linear program), pass include_variable_in_set_constraints = false.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Int);

julia> @constraint(model, 2x <= 1);

julia> @NLconstraint(model, x^2 <= 1);

julia> all_constraints(model; include_variable_in_set_constraints = true)
4-element Vector{ConstraintRef}:
 2 x ≤ 1
 x ≥ 0
 x integer
 x ^ 2.0 - 1.0 ≤ 0

julia> all_constraints(model; include_variable_in_set_constraints = false)
2-element Vector{ConstraintRef}:
 2 x ≤ 1
 x ^ 2.0 - 1.0 ≤ 0

Performance considerations

Note that this function is type-unstable because it returns an abstractly typed vector. If performance is a problem, consider using list_of_constraint_types and a function barrier. See the Performance tips for extensions section of the documentation for more details.
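
As a sketch of the function-barrier pattern mentioned above (count_constraints and count_one_type are hypothetical helper names):

# The inner function sees concrete F and S types, so the call to
# all_constraints inside it is type-stable.
count_one_type(model, ::Type{F}, ::Type{S}) where {F,S} =
    length(all_constraints(model, F, S))

function count_constraints(model)
    total = 0
    for (F, S) in list_of_constraint_types(model)
        total += count_one_type(model, F, S)  # function barrier
    end
    return total
end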

source

all_variables

JuMP.all_variablesFunction
all_variables(model::GenericModel{T})::Vector{GenericVariableRef{T}} where {T}

Returns a list of all variables currently in the model. The variables are ordered by creation time.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> all_variables(model)
2-element Vector{VariableRef}:
 x
 y
source

anonymous_name

JuMP.anonymous_nameFunction
anonymous_name(::MIME, x::AbstractVariableRef)

The name to use for an anonymous variable x when printing.

Example

julia> model = Model();

julia> x = @variable(model);

julia> anonymous_name(MIME("text/plain"), x)
"_[1]"
source

backend

JuMP.backendFunction
backend(model::GenericModel)

Return the lower-level MathOptInterface model that sits underneath JuMP. This model depends on which operating mode JuMP is in (see mode).

  • If JuMP is in DIRECT mode (that is, the model was created using direct_model), the backend will be the optimizer passed to direct_model.
  • If JuMP is in MANUAL or AUTOMATIC mode, the backend is a MOI.Utilities.CachingOptimizer.

Use index to get the index of a variable or constraint in the backend model.

Warning

This function should only be used by advanced users looking to access low-level MathOptInterface or solver-specific functionality.

Notes

If JuMP is not in DIRECT mode, the type returned by backend may change between any JuMP releases. Therefore, only use the public API exposed by MathOptInterface, and do not access internal fields. If you require access to the innermost optimizer, see unsafe_backend. Alternatively, use direct_model to create a JuMP model in DIRECT mode.

See also: unsafe_backend.

Example

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> highs = backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> index(x)
MOI.VariableIndex(1)
source

barrier_iterations

JuMP.barrier_iterationsFunction
barrier_iterations(model::GenericModel)

If available, returns the cumulative number of barrier iterations during the most-recent optimization (the MOI.BarrierIterations attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> barrier_iterations(model)
0
source

bridge_constraints

JuMP.bridge_constraintsFunction
bridge_constraints(model::GenericModel)

When in direct mode, return false.

When in manual or automatic mode, return a Bool indicating whether the optimizer is set and unsupported constraints are automatically bridged to equivalent supported constraints when an appropriate transformation is available.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> bridge_constraints(model)
true

julia> model = Model(Ipopt.Optimizer; add_bridges = false);

julia> bridge_constraints(model)
false
source

build_constraint

JuMP.build_constraintFunction
build_constraint(error_fn::Function, func, set, args...; kwargs...)

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.
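
One common extension pattern is to intercept a custom set and lower it to a standard constraint. A minimal sketch, where MyNonNegative is a hypothetical set defined by an extension:

struct MyNonNegative end  # hypothetical set defined by an extension

function JuMP.build_constraint(
    error_fn::Function,
    func::JuMP.AbstractJuMPScalar,
    ::MyNonNegative,
)
    # Lower `func in MyNonNegative()` to the standard `func >= 0`.
    return JuMP.build_constraint(error_fn, func, JuMP.MOI.GreaterThan(0.0))
end

With this method defined, @constraint(model, x in MyNonNegative()) builds the same ScalarConstraint as @constraint(model, x >= 0).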

source

build_variable

JuMP.build_variableFunction
build_variable(
    error_fn::Function,
    info::VariableInfo,
    args...;
    kwargs...,
)

Return a new AbstractVariable object.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

Arguments

  • error_fn: a function to call instead of error. error_fn annotates the error message with additional information for the user.
  • info: an instance of VariableInfo. This has a variety of fields relating to the variable such as info.lower_bound and info.binary.
  • args: optional additional positional arguments for extending the @variable macro.
  • kwargs: optional keyword arguments for extending the @variable macro.

See also: @variable

Warning

Extensions should define a method with ONE positional argument to dispatch the call to a different method. Creating an extension that relies on multiple positional arguments leads to MethodErrors if the user passes the arguments in the wrong order.

Example

@variable(model, x, Foo)

will call

build_variable(error_fn::Function, info::VariableInfo, ::Type{Foo})

Passing special-case positional arguments such as Bin, Int, and PSD is okay, along with keyword arguments:

@variable(model, x, Int, Foo(), mykwarg = true)
# or
@variable(model, x, Foo(), Int, mykwarg = true)

will call

build_variable(error_fn::Function, info::VariableInfo, ::Foo; mykwarg)

and info.integer will be true.

Note that the order of the positional arguments does not matter.

source

callback_node_status

JuMP.callback_node_statusFunction
callback_node_status(cb_data, model::GenericModel)

Return an MOI.CallbackNodeStatusCode enum, indicating if the current primal solution available from callback_value is integer feasible.

Example

julia> import GLPK

julia> model = Model(GLPK.Optimizer);

julia> @variable(model, x <= 10, Int);

julia> @objective(model, Max, x);

julia> function my_callback_function(cb_data)
           status = callback_node_status(cb_data, model)
           println("Status is: ", status)
           return
       end
my_callback_function (generic function with 1 method)

julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)

julia> optimize!(model)
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_UNKNOWN
Status is: CALLBACK_NODE_STATUS_INTEGER
Status is: CALLBACK_NODE_STATUS_INTEGER
source

callback_value

JuMP.callback_valueFunction
callback_value(cb_data, x::GenericVariableRef)
callback_value(cb_data, x::Union{GenericAffExpr,GenericQuadExpr})

Return the primal solution of x inside a callback.

cb_data is the argument to the callback function, and the type is dependent on the solver.

Use callback_node_status to check whether a solution is available.

Example

julia> import GLPK

julia> model = Model(GLPK.Optimizer);

julia> @variable(model, x <= 10, Int);

julia> @objective(model, Max, x);

julia> function my_callback_function(cb_data)
           status = callback_node_status(cb_data, model)
           if status == MOI.CALLBACK_NODE_STATUS_INTEGER
               println("Solution is: ", callback_value(cb_data, x))
           end
           return
       end
my_callback_function (generic function with 1 method)

julia> set_attribute(model, GLPK.CallbackFunction(), my_callback_function)

julia> optimize!(model)
Solution is: 10.0
Solution is: 10.0
source

check_belongs_to_model

JuMP.check_belongs_to_modelFunction
check_belongs_to_model(x::AbstractJuMPScalar, model::AbstractModel)
check_belongs_to_model(x::AbstractConstraint, model::AbstractModel)

Throw VariableNotOwned if the owner_model of x is not model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> check_belongs_to_model(x, model)

julia> model_2 = Model();

julia> check_belongs_to_model(x, model_2)
ERROR: VariableNotOwned{VariableRef}(x): the variable x cannot be used in this model because
it belongs to a different model.
[...]
source

coefficient

JuMP.coefficientFunction
coefficient(v1::GenericVariableRef{T}, v2::GenericVariableRef{T}) where {T}

Return one(T) if v1 == v2 and zero(T) otherwise.

This is a fallback for other coefficient methods to simplify code in which the expression may be a single variable.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> coefficient(x[1], x[1])
1.0

julia> coefficient(x[1], x[2])
0.0
source
coefficient(a::GenericAffExpr{C,V}, v::V) where {C,V}

Return the coefficient associated with variable v in the affine expression a.

Example

julia> model = Model();

julia> @variable(model, x);

julia> expr = 2.0 * x + 1.0;

julia> coefficient(expr, x)
2.0
source
coefficient(a::GenericQuadExpr{C,V}, v1::V, v2::V) where {C,V}

Return the coefficient associated with the term v1 * v2 in the quadratic expression a.

Note that coefficient(a, v1, v2) is the same as coefficient(a, v2, v1).

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] * x[2];

julia> coefficient(expr, x[1], x[2])
2.0

julia> coefficient(expr, x[2], x[1])
2.0

julia> coefficient(expr, x[1], x[1])
0.0
source
coefficient(a::GenericQuadExpr{C,V}, v::V) where {C,V}

Return the coefficient associated with variable v in the affine component of a.

Example

julia> model = Model();

julia> @variable(model, x);

julia> expr = 2.0 * x^2 + 3.0 * x;

julia> coefficient(expr, x)
3.0
source

compute_conflict!

JuMP.compute_conflict!Function
compute_conflict!(model::GenericModel)

Compute a conflict if the model is infeasible.

The conflict is also called the Irreducible Infeasible Subsystem (IIS).

If an optimizer has not been set yet (see set_optimizer), a NoOptimizer error is thrown.

The status of the conflict can be checked with the MOI.ConflictStatus model attribute. Then, the status for each constraint can be queried with the MOI.ConstraintConflictStatus attribute.

See also: copy_conflict

Example

julia> using JuMP

julia> model = Model(Gurobi.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0);

julia> @constraint(model, c1, x >= 2);

julia> @constraint(model, c2, x <= 1);

julia> optimize!(model)

julia> compute_conflict!(model)

julia> get_attribute(model, MOI.ConflictStatus())
CONFLICT_FOUND::ConflictStatusCode = 3
source

constant

JuMP.constantFunction
constant(aff::GenericAffExpr{C,V})::C

Return the constant of the affine expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> aff = 2.0 * x + 3.0;

julia> constant(aff)
3.0
source
constant(quad::GenericQuadExpr{C,V})::C

Return the constant of the quadratic expression.

Example

julia> model = Model();

julia> @variable(model, x);

julia> quad = 2.0 * x^2 + 3.0;

julia> constant(quad)
3.0
source

constraint_by_name

JuMP.constraint_by_nameFunction
constraint_by_name(model::AbstractModel, name::String, [F, S])::Union{ConstraintRef,Nothing}

Return the reference of the constraint with name attribute name or Nothing if no constraint has this name attribute.

Throws an error if several constraints have name as their name attribute.

If F and S are provided, this method additionally throws an error if the constraint is not an F-in-S constraint where F is either the JuMP or MOI type of the function and S is the MOI type of the set.

Providing F and S is recommended if you know the type of the function and set, since the return type can then be inferred; for the method above (that is, without F and S), the exact return type of the constraint index cannot be inferred.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, x^2 == 1)
con : x² = 1

julia> constraint_by_name(model, "kon")

julia> constraint_by_name(model, "con")
con : x² = 1

julia> constraint_by_name(model, "con", AffExpr, MOI.EqualTo{Float64})

julia> constraint_by_name(model, "con", QuadExpr, MOI.EqualTo{Float64})
con : x² = 1
source

constraint_object

JuMP.constraint_objectFunction
constraint_object(con_ref::ConstraintRef)

Return the underlying constraint data for the constraint referenced by con_ref.

Example

A scalar constraint:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))

julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}

julia> object.func
2 x

julia> object.set
MathOptInterface.LessThan{Float64}(1.0)

A vector constraint:

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())

julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}

julia> object.func
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> object.set
MathOptInterface.SecondOrderCone(3)
source

constraint_ref_with_index

JuMP.constraint_ref_with_indexFunction
constraint_ref_with_index(model::AbstractModel, index::MOI.ConstraintIndex)

Return a ConstraintRef of model corresponding to index.

This function is a helper function used internally by JuMP and some JuMP extensions. It should not need to be called in user-code.

source

constraint_string

JuMP.constraint_stringFunction
constraint_string(
    mode::MIME,
    ref::ConstraintRef;
    in_math_mode::Bool = false,
)

Return a string representation of the constraint ref, given the mode.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2 * x <= 1);

julia> constraint_string(MIME("text/plain"), c)
"c : 2 x ≤ 1"
source

constraints_string

JuMP.constraints_stringFunction
constraints_string(mode, model::AbstractModel)::Vector{String}

Return a list of Strings describing each constraint of the model.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> @constraint(model, c, 2 * x <= 1);

julia> constraints_string(MIME("text/plain"), model)
2-element Vector{String}:
 "c : 2 x ≤ 1"
 "x ≥ 0"
source

copy_conflict

JuMP.copy_conflictFunction
copy_conflict(model::GenericModel)

Return a copy of the current conflict for the model model and a GenericReferenceMap that can be used to obtain the variable and constraint reference of the new model corresponding to a given model's reference.

This is a convenience function that provides a filtering function for copy_model.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, independently of whether an optimizer was provided at model construction, the new model will have no optimizer; that is, an optimizer will have to be provided to the new model in the optimize! call.

Example

In the following example, a model model is constructed with a variable x and two constraints c1 and c2. This model has no solution, as the two constraints are mutually exclusive. The solver is asked to compute a conflict with compute_conflict!. The parts of model participating in the conflict are then copied into a model iis_model.

julia> using JuMP

julia> import Gurobi

julia> model = Model(Gurobi.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> @constraint(model, c1, x >= 2)
c1 : x ≥ 2

julia> @constraint(model, c2, x <= 1)
c2 : x ≤ 1

julia> optimize!(model)

julia> compute_conflict!(model)

julia> if get_attribute(model, MOI.ConflictStatus()) == MOI.CONFLICT_FOUND
           iis_model, reference_map = copy_conflict(model)
           print(iis_model)
       end
Feasibility
Subject to
 c1 : x ≥ 2
 c2 : x ≤ 1
source

copy_extension_data

JuMP.copy_extension_dataFunction
copy_extension_data(data, new_model::AbstractModel, model::AbstractModel)

Return a copy of the extension data data of the model model to the extension data of the new model new_model.

A method should be added for any JuMP extension storing data in the ext field.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

Warning

Do not engage in type piracy by implementing this method for types of data that you did not define! JuMP extensions should store types that they define in model.ext, rather than regular Julia types.
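
As an illustrative sketch, where MyExtData is a hypothetical type that an extension stores in model.ext:

struct MyExtData
    count::Int
end

# Return a fresh copy so the new model's extension data is independent
# of the original model's.
function JuMP.copy_extension_data(
    data::MyExtData,
    new_model::JuMP.AbstractModel,
    model::JuMP.AbstractModel,
)
    return MyExtData(data.count)
end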

source

copy_model

JuMP.copy_modelFunction
copy_model(model::GenericModel; filter_constraints::Union{Nothing, Function}=nothing)

Return a copy of the model model and a GenericReferenceMap that can be used to obtain the variable and constraint reference of the new model corresponding to a given model's reference. A Base.copy(::AbstractModel) method has also been implemented; it is similar to copy_model but does not return the reference map.

If the filter_constraints argument is given, only the constraints for which this function returns true will be copied. This function is given a constraint reference as argument.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, independently of whether an optimizer was provided at model construction, the new model will have no optimizer; that is, an optimizer will have to be provided to the new model in the optimize! call.

Example

In the following example, a model model is constructed with a variable x and a constraint cref. It is then copied into a model new_model with the new references assigned to x_new and cref_new.

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, cref, x == 2)
cref : x = 2

julia> new_model, reference_map = copy_model(model);

julia> x_new = reference_map[x]
x

julia> cref_new = reference_map[cref]
cref : x = 2
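
The filter_constraints argument selects which constraints are copied. As a sketch (not a doctest), copying only the constraint named "cref":

# Copy the model, keeping only constraints whose name is "cref".
filtered_model, filtered_map = copy_model(
    model;
    filter_constraints = cr -> name(cr) == "cref",
)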
source

delete

JuMP.deleteFunction
delete(model::GenericModel, con_ref::ConstraintRef)

Delete the constraint associated with constraint_ref from the model model.

Note that delete does not unregister the name from the model, so adding a new constraint of the same name will throw an error. Use unregister to unregister the name after deletion.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> delete(model, c)

julia> unregister(model, :c)

julia> print(model)
Feasibility
Subject to

julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
source
delete(model::GenericModel, con_refs::Vector{<:ConstraintRef})

Delete the constraints associated with con_refs from the model model.

Solvers may implement specialized methods for deleting multiple constraints of the same concrete type. These methods may be more efficient than repeatedly calling the single constraint delete method.

See also: unregister

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, 2 * x .<= 1)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}}:
 c : 2 x[1] ≤ 1
 c : 2 x[2] ≤ 1
 c : 2 x[3] ≤ 1

julia> delete(model, c)

julia> unregister(model, :c)

julia> print(model)
Feasibility
Subject to

julia> model[:c]
ERROR: KeyError: key :c not found
Stacktrace:
[...]
source
delete(model::GenericModel, variable_ref::GenericVariableRef)

Delete the variable associated with variable_ref from the model model.

Note that delete does not unregister the name from the model, so adding a new variable of the same name will throw an error. Use unregister to unregister the name after deletion.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> delete(model, x)

julia> unregister(model, :x)

julia> print(model)
Feasibility
Subject to

julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
source
delete(model::GenericModel, variable_refs::Vector{<:GenericVariableRef})

Delete the variables associated with variable_refs from the model model. Solvers may implement methods for deleting multiple variables that are more efficient than repeatedly calling the single variable delete method.

See also: unregister

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> delete(model, x)

julia> unregister(model, :x)

julia> print(model)
Feasibility
Subject to

julia> model[:x]
ERROR: KeyError: key :x not found
Stacktrace:
[...]
source

delete_lower_bound

delete_upper_bound

direct_generic_model

JuMP.direct_generic_modelFunction
direct_generic_model(
    value_type::Type{T},
    backend::MOI.ModelLike;
) where {T<:Real}

Return a new JuMP model using backend to store the model and solve it.

As opposed to the Model constructor, no cache of the model is stored outside of backend and no bridges are automatically applied to backend.

Notes

The absence of a cache reduces the memory footprint, but it is important to bear in mind the following implications of creating models using this direct mode:

  • When backend does not support an operation, such as modifying constraints or adding variables/constraints after solving, an error is thrown. For models created using the Model constructor, such situations can be dealt with by storing the modifications in a cache and loading them into the optimizer when optimize! is called.
  • No constraint bridging is supported by default.
  • The optimizer used cannot be changed after the model is constructed.
  • The model created cannot be copied.
source
direct_generic_model(::Type{T}, factory::MOI.OptimizerWithAttributes)

Create a direct_generic_model using factory, a MOI.OptimizerWithAttributes object created by optimizer_with_attributes.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer,
           "presolve" => "off",
           MOI.Silent() => true,
       );

julia> model = direct_generic_model(Float64, optimizer)
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

is equivalent to:

julia> import HiGHS

julia> model = direct_generic_model(Float64, HiGHS.Optimizer())
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
source

direct_model

JuMP.direct_modelFunction
direct_model(backend::MOI.ModelLike)

Return a new JuMP model using backend to store the model and solve it.

As opposed to the Model constructor, no cache of the model is stored outside of backend and no bridges are automatically applied to backend.

Notes

The absence of a cache reduces the memory footprint, but it is important to bear in mind the following implications of creating models using this direct mode:

  • When backend does not support an operation, such as modifying constraints or adding variables/constraints after solving, an error is thrown. For models created using the Model constructor, such situations can be dealt with by storing the modifications in a cache and loading them into the optimizer when optimize! is called.
  • No constraint bridging is supported by default.
  • The optimizer used cannot be changed after the model is constructed.
  • The model created cannot be copied.
source
direct_model(factory::MOI.OptimizerWithAttributes)

Create a direct_model using factory, a MOI.OptimizerWithAttributes object created by optimizer_with_attributes.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer,
           "presolve" => "off",
           MOI.Silent() => true,
       );

julia> model = direct_model(optimizer)
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

is equivalent to:

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer())
A JuMP Model
├ mode: DIRECT
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
source

drop_zeros!

JuMP.drop_zeros!Function
drop_zeros!(expr::GenericAffExpr)

Remove terms in the affine expression with 0 coefficients.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[1] + x[2];

julia> add_to_expression!(expr, -1.0, x[1])
0 x[1] + x[2]

julia> drop_zeros!(expr)

julia> expr
x[2]
source
drop_zeros!(expr::GenericQuadExpr)

Remove terms in the quadratic expression with 0 coefficients.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[1]^2 + x[2]^2;

julia> add_to_expression!(expr, -1.0, x[1], x[1])
0 x[1]² + x[2]²

julia> drop_zeros!(expr)

julia> expr
x[2]²
source

dual

JuMP.dualFunction
dual(con_ref::ConstraintRef; result::Int = 1)

Return the dual value of constraint con_ref associated with result index result of the most-recent solution returned by the solver.

Use has_duals to check if a result exists before asking for values.

See also: result_count, shadow_price.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> dual(c)
-2.0
source

dual_objective_value

JuMP.dual_objective_valueFunction
dual_objective_value(model::GenericModel; result::Int = 1)

Return the value of the objective of the dual problem associated with result index result of the most-recent solution returned by the solver.

Throws MOI.UnsupportedAttribute{MOI.DualObjectiveValue} if the solver does not support this attribute.

This function is equivalent to querying the MOI.DualObjectiveValue attribute.

See also: result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> dual_objective_value(model)
3.0

julia> dual_objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.DualObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
source

dual_shape

JuMP.dual_shapeFunction
dual_shape(shape::AbstractShape)::AbstractShape

Returns the shape of the dual space of the space of objects of shape shape. By default, the dual_shape of a shape is itself. See the examples section below for an example for which this is not the case.

Example

Consider polynomial constraints for which the dual is moment constraints and moment constraints for which the dual is polynomial constraints. Shapes for polynomials can be defined as follows:

struct Polynomial
    coefficients::Vector{Float64}
    monomials::Vector{Monomial}
end
struct PolynomialShape <: AbstractShape
    monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::PolynomialShape) = Polynomial(x, shape.monomials)

and a shape for moments can be defined as follows:

struct Moments
    coefficients::Vector{Float64}
    monomials::Vector{Monomial}
end
struct MomentsShape <: AbstractShape
    monomials::Vector{Monomial}
end
JuMP.reshape_vector(x::Vector, shape::MomentsShape) = Moments(x, shape.monomials)

Then dual_shape allows the definition of the shape of the dual of polynomial and moment constraints:

dual_shape(shape::PolynomialShape) = MomentsShape(shape.monomials)
dual_shape(shape::MomentsShape) = PolynomialShape(shape.monomials)
source

dual_start_value

JuMP.dual_start_valueFunction
dual_start_value(con_ref::ConstraintRef)

Return the dual start value (MOI attribute ConstraintDualStart) of the constraint con_ref.

If no dual start value has been set, dual_start_value will return nothing.

See also set_dual_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_dual_start_value(c, [0.0])

julia> dual_start_value(c)
1-element Vector{Float64}:
 0.0

julia> set_dual_start_value(c, nothing)

julia> dual_start_value(c)
source

dual_status

JuMP.dual_statusFunction
dual_status(model::GenericModel; result::Int = 1)

Return a MOI.ResultStatusCode describing the status of the most recent dual solution of the solver (that is, the MOI.DualStatus attribute) associated with the result index result.

See also: result_count.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> dual_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
source

error_if_direct_mode

JuMP.error_if_direct_modeFunction
error_if_direct_mode(model::GenericModel, func::Symbol)

Errors if model is in direct mode during a call from the function named func.

Used internally within JuMP, or by JuMP extensions that do not want to support models in direct mode.

Example

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> error_if_direct_mode(model, :foo)
ERROR: The `foo` function is not supported in DIRECT mode.
Stacktrace:
[...]
source

fix

JuMP.fixFunction
fix(v::GenericVariableRef, value::Number; force::Bool = false)

Fix a variable to a value. Update the fixing constraint if one exists, otherwise create a new one.

If the variable already has variable bounds and force=false, calling fix will throw an error. If force=true, existing variable bounds will be deleted, and the fixing constraint will be added. Note a variable will have no bounds after a call to unfix.

See also FixRef, is_fixed, fix_value, unfix.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_fixed(x)
false

julia> fix(x, 1.0)

julia> is_fixed(x)
true
julia> model = Model();

julia> @variable(model, 0 <= x <= 1);

julia> is_fixed(x)
false

julia> fix(x, 1.0; force = true)

julia> is_fixed(x)
true
source

fix_discrete_variables

JuMP.fix_discrete_variablesFunction
fix_discrete_variables([var_value::Function = value,] model::GenericModel)

Modifies model to convert all binary and integer variables to continuous variables with fixed bounds of var_value(x).

Return

Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.

Notes

  • An error is thrown if semi-continuous or semi-integer constraints are present (support may be added for these in the future).
  • All other constraints are ignored (left in place). This includes discrete constraints like SOS and indicator constraints.

Example

julia> model = Model();

julia> @variable(model, x, Bin, start = 1);

julia> @variable(model, 1 <= y <= 10, Int, start = 2);

julia> @objective(model, Min, x + y);

julia> undo_relax = fix_discrete_variables(start_value, model);

julia> print(model)
Min x + y
Subject to
 x = 1
 y = 2

julia> undo_relax()

julia> print(model)
Min x + y
Subject to
 y ≥ 1
 y ≤ 10
 y integer
 x binary
source

fix_value

JuMP.fix_valueFunction
fix_value(v::GenericVariableRef)

Return the value to which a variable is fixed.

Error if one does not exist.

See also FixRef, is_fixed, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> fix_value(x)
1.0
source

flatten!

JuMP.flatten!Function
flatten!(expr::GenericNonlinearExpr)

Flatten a nonlinear expression in-place by lifting nested + and * nodes into a single n-ary operation.

Motivation

Nonlinear expressions created using operator overloading can be deeply nested and unbalanced. For example, prod(x for i in 1:4) creates *(x, *(x, *(x, x))) instead of the more preferable *(x, x, x, x).

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> y = prod(x for i in 1:4)
((x²) * x) * x

julia> flatten!(y)
(x²) * x * x

julia> flatten!(sin(prod(x for i in 1:4)))
sin((x²) * x * x)
source

function_string

JuMP.function_stringFunction
function_string(
    mode::MIME,
    func::Union{JuMP.AbstractJuMPScalar,Vector{<:JuMP.AbstractJuMPScalar}},
)

Return a String representing the function func using print mode mode.

Example

julia> model = Model();

julia> @variable(model, x);

julia> function_string(MIME("text/plain"), 2 * x + 1)
"2 x + 1"
source

get_attribute

JuMP.get_attributeFunction
get_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute)
get_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute)
get_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute)

Get the value of a solver-specific attribute attr.

This is equivalent to calling MOI.get with the associated MOI model and, for variables and constraints, with the associated MOI.VariableIndex or MOI.ConstraintIndex.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1

julia> get_attribute(model, MOI.Name())
""

julia> get_attribute(x, MOI.VariableName())
"x"

julia> get_attribute(c, MOI.ConstraintName())
"c"
source
get_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)

Get the value of a solver-specific attribute attr.

This is equivalent to calling MOI.get with the associated MOI model.

If attr is an AbstractString, it is converted to MOI.RawOptimizerAttribute.

Example

julia> import HiGHS

julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => true);

julia> model = Model(opt);

julia> get_attribute(model, "output_flag")
true

julia> get_attribute(model, MOI.RawOptimizerAttribute("output_flag"))
true

julia> get_attribute(opt, "output_flag")
true

julia> get_attribute(opt, MOI.RawOptimizerAttribute("output_flag"))
true
source

has_duals

JuMP.has_dualsFunction
has_duals(model::GenericModel; result::Int = 1)

Return true if the solver has a dual solution in result index result available to query, otherwise return false.

See also dual, shadow_price, and result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> has_duals(model)
false

julia> optimize!(model)

julia> has_duals(model)
true
source

has_lower_bound

has_start_value

JuMP.has_start_valueFunction
has_start_value(variable::AbstractVariableRef)

Return true if the variable has a start value set, otherwise return false.

See also: start_value, set_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
source

has_upper_bound

has_values

JuMP.has_valuesFunction
has_values(model::GenericModel; result::Int = 1)

Return true if the solver has a primal solution in result index result available to query, otherwise return false.

See also value and result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> has_values(model)
false

julia> optimize!(model)

julia> has_values(model)
true
source

in_set_string

JuMP.in_set_stringFunction
in_set_string(mode::MIME, set)

Return a String representing the membership to the set set using print mode mode.

Extensions

JuMP extensions may extend this method for new set types to improve the legibility of their printing.

Example

julia> in_set_string(MIME("text/plain"), MOI.Interval(1.0, 2.0))
"∈ [1, 2]"
source

index

JuMP.indexFunction
index(cr::ConstraintRef)::MOI.ConstraintIndex

Return the index of the constraint that corresponds to cr in the MOI backend.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, x >= 0);

julia> index(c)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}(1)
source
index(v::GenericVariableRef)::MOI.VariableIndex

Return the index of the variable that corresponds to v in the MOI backend.

Example

julia> model = Model();

julia> @variable(model, x);

julia> index(x)
MOI.VariableIndex(1)
source

is_binary

is_fixed

JuMP.is_fixedFunction
is_fixed(v::GenericVariableRef)

Return true if v is a fixed variable. If true, the fixed value can be queried with fix_value.

See also FixRef, fix_value, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_fixed(x)
false

julia> fix(x, 1.0)

julia> is_fixed(x)
true
source

is_integer

JuMP.is_integerFunction
is_integer(v::GenericVariableRef)

Return true if v is constrained to be integer.

See also IntegerRef, set_integer, unset_integer.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_integer(x)
false

julia> set_integer(x)

julia> is_integer(x)
true
source

is_parameter

JuMP.is_parameterFunction
is_parameter(x::GenericVariableRef)::Bool

Return true if x is constrained to be a parameter.

See also ParameterRef, set_parameter_value, parameter_value.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> is_parameter(p)
true

julia> @variable(model, x)
x

julia> is_parameter(x)
false
source

is_solved_and_feasible

JuMP.is_solved_and_feasibleFunction
is_solved_and_feasible(
    model::GenericModel;
    allow_local::Bool = true,
    allow_almost::Bool = false,
    dual::Bool = false,
    result::Int = 1,
)

Return true if the model has a feasible primal solution associated with result index result and the termination_status is OPTIMAL (the solver found a global optimum) or LOCALLY_SOLVED (the solver found a local optimum, which may also be the global optimum, but the solver could not prove so).

If allow_local = false, then this function returns true only if the termination_status is OPTIMAL.

If allow_almost = true, then the termination_status may additionally be ALMOST_OPTIMAL or ALMOST_LOCALLY_SOLVED (if allow_local), and the primal_status and dual_status may additionally be NEARLY_FEASIBLE_POINT.

If dual, additionally check that an optimal dual solution is available.

If this function returns false, use termination_status, result_count, primal_status and dual_status to understand what solutions are available (if any).

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> is_solved_and_feasible(model)
false
source

is_valid

JuMP.is_validFunction
is_valid(model::GenericModel, con_ref::ConstraintRef{<:AbstractModel})

Return true if con_ref refers to a valid constraint in model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2 * x <= 1);

julia> is_valid(model, c)
true

julia> model_2 = Model();

julia> is_valid(model_2, c)
false
source
is_valid(model::GenericModel, variable_ref::GenericVariableRef)

Return true if variable refers to a valid variable in model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_valid(model, x)
true

julia> model_2 = Model();

julia> is_valid(model_2, x)
false
source

isequal_canonical

JuMP.isequal_canonicalFunction
isequal_canonical(
    x::T,
    y::T
) where {T<:Union{AbstractJuMPScalar,AbstractArray{<:AbstractJuMPScalar}}}

Return true if x is equal to y after dropping zeros and disregarding the order.

This method is mainly useful for testing, because fallbacks like x == y do not account for valid mathematical comparisons like x[1] + 0 x[2] + 1 == x[1] + 1.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> a = x[1] + 1.0
x[1] + 1

julia> b = x[1] + x[2] + 1.0
x[1] + x[2] + 1

julia> add_to_expression!(b, -1.0, x[2])
x[1] + 0 x[2] + 1

julia> a == b
false

julia> isequal_canonical(a, b)
true
source

jump_function

JuMP.jump_functionFunction
jump_function(model::AbstractModel, x::MOI.AbstractFunction)

Given a MathOptInterface object x, return the JuMP equivalent.

See also: moi_function.

Example

julia> model = Model();

julia> @variable(model, x);

julia> f = 2.0 * index(x) + 1.0
1.0 + 2.0 MOI.VariableIndex(1)

julia> jump_function(model, f)
2 x + 1
source

jump_function_type

JuMP.jump_function_typeFunction
jump_function_type(model::AbstractModel, ::Type{T}) where {T}

Given a MathOptInterface object type T, return the JuMP equivalent.

See also: moi_function_type.

Example

julia> model = Model();

julia> jump_function_type(model, MOI.ScalarAffineFunction{Float64})
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
source

latex_formulation

JuMP.latex_formulationFunction
latex_formulation(model::AbstractModel)

Wrap model in a type so that it can be pretty-printed as text/latex in a notebook like IJulia, or in Documenter.

To render the model, end the cell with latex_formulation(model), or call display(latex_formulation(model)) to force the display of the model from inside a function.
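
Example

A minimal usage sketch (the rendered output is omitted here because it depends on the display backend):

julia> model = Model();

julia> @variable(model, x >= 0);

julia> @objective(model, Min, 2x);

julia> latex_formulation(model);  # drop the trailing `;` in a notebook cell to render the model as LaTeX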

source

linear_terms

JuMP.linear_termsFunction
linear_terms(aff::GenericAffExpr{C,V})

Provides an iterator over coefficient-variable tuples (a_i::C, x_i::V) in the linear part of the affine expression.
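
Example

A minimal usage sketch (the iteration order follows the expression's internal term storage, so it may differ from the order shown):

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> a = 2.0 * x[1] + 3.0 * x[2] + 1.0;

julia> for (coef, var) in linear_terms(a)
           println(coef, " * ", var)
       end
2.0 * x[1]
3.0 * x[2]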

source
linear_terms(quad::GenericQuadExpr{C,V})

Provides an iterator over tuples (coefficient::C, variable::V) in the linear part of the quadratic expression.

source

list_of_constraint_types

JuMP.list_of_constraint_typesFunction
list_of_constraint_types(model::GenericModel)::Vector{Tuple{Type,Type}}

Return a list of tuples of the form (F, S) where F is a JuMP function type and S is an MOI set type such that all_constraints(model, F, S) returns a nonempty list.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @constraint(model, 2x <= 1);

julia> list_of_constraint_types(model)
3-element Vector{Tuple{Type, Type}}:
 (AffExpr, MathOptInterface.LessThan{Float64})
 (VariableRef, MathOptInterface.GreaterThan{Float64})
 (VariableRef, MathOptInterface.ZeroOne)

Performance considerations

Iterating over the list of function and set types is a type-unstable operation. Consider using a function barrier. See the Performance tips for extensions section of the documentation for more details.

source

lower_bound

lp_matrix_data

JuMP.lp_matrix_dataFunction
lp_matrix_data(model::GenericModel{T})

Given a JuMP model of a linear program, return an LPMatrixData{T} struct storing data for an equivalent linear program in the form:

\[\begin{aligned} \min & c^\top x + c_0 \\ & b_l \le A x \le b_u \\ & x_l \le x \le x_u \end{aligned}\]

where elements in x may be continuous, integer, or binary variables.

Fields

The struct returned by lp_matrix_data has the fields:

  • A::SparseArrays.SparseMatrixCSC{T,Int}: the constraint matrix in sparse matrix form.
  • b_lower::Vector{T}: the dense vector of row lower bounds. If missing, the value of typemin(T) is used.
  • b_upper::Vector{T}: the dense vector of row upper bounds. If missing, the value of typemax(T) is used.
  • x_lower::Vector{T}: the dense vector of variable lower bounds. If missing, the value of typemin(T) is used.
  • x_upper::Vector{T}: the dense vector of variable upper bounds. If missing, the value of typemax(T) is used.
  • c::Vector{T}: the dense vector of linear objective coefficients
  • c_offset::T: the constant term in the objective function.
  • sense::MOI.OptimizationSense: the objective sense of the model.
  • integers::Vector{Int}: the sorted list of column indices that are integer variables.
  • binaries::Vector{Int}: the sorted list of column indices that are binary variables.
  • variables::Vector{GenericVariableRef{T}}: a vector of GenericVariableRef, corresponding to order of the columns in the matrix form.
  • affine_constraints::Vector{ConstraintRef}: a vector of ConstraintRef, corresponding to the order of rows in the matrix form.

Limitations

The models supported by lp_matrix_data are intentionally limited to linear programs.

Example

julia> model = Model();

julia> @variable(model, x[1:2] >= 0);

julia> @constraint(model, x[1] + 2 * x[2] <= 1);

julia> @objective(model, Max, x[2]);

julia> data = lp_matrix_data(model);

julia> data.A
1×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
 1.0  2.0

julia> data.b_lower
1-element Vector{Float64}:
 -Inf

julia> data.b_upper
1-element Vector{Float64}:
 1.0

julia> data.x_lower
2-element Vector{Float64}:
 0.0
 0.0

julia> data.x_upper
2-element Vector{Float64}:
 Inf
 Inf

julia> data.c
2-element Vector{Float64}:
 0.0
 1.0

julia> data.c_offset
0.0

julia> data.sense
MAX_SENSE::OptimizationSense = 1
source

lp_sensitivity_report

JuMP.lp_sensitivity_reportFunction
lp_sensitivity_report(model::GenericModel{T}; atol::T = Base.rtoldefault(T))::SensitivityReport{T} where {T}

Given a linear program model with a current optimal basis, return a SensitivityReport object, which maps:

  • Every variable reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the objective coefficient of the corresponding variable can change by, such that the original basis remains optimal.
  • Every constraint reference to a tuple (d_lo, d_hi)::Tuple{T,T}, explaining how much the right-hand side of the corresponding constraint can change by, such that the basis remains optimal.

Both tuples are relative, rather than absolute. So given an objective coefficient of 1.0 and a tuple (-0.5, 0.5), the objective coefficient can range between 1.0 - 0.5 and 1.0 + 0.5.

atol is the primal/dual optimality tolerance, and should match the tolerance of the solver used to compute the basis.

Note: interval constraints are NOT supported.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, -1 <= x <= 2)
x

julia> @objective(model, Min, x)
x

julia> optimize!(model)

julia> report = lp_sensitivity_report(model; atol = 1e-7);

julia> dx_lo, dx_hi = report[x]
(-1.0, Inf)

julia> println(
           "The objective coefficient of `x` can decrease by $dx_lo or " *
           "increase by $dx_hi."
       )
The objective coefficient of `x` can decrease by -1.0 or increase by Inf.

julia> dRHS_lo, dRHS_hi = report[LowerBoundRef(x)]
(-Inf, 3.0)

julia> println(
           "The lower bound of `x` can decrease by $dRHS_lo or increase " *
           "by $dRHS_hi."
       )
The lower bound of `x` can decrease by -Inf or increase by 3.0.
source

map_coefficients

JuMP.map_coefficientsFunction
map_coefficients(f::Function, a::GenericAffExpr)

Apply f to the coefficients and constant term of a GenericAffExpr a and return a new expression.

See also: map_coefficients_inplace!

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1

julia> map_coefficients(c -> 2 * c, a)
2 x + 2

julia> a
x + 1
source
map_coefficients(f::Function, a::GenericQuadExpr)

Apply f to the coefficients and constant term of a GenericQuadExpr a and return a new expression.

See also: map_coefficients_inplace!

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = @expression(model, x^2 + x + 1)
x² + x + 1

julia> map_coefficients(c -> 2 * c, a)
2 x² + 2 x + 2

julia> a
x² + x + 1
source

map_coefficients_inplace!

JuMP.map_coefficients_inplace!Function
map_coefficients_inplace!(f::Function, a::GenericAffExpr)

Apply f to the coefficients and constant term of a GenericAffExpr a and update them in-place.

See also: map_coefficients

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = GenericAffExpr(1.0, x => 1.0)
x + 1

julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x + 2

julia> a
2 x + 2
source
map_coefficients_inplace!(f::Function, a::GenericQuadExpr)

Apply f to the coefficients and constant term of a GenericQuadExpr a and update them in-place.

See also: map_coefficients

Example

julia> model = Model();

julia> @variable(model, x);

julia> a = @expression(model, x^2 + x + 1)
x² + x + 1

julia> map_coefficients_inplace!(c -> 2 * c, a)
2 x² + 2 x + 2

julia> a
2 x² + 2 x + 2
source

mode

JuMP.modeFunction
mode(model::GenericModel)

Return the ModelMode of model.

Example

julia> model = Model();

julia> mode(model)
AUTOMATIC::ModelMode = 0
source

model_convert

JuMP.model_convertFunction
model_convert(
    model::AbstractModel,
    rhs::Union{
        AbstractConstraint,
        Number,
        AbstractJuMPScalar,
        MOI.AbstractSet,
    },
)

Convert the coefficients and constants of functions and sets in the rhs to the coefficient type value_type(typeof(model)).

Purpose

Creating and adding a constraint is a two-step process. The first step calls build_constraint, and the result of that is passed to add_constraint.

However, because build_constraint does not take the model as an argument, the coefficients and constants of the function or set might be different than value_type(typeof(model)).

Therefore, the result of build_constraint is converted in a call to model_convert before the result is passed to add_constraint.
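
Example

A minimal sketch, assuming the default Float64 coefficient type, so the constants below are converted to Float64 as described above:

julia> model = Model();

julia> model_convert(model, 1)
1.0

julia> model_convert(model, MOI.GreaterThan(1))
MathOptInterface.GreaterThan{Float64}(1.0)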

source

model_string

JuMP.model_stringFunction
model_string(mode::MIME, model::AbstractModel)

Return a String representation of model given the mode.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> print(model_string(MIME("text/plain"), model))
Feasibility
Subject to
 x ≥ 0
source

moi_function

JuMP.moi_functionFunction
moi_function(x::AbstractJuMPScalar)
moi_function(x::AbstractArray{<:AbstractJuMPScalar})

Given a JuMP object x, return the MathOptInterface equivalent.

See also: jump_function.

Example

julia> model = Model();

julia> @variable(model, x);

julia> f = 2.0 * x + 1.0
2 x + 1

julia> moi_function(f)
1.0 + 2.0 MOI.VariableIndex(1)
source

moi_function_type

JuMP.moi_function_typeFunction
moi_function_type(::Type{T}) where {T}

Given a JuMP object type T, return the MathOptInterface equivalent.

See also: jump_function_type.

Example

julia> moi_function_type(AffExpr)
MathOptInterface.ScalarAffineFunction{Float64}
source

moi_set

JuMP.moi_setFunction
moi_set(constraint::AbstractConstraint)

Return the set of the constraint constraint in the function-in-set form as a MathOptInterface.AbstractSet.

moi_set(s::AbstractVectorSet, dim::Int)

Returns the MOI set of dimension dim corresponding to the JuMP set s.

moi_set(s::AbstractScalarSet)

Returns the MOI set corresponding to the JuMP set s.
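
Example

A minimal sketch, assuming the default Float64 coefficient type for the scalar constraint:

julia> model = Model();

julia> @variable(model, x);

julia> con = @build_constraint(2x >= 1);

julia> moi_set(con)
MathOptInterface.GreaterThan{Float64}(1.0)

julia> moi_set(SecondOrderCone(), 3)
MathOptInterface.SecondOrderCone(3)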

source

name

JuMP.nameFunction
name(con_ref::ConstraintRef)

Get a constraint's name attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> name(c)
"c"
source
name(v::GenericVariableRef)::String

Get a variable's name attribute.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> name(x[1])
"x[1]"
source
name(model::AbstractModel)

Return the MOI.Name attribute of model's backend, or a default if empty.

Example

julia> model = Model();

julia> name(model)
"A JuMP Model"
source

node_count

JuMP.node_countFunction
node_count(model::GenericModel)

If available, returns the total number of branch-and-bound nodes explored during the most recent optimization in a Mixed Integer Program (the MOI.NodeCount attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> node_count(model)
0
source

normalized_coefficient

JuMP.normalized_coefficientFunction
normalized_coefficient(
    constraint::ConstraintRef,
    variable::GenericVariableRef,
)

Return the coefficient associated with variable in constraint after JuMP has normalized the constraint into its standard form.

See also set_normalized_coefficient.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2

julia> normalized_coefficient(con, x)
5.0

julia> @constraint(model, con_vec, [x, 2x + 1, 3] >= 0)
con_vec : [x, 2 x + 1, 3] ∈ Nonnegatives()

julia> normalized_coefficient(con_vec, x)
2-element Vector{Tuple{Int64, Float64}}:
 (1, 1.0)
 (2, 2.0)
source
normalized_coefficient(
    constraint::ConstraintRef,
    variable_1::GenericVariableRef,
    variable_2::GenericVariableRef,
)

Return the quadratic coefficient associated with variable_1 and variable_2 in constraint after JuMP has normalized the constraint into its standard form.

See also set_normalized_coefficient.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> normalized_coefficient(con, x[1], x[1])
2.0

julia> normalized_coefficient(con, x[1], x[2])
3.0

julia> @constraint(model, con_vec, x.^2 <= [1, 2])
con_vec : [x[1]² - 1, x[2]² - 2] ∈ Nonpositives()

julia> normalized_coefficient(con_vec, x[1], x[1])
1-element Vector{Tuple{Int64, Float64}}:
 (1, 1.0)

julia> normalized_coefficient(con_vec, x[1], x[2])
Tuple{Int64, Float64}[]
source

normalized_rhs

JuMP.normalized_rhsFunction
normalized_rhs(constraint::ConstraintRef)

Return the right-hand side term of constraint after JuMP has converted the constraint into its normalized form.

See also set_normalized_rhs.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1

julia> normalized_rhs(con)
1.0
source

num_constraints

JuMP.num_constraintsFunction
num_constraints(model::GenericModel, function_type, set_type)::Int64

Return the number of constraints currently in the model where the function has type function_type and the set has type set_type.

See also list_of_constraint_types and all_constraints.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Bin);

julia> @variable(model, y);

julia> @constraint(model, y in MOI.GreaterThan(1.0));

julia> @constraint(model, y <= 1.0);

julia> @constraint(model, 2x <= 1);

julia> num_constraints(model, VariableRef, MOI.GreaterThan{Float64})
2

julia> num_constraints(model, VariableRef, MOI.ZeroOne)
1

julia> num_constraints(model, AffExpr, MOI.LessThan{Float64})
2
source
num_constraints(model::GenericModel; count_variable_in_set_constraints::Bool)

Return the number of constraints in model.

If count_variable_in_set_constraints == true, then VariableRef constraints such as VariableRef-in-Integer are included. To count only the number of structural constraints (for example, the rows in the constraint matrix of a linear program), pass count_variable_in_set_constraints = false.

Example

julia> model = Model();

julia> @variable(model, x >= 0, Int);

julia> @constraint(model, 2x <= 1);

julia> num_constraints(model; count_variable_in_set_constraints = true)
3

julia> num_constraints(model; count_variable_in_set_constraints = false)
1
source

num_variables

JuMP.num_variablesFunction
num_variables(model::GenericModel)::Int64

Returns the number of variables in model.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> num_variables(model)
2
source

object_dictionary

JuMP.object_dictionaryFunction
object_dictionary(model::GenericModel)

Return the dictionary that maps the symbol name of a variable, constraint, or expression to the corresponding object.

Objects are registered to a specific symbol in the macros. For example, @variable(model, x[1:2, 1:2]) registers the array of variables x to the symbol :x.

This method should be defined for any subtype of AbstractModel.

See also: unregister.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> object_dictionary(model)
Dict{Symbol, Any} with 1 entry:
  :x => VariableRef[x[1], x[2]]
source

objective_bound

JuMP.objective_boundFunction
objective_bound(model::GenericModel)

Return the best known bound on the optimal objective value after a call to optimize!(model).

For scalar-valued objectives, this function returns a Float64. For vector-valued objectives, it returns a Vector{Float64}.

In the case of a vector-valued objective, this returns the ideal point, that is, the point obtained if each objective was optimized independently.

This function is equivalent to querying the MOI.ObjectiveBound attribute.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1, Int);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> objective_bound(model)
3.0
source

objective_function

JuMP.objective_functionFunction
objective_function(
    model::GenericModel,
    ::Type{F} = objective_function_type(model),
) where {F}

Return an object of type F representing the objective function.

Errors if the objective is not convertible to type F.

This function is equivalent to querying the MOI.ObjectiveFunction{F} attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @objective(model, Min, 2x + 1)
2 x + 1

julia> objective_function(model, AffExpr)
2 x + 1

julia> objective_function(model, QuadExpr)
2 x + 1

julia> typeof(objective_function(model, QuadExpr))
QuadExpr (alias for GenericQuadExpr{Float64, GenericVariableRef{Float64}})

The last two commands show that even though the objective function is affine, it is convertible to a quadratic function, so it can be queried as a quadratic function and the result is quadratic.

However, it is not convertible to a variable:

julia> objective_function(model, VariableRef)
ERROR: InexactError: convert(MathOptInterface.VariableIndex, 1.0 + 2.0 MOI.VariableIndex(1))
[...]
source

objective_function_string

JuMP.objective_function_stringFunction
objective_function_string(mode, model::AbstractModel)::String

Return a String describing the objective function of the model.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2 * x);

julia> objective_function_string(MIME("text/plain"), model)
"2 x"
source

objective_function_type

JuMP.objective_function_typeFunction
objective_function_type(model::GenericModel)::AbstractJuMPScalar

Return the type of the objective function.

This function is equivalent to querying the MOI.ObjectiveFunctionType attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2 * x + 1);

julia> objective_function_type(model)
AffExpr (alias for GenericAffExpr{Float64, GenericVariableRef{Float64}})
source

objective_sense

JuMP.objective_senseFunction
objective_sense(model::GenericModel)::MOI.OptimizationSense

Return the objective sense.

This function is equivalent to querying the MOI.ObjectiveSense attribute.

Example

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2

julia> @variable(model, x);

julia> @objective(model, Max, x)
x

julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
source

objective_value

JuMP.objective_valueFunction
objective_value(model::GenericModel; result::Int = 1)

Return the objective value associated with result index result of the most-recent solution returned by the solver.

For scalar-valued objectives, this function returns a Float64. For vector-valued objectives, it returns a Vector{Float64}.

This function is equivalent to querying the MOI.ObjectiveValue attribute.

See also: result_count.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> objective_value(model)
3.0

julia> objective_value(model; result = 2)
ERROR: Result index of attribute MathOptInterface.ObjectiveValue(2) out of bounds. There are currently 1 solution(s) in the model.
Stacktrace:
[...]
source

op_ifelse

JuMP.op_ifelseFunction
op_ifelse(a, x, y)

A function that falls back to ifelse(a, x, y), but when called with a JuMP variable or expression in the first argument, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_ifelse(true, 1.0, 2.0)
1.0

julia> op_ifelse(x, 1.0, 2.0)
ifelse(x, 1.0, 2.0)

julia> op_ifelse(true, x, 2.0)
x
source

op_string

JuMP.op_stringFunction
op_string(mime::MIME, x::GenericNonlinearExpr, ::Val{op}) where {op}

Return the string that should be printed for the operator op when function_string is called with mime and x.

Example

julia> model = Model();

julia> @variable(model, x[1:2], Bin);

julia> f = @expression(model, x[1] || x[2]);

julia> op_string(MIME("text/plain"), f, Val(:||))
"||"
source

operator_to_set

JuMP.operator_to_setFunction
operator_to_set(error_fn::Function, ::Val{sense_symbol})

Converts a sense symbol to a set set such that @constraint(model, func sense_symbol 0) is equivalent to @constraint(model, func in set) for any func::AbstractJuMPScalar.

Example

Once a custom set is defined you can directly create a JuMP constraint with it:

julia> struct CustomSet{T} <: MOI.AbstractScalarSet
           value::T
       end

julia> Base.copy(x::CustomSet) = CustomSet(x.value)

julia> model = Model();

julia> @variable(model, x)
x

julia> cref = @constraint(model, x in CustomSet(1.0))
x ∈ CustomSet{Float64}(1.0)

However, there might be an appropriate sign that could be used in order to provide a more convenient syntax:

julia> JuMP.operator_to_set(::Function, ::Val{:⊰}) = CustomSet(0.0)

julia> MOIU.supports_shift_constant(::Type{<:CustomSet}) = true

julia> MOIU.shift_constant(set::CustomSet, value) = CustomSet(set.value + value)

julia> cref = @constraint(model, x ⊰ 1)
x ∈ CustomSet{Float64}(1.0)

Note that the whole function is first moved to the left-hand side, then the sign is transformed into a set with zero constant and finally the constant is moved to the set with MOIU.shift_constant.

source

operator_warn

JuMP.operator_warnFunction
operator_warn(model::AbstractModel)
operator_warn(model::GenericModel)

This function is called on the model whenever two affine expressions are added together without using destructive_add!, and at least one of the two expressions has more than 50 terms.

For the case of Model, if this function is called more than 20,000 times then a warning is generated once.

This method should only be implemented by developers creating JuMP extensions. It should never be called by users of JuMP.

source

optimize!

JuMP.optimize!Function
optimize!(
    model::GenericModel;
    ignore_optimize_hook = (model.optimize_hook === nothing),
    kwargs...,
)

Optimize the model.

If an optimizer has not been set yet (see set_optimizer), a NoOptimizer error is thrown.

If ignore_optimize_hook == true, the optimize hook is ignored and the model is solved as if the hook was not set. Keyword arguments kwargs are passed to the optimize_hook. An error is thrown if optimize_hook is nothing and keyword arguments are provided.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> function my_optimize_hook(model; foo)
           println("Hook called with foo = ", foo)
           return optimize!(model; ignore_optimize_hook = true)
       end
my_optimize_hook (generic function with 1 method)

julia> set_optimize_hook(model, my_optimize_hook)
my_optimize_hook (generic function with 1 method)

julia> optimize!(model; foo = 2)
Hook called with foo = 2
source

optimizer_index

JuMP.optimizer_indexFunction
optimizer_index(x::GenericVariableRef)::MOI.VariableIndex
optimizer_index(x::ConstraintRef{<:GenericModel})::MOI.ConstraintIndex

Return the variable or constraint index that corresponds to x in the associated model unsafe_backend(owner_model(x)).

This function should be used with unsafe_backend.

As a safer alternative, use backend and index. See the docstrings of backend and unsafe_backend for more details.

Throws

  • Throws NoOptimizer if no optimizer is set.
  • Throws an ErrorException if the optimizer is set but is not attached.
  • Throws an ErrorException if the index is bridged.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> MOI.Utilities.attach_optimizer(model)

julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> optimizer_index(x)
MOI.VariableIndex(1)
source

optimizer_with_attributes

JuMP.optimizer_with_attributesFunction
optimizer_with_attributes(optimizer_constructor, attrs::Pair...)

Groups an optimizer constructor with the list of attributes attrs. Note that it is equivalent to MOI.OptimizerWithAttributes.

When provided to the Model constructor or to set_optimizer, it creates an optimizer by calling optimizer_constructor(), and then sets the attributes using set_attribute.

See also: set_attribute, get_attribute.

Note

The string names of the attributes are specific to each solver. One should consult the solver's documentation to find the attributes of interest.

Example

julia> import HiGHS

julia> optimizer = optimizer_with_attributes(
           HiGHS.Optimizer, "presolve" => "off", MOI.Silent() => true,
       );

julia> model = Model(optimizer);

is equivalent to:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_attribute(model, "presolve", "off")

julia> set_attribute(model, MOI.Silent(), true)
source

owner_model

JuMP.owner_modelFunction
owner_model(s::AbstractJuMPScalar)

Return the model owning the scalar s.

Example

julia> model = Model();

julia> @variable(model, x);

julia> owner_model(x) === model
true
source

parameter_value

JuMP.parameter_valueFunction
parameter_value(x::GenericVariableRef)

Return the value of the parameter x.

Errors if x is not a parameter.

See also ParameterRef, is_parameter, set_parameter_value.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> parameter_value(p)
2.0

julia> set_parameter_value(p, 2.5)

julia> parameter_value(p)
2.5
source

parse_constraint

JuMP.parse_constraintFunction
parse_constraint(error_fn::Function, expr::Expr)

The entry-point for all constraint-related parsing.

Arguments

  • The error_fn function is passed everywhere to provide better error messages
  • expr comes from the @constraint macro. There are two possibilities:
    • @constraint(model, expr)
    • @constraint(model, name[args], expr)
    In both cases, expr is the main component of the constraint.

Supported syntax

JuMP currently supports the following expr objects:

  • lhs <= rhs
  • lhs == rhs
  • lhs >= rhs
  • l <= body <= u
  • u >= body >= l
  • lhs ⟂ rhs
  • lhs in rhs
  • lhs ∈ rhs
  • z --> {constraint}
  • !z --> {constraint}
  • z <--> {constraint}
  • !z <--> {constraint}
  • z => {constraint}
  • !z => {constraint}

as well as all broadcasted variants.

Extensions

The infrastructure behind parse_constraint is extendable. See parse_constraint_head and parse_constraint_call for details.

source

parse_constraint_call

JuMP.parse_constraint_callFunction
parse_constraint_call(
    error_fn::Function,
    is_vectorized::Bool,
    ::Val{op},
    args...,
)

Implement this method to intercept the parsing of a :call expression with operator op.

Warning

Extending the constraint macro at parse time is an advanced operation and has the potential to interfere with existing JuMP syntax. Please discuss with the developer chatroom before publishing any code that implements these methods.

Arguments

  • error_fn: a function that accepts a String and throws the string as an error, along with some descriptive information of the macro from which it was thrown.
  • is_vectorized: a boolean to indicate if op should be broadcast or not
  • op: the first element of the .args field of the Expr to intercept
  • args...: the .args field of the Expr.

Returns

This function must return:

  • parse_code::Expr: an expression containing any setup or rewriting code that needs to be called before build_constraint
  • build_code::Expr: an expression that calls build_constraint( or build_constraint.( depending on is_vectorized.

See also: parse_constraint_head, build_constraint

source
parse_constraint_call(
    error_fn::Function,
    vectorized::Bool,
    ::Val{op},
    lhs,
    rhs,
) where {op}

Fallback handler for binary operators. These might be infix operators like @constraint(model, lhs op rhs), or normal operators like @constraint(model, op(lhs, rhs)).

In both cases, we rewrite as lhs - rhs in operator_to_set(error_fn, op).

See operator_to_set for details.

source

parse_constraint_head

JuMP.parse_constraint_headFunction
parse_constraint_head(error_fn::Function, ::Val{head}, args...)

Implement this method to intercept the parsing of an expression with head head.

Warning

Extending the constraint macro at parse time is an advanced operation and has the potential to interfere with existing JuMP syntax. Please discuss with the developer chatroom before publishing any code that implements these methods.

Arguments

  • error_fn: a function that accepts a String and throws the string as an error, along with some descriptive information of the macro from which it was thrown.
  • head: the .head field of the Expr to intercept
  • args...: the .args field of the Expr.

Returns

This function must return:

  • is_vectorized::Bool: whether the expression represents a broadcasted expression like x .<= 1
  • parse_code::Expr: an expression containing any setup or rewriting code that needs to be called before build_constraint
  • build_code::Expr: an expression that calls build_constraint( or build_constraint.( depending on is_vectorized.

Existing implementations

JuMP currently implements:

  • ::Val{:call}, which forwards calls to parse_constraint_call
  • ::Val{:comparison}, which handles the special case of l <= body <= u.

See also: parse_constraint_call, build_constraint

source

parse_one_operator_variable

JuMP.parse_one_operator_variableFunction
parse_one_operator_variable(
    error_fn::Function,
    info_expr::_VariableInfoExpr,
    sense::Val{S},
    value,
) where {S}

Update info_expr for a variable expression in the @variable macro of the form variable_name S value.

source

parse_ternary_variable

JuMP.parse_ternary_variableFunction
parse_ternary_variable(error_fn, info_expr, lhs_sense, lhs, rhs_sense, rhs)

A hook for JuMP extensions to intercept the parsing of a :comparison expression, which has the form lhs lhs_sense variable rhs_sense rhs.

source

parse_variable

JuMP.parse_variableFunction
parse_variable(error_fn::Function, ::_VariableInfoExpr, args...)

A hook for extensions to intercept the parsing of inequality constraints in the @variable macro.

source

primal_feasibility_report

JuMP.primal_feasibility_reportFunction
primal_feasibility_report(
    model::GenericModel{T},
    point::AbstractDict{GenericVariableRef{T},T} = _last_primal_solution(model);
    atol::T = zero(T),
    skip_missing::Bool = false,
)::Dict{Any,T}

Given a dictionary point, which maps variables to primal values, return a dictionary whose keys are the constraints with an infeasibility greater than the supplied tolerance atol. The value corresponding to each key is the respective infeasibility. Infeasibility is defined as the distance between the primal value of the constraint (see MOI.ConstraintPrimal) and the nearest point by Euclidean distance in the corresponding set.

Notes

  • If skip_missing = true, constraints containing variables that are not in point will be ignored.
  • If skip_missing = false and a partial primal solution is provided, an error will be thrown.
  • If no point is provided, the primal solution from the last time the model was solved is used.

Example

julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1);

julia> primal_feasibility_report(model, Dict(x => 0.2))
Dict{Any, Float64} with 1 entry:
  x ≥ 0.5 => 0.3
source
primal_feasibility_report(
    point::Function,
    model::GenericModel{T};
    atol::T = zero(T),
    skip_missing::Bool = false,
) where {T}

A form of primal_feasibility_report where a function is passed as the first argument instead of a dictionary as the second argument.

Example

julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1, start = 1.3);

julia> primal_feasibility_report(model) do v
           return start_value(v)
       end
Dict{Any, Float64} with 1 entry:
  x ≤ 1 => 0.3
source

primal_status

JuMP.primal_statusFunction
primal_status(model::GenericModel; result::Int = 1)

Return a MOI.ResultStatusCode describing the status of the most recent primal solution of the solver (that is, the MOI.PrimalStatus attribute) associated with the result index result.

See also: result_count.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> primal_status(model; result = 2)
NO_SOLUTION::ResultStatusCode = 0
source
JuMP.print_active_bridgesFunction
print_active_bridges([io::IO = stdout,] model::GenericModel)

Print a list of the variable, constraint, and objective bridges that are currently used in the model.

source
print_active_bridges([io::IO = stdout,] model::GenericModel, ::Type{F}) where {F}

Print a list of bridges required for an objective function of type F.

source
print_active_bridges(
    [io::IO = stdout,]
    model::GenericModel,
    F::Type,
    S::Type{<:MOI.AbstractSet},
)

Print a list of bridges required for a constraint of type F-in-S.

source
print_active_bridges(
    [io::IO = stdout,]
    model::GenericModel,
    S::Type{<:MOI.AbstractSet},
)

Print a list of bridges required to add a variable constrained to the set S.

source
JuMP.print_bridge_graphFunction
 print_bridge_graph([io::IO,] model::GenericModel)

Print the hyper-graph containing all variable, constraint, and objective types that could be obtained by bridging the variables, constraints, and objectives that are present in the model.

Warning

This function is intended for advanced users. If you want to see only the bridges that are currently used, use print_active_bridges instead.

Explanation of output

Each node in the hyper-graph corresponds to a variable, constraint, or objective type.

  • Variable nodes are indicated by [ ]
  • Constraint nodes are indicated by ( )
  • Objective nodes are indicated by | |

The number inside each pair of brackets is an index of the node in the hyper-graph.

Note that this hyper-graph is the full list of possible transformations. When the bridged model is created, we select the shortest hyper-path(s) from this graph, so many nodes may be unused.

For more information, see Legat, B., Dowson, O., Garcia, J., and Lubin, M. (2020). "MathOptInterface: a data structure for mathematical optimization problems." URL: https://arxiv.org/abs/2002.03447

source

quad_terms

JuMP.quad_termsFunction
quad_terms(quad::GenericQuadExpr{C,V})

Provides an iterator over tuples (coefficient::C, var_1::V, var_2::V) in the quadratic part of the quadratic expression.

source

raw_status

JuMP.raw_statusFunction
raw_status(model::GenericModel)

Return the reason why the solver stopped in its own words (that is, the MathOptInterface model attribute MOI.RawStatusString).

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> raw_status(model)
"optimize not called"
source

read_from_file

JuMP.read_from_fileFunction
read_from_file(
    filename::String;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
    kwargs...,
)

Return a JuMP model read from filename in the format format.

If the filename ends in .gz, it will be uncompressed using GZip. If the filename ends in .bz2, it will be uncompressed using BZip2.

Other kwargs are passed to the Model constructor of the chosen format.
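
Example

A minimal round-trip sketch using write_to_file; the file name my_model.mps is illustrative and the file is written to the current working directory:

julia> model = Model();

julia> @variable(model, x >= 0);

julia> write_to_file(model, "my_model.mps")

julia> new_model = read_from_file("my_model.mps");

julia> num_variables(new_model)
1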

source

reduced_cost

JuMP.reduced_costFunction
reduced_cost(x::GenericVariableRef{T})::T where {T}

Return the reduced cost associated with variable x.

One interpretation of the reduced cost is that it is the change in the objective from an infinitesimal relaxation of the variable bounds.

This method is equivalent to querying the shadow price of the active variable bound (if one exists and is active).

See also: shadow_price.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x <= 1);

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> reduced_cost(x)
2.0
source

relative_gap

JuMP.relative_gapFunction
relative_gap(model::GenericModel)

Return the final relative optimality gap after a call to optimize!(model).

The exact value depends upon the implementation of MOI.RelativeGap by the particular solver used for optimization.

This function is equivalent to querying the MOI.RelativeGap attribute.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 1, Int);

julia> @objective(model, Min, 2 * x + 1);

julia> optimize!(model)

julia> relative_gap(model)
0.0
source

relax_integrality

JuMP.relax_integralityFunction
relax_integrality(model::GenericModel)

Modifies model to "relax" all binary and integrality constraints on variables. Specifically,

  • Binary constraints are deleted, and variable bounds are tightened if necessary to ensure the variable is constrained to the interval $[0, 1]$.
  • Integrality constraints are deleted without modifying variable bounds.
  • An error is thrown if semi-continuous or semi-integer constraints are present (support may be added for these in the future).
  • All other constraints are ignored (left in place). This includes discrete constraints like SOS and indicator constraints.

Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> @variable(model, 1 <= y <= 10, Int);

julia> @objective(model, Min, x + y);

julia> undo_relax = relax_integrality(model);

julia> print(model)
Min x + y
Subject to
 x ≥ 0
 y ≥ 1
 x ≤ 1
 y ≤ 10

julia> undo_relax()

julia> print(model)
Min x + y
Subject to
 y ≥ 1
 y ≤ 10
 y integer
 x binary
source

relax_with_penalty!

JuMP.relax_with_penalty!Function
relax_with_penalty!(
    model::GenericModel{T},
    [penalties::Dict{ConstraintRef,T}];
    [default::Union{Nothing,Real} = nothing,]
) where {T}

Destructively modify the model in-place to create a penalized relaxation of the constraints.

Warning

This is a destructive routine that modifies the model in-place. If you don't want to modify the original model, use copy_model to create a copy before calling relax_with_penalty!.

Reformulation

See MOI.Utilities.ScalarPenaltyRelaxation for details of the reformulation.

For each constraint ci, the penalty passed to MOI.Utilities.ScalarPenaltyRelaxation is get(penalties, ci, default). If the value is nothing, because ci does not exist in penalties and default = nothing, then the constraint is skipped.

Return value

This function returns a Dict{ConstraintRef,AffExpr} that maps each constraint index to the corresponding y + z as an AffExpr. In an optimal solution, query the value of these functions to compute the violation of each constraint.

Relax a subset of constraints

To relax a subset of constraints, pass a penalties dictionary and set default = nothing.

Example

julia> function new_model()
           model = Model()
           @variable(model, x)
           @objective(model, Max, 2x + 1)
           @constraint(model, c1, 2x - 1 <= -2)
           @constraint(model, c2, 3x >= 0)
           return model
       end
new_model (generic function with 1 method)

julia> model_1 = new_model();

julia> penalty_map = relax_with_penalty!(model_1; default = 2.0);

julia> penalty_map[model_1[:c1]]
_[3]

julia> penalty_map[model_1[:c2]]
_[2]

julia> print(model_1)
Max 2 x - 2 _[2] - 2 _[3] + 1
Subject to
 c2 : 3 x + _[2] ≥ 0
 c1 : 2 x - _[3] ≤ -1
 _[2] ≥ 0
 _[3] ≥ 0

julia> model_2 = new_model();

julia> relax_with_penalty!(model_2, Dict(model_2[:c2] => 3.0))
Dict{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}, AffExpr} with 1 entry:
  c2 : 3 x + _[2] ≥ 0 => _[2]

julia> print(model_2)
Max 2 x - 3 _[2] + 1
Subject to
 c2 : 3 x + _[2] ≥ 0
 c1 : 2 x ≤ -1
 _[2] ≥ 0
source

remove_bridge

JuMP.remove_bridgeFunction
remove_bridge(
    model::GenericModel{S},
    BT::Type{<:MOI.Bridges.AbstractBridge};
    coefficient_type::Type{T} = S,
) where {S,T}

Remove BT{T} from the list of bridges that can be used to transform unsupported constraints into an equivalent formulation using only constraints supported by the optimizer.

See also: add_bridge.

Example

julia> model = Model();

julia> add_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> remove_bridge(model, MOI.Bridges.Constraint.SOCtoNonConvexQuadBridge)

julia> add_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64},
       )

julia> remove_bridge(
           model,
           MOI.Bridges.Constraint.NumberConversionBridge;
           coefficient_type = Complex{Float64},
       )
source

reshape_set

JuMP.reshape_setFunction
reshape_set(vectorized_set::MOI.AbstractSet, shape::AbstractShape)

Return a set in its original shape shape given its vectorized form vectorized_set.

Example

Given a SymmetricMatrixShape of vectorized form [1, 2, 3] in MOI.PositiveSemidefiniteConeTriangle(2), the following code returns the set of the original constraint Symmetric(Matrix[1 2; 2 3]) in PSDCone():

julia> reshape_set(MOI.PositiveSemidefiniteConeTriangle(2), SymmetricMatrixShape(2))
PSDCone()
source

reshape_vector

JuMP.reshape_vectorFunction
reshape_vector(vectorized_form::Vector, shape::AbstractShape)

Return an object in its original shape shape given its vectorized form vectorized_form.

Example

Given a SymmetricMatrixShape of vectorized form [1, 2, 3], the following code returns the matrix Symmetric(Matrix[1 2; 2 3]):

julia> reshape_vector([1, 2, 3], SymmetricMatrixShape(2))
2×2 LinearAlgebra.Symmetric{Int64, Matrix{Int64}}:
 1  2
 2  3
source

result_count

JuMP.result_countFunction
result_count(model::GenericModel)

Return the number of results available to query after a call to optimize!.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> result_count(model)
0
source

reverse_sense

JuMP.reverse_senseFunction
reverse_sense(::Val{T}) where {T}

Given an (in)equality symbol T, return a new Val object with the opposite (in)equality symbol.

This function is intended for use in JuMP extensions.

Example

julia> reverse_sense(Val(:>=))
Val{:<=}()
source

set_attribute

JuMP.set_attributeFunction
set_attribute(model::GenericModel, attr::MOI.AbstractModelAttribute, value)
set_attribute(x::GenericVariableRef, attr::MOI.AbstractVariableAttribute, value)
set_attribute(cr::ConstraintRef, attr::MOI.AbstractConstraintAttribute, value)

Set the value of a solver-specific attribute attr to value.

This is equivalent to calling MOI.set with the associated MOI model and, for variables and constraints, with the associated MOI.VariableIndex or MOI.ConstraintIndex.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, c, 2 * x <= 1)
c : 2 x ≤ 1

julia> set_attribute(model, MOI.Name(), "model_new")

julia> set_attribute(x, MOI.VariableName(), "x_new")

julia> set_attribute(c, MOI.ConstraintName(), "c_new")
source
set_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
    value,
)

Set the value of a solver-specific attribute attr to value.

This is equivalent to calling MOI.set with the associated MOI model.

If attr is an AbstractString, it is converted to MOI.RawOptimizerAttribute.

Example

julia> import HiGHS

julia> opt = optimizer_with_attributes(HiGHS.Optimizer, "output_flag" => false);

julia> model = Model(opt);

julia> set_attribute(model, "output_flag", false)

julia> set_attribute(model, MOI.RawOptimizerAttribute("output_flag"), true)

julia> set_attribute(opt, "output_flag", true)

julia> set_attribute(opt, MOI.RawOptimizerAttribute("output_flag"), false)
source

set_attributes

JuMP.set_attributesFunction
set_attributes(
    destination::Union{
        GenericModel,
        MOI.OptimizerWithAttributes,
        GenericVariableRef,
        ConstraintRef,
    },
    pairs::Pair...,
)

Given a list of attribute => value pairs, calls set_attribute(destination, attribute, value) for each pair.

See also: set_attribute, get_attribute.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_attributes(model, "tol" => 1e-4, "max_iter" => 100)

is equivalent to:

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_attribute(model, "tol", 1e-4)

julia> set_attribute(model, "max_iter", 100)
source

set_binary

JuMP.set_binaryFunction
set_binary(v::GenericVariableRef)

Add a constraint on the variable v that it must take values in the set $\{0,1\}$.

See also BinaryRef, is_binary, unset_binary.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_binary(x)
false

julia> set_binary(x)

julia> is_binary(x)
true
source

set_dual_start_value

JuMP.set_dual_start_valueFunction
set_dual_start_value(con_ref::ConstraintRef, value)

Set the dual start value (MOI attribute ConstraintDualStart) of the constraint con_ref to value.

To remove a dual start value set it to nothing.

See also dual_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_dual_start_value(c, [0.0])

julia> dual_start_value(c)
1-element Vector{Float64}:
 0.0

julia> set_dual_start_value(c, nothing)

julia> dual_start_value(c)
source

set_integer

JuMP.set_integerFunction
set_integer(variable_ref::GenericVariableRef)

Add an integrality constraint on the variable variable_ref.

See also IntegerRef, is_integer, unset_integer.

Example

julia> model = Model();

julia> @variable(model, x);

julia> is_integer(x)
false

julia> set_integer(x)

julia> is_integer(x)
true
source

set_lower_bound

set_name

JuMP.set_nameFunction
set_name(con_ref::ConstraintRef, s::AbstractString)

Set a constraint's name attribute.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_name(c, "my_constraint")

julia> name(c)
"my_constraint"

julia> c
my_constraint : [2 x] ∈ Nonnegatives()
source
set_name(v::GenericVariableRef, s::AbstractString)

Set a variable's name attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> set_name(x, "x_foo")

julia> x
x_foo

julia> name(x)
"x_foo"
source

set_normalized_coefficient

JuMP.set_normalized_coefficientFunction
set_normalized_coefficient(
    constraint::ConstraintRef,
    variable::GenericVariableRef,
    value::Number,
)

Set the coefficient of variable in the constraint constraint to value.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x + 3x <= 2, set_normalized_coefficient(con, x, 4) will create the constraint 4x <= 2.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, 2x + 3x <= 2)
con : 5 x ≤ 2

julia> set_normalized_coefficient(con, x, 4)

julia> con
con : 4 x ≤ 2
source
set_normalized_coefficient(
    constraints::AbstractVector{<:ConstraintRef},
    variables::AbstractVector{<:GenericVariableRef},
    values::AbstractVector{<:Number},
)

Set multiple coefficients of variables in the constraints constraints to values.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x + 3x <= 2, set_normalized_coefficient(con, [x], [4]) will create the constraint 4x <= 2.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, y)
y

julia> @constraint(model, con, 2x + 3x + 4y <= 2)
con : 5 x + 4 y ≤ 2

julia> set_normalized_coefficient([con, con], [x, y], [6, 7])

julia> con
con : 6 x + 7 y ≤ 2
source
set_normalized_coefficient(
    con_ref::ConstraintRef,
    variable::AbstractVariableRef,
    new_coefficients::Vector{Tuple{Int64,T}},
)

Set the coefficients of variable in the constraint con_ref to new_coefficients, where each element in new_coefficients is a tuple which maps the row to a new coefficient.

Note that prior to this step, during constraint creation, JuMP will aggregate multiple terms containing the same variable.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, con, [2x + 3x, 4x] in MOI.Nonnegatives(2))
con : [5 x, 4 x] ∈ MathOptInterface.Nonnegatives(2)

julia> set_normalized_coefficient(con, x, [(1, 2.0), (2, 5.0)])

julia> con
con : [2 x, 5 x] ∈ MathOptInterface.Nonnegatives(2)
source
set_normalized_coefficient(
    constraint::ConstraintRef,
    variable_1::GenericVariableRef,
    variable_2::GenericVariableRef,
    value::Number,
)

Set the quadratic coefficient associated with variable_1 and variable_2 in the constraint constraint to value.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x^2 + 3x^2 <= 2, set_normalized_coefficient(con, x, x, 4) will create the constraint 4x^2 <= 2.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> set_normalized_coefficient(con, x[1], x[1], 4)

julia> set_normalized_coefficient(con, x[1], x[2], 5)

julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
source
set_normalized_coefficient(
    constraints::AbstractVector{<:ConstraintRef},
    variables_1::AbstractVector{<:GenericVariableRef},
    variables_2::AbstractVector{<:GenericVariableRef},
    values::AbstractVector{<:Number},
)

Set multiple quadratic coefficients associated with variables_1 and variables_2 in the constraints constraints to values.

Note that prior to this step, JuMP will aggregate multiple terms containing the same variable. For example, given a constraint 2x^2 + 3x^2 <= 2, set_normalized_coefficient(con, [x], [x], [4]) will create the constraint 4x^2 <= 2.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @constraint(model, con, 2x[1]^2 + 3 * x[1] * x[2] + x[2] <= 2)
con : 2 x[1]² + 3 x[1]*x[2] + x[2] ≤ 2

julia> set_normalized_coefficient([con, con], [x[1], x[1]], [x[1], x[2]], [4, 5])

julia> con
con : 4 x[1]² + 5 x[1]*x[2] + x[2] ≤ 2
source

set_normalized_rhs

JuMP.set_normalized_rhsFunction
set_normalized_rhs(constraint::ConstraintRef, value::Number)

Set the right-hand side term of constraint to value.

Note that prior to this step, JuMP will aggregate all constant terms onto the right-hand side of the constraint. For example, given a constraint 2x + 1 <= 2, set_normalized_rhs(con, 4) will create the constraint 2x <= 4, not 2x + 1 <= 4.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con, 2x + 1 <= 2)
con : 2 x ≤ 1

julia> set_normalized_rhs(con, 4)

julia> con
con : 2 x ≤ 4
source
set_normalized_rhs(
    constraints::AbstractVector{<:ConstraintRef},
    values::AbstractVector{<:Number}
)

Set the right-hand side terms of all constraints to values.

Note that prior to this step, JuMP will aggregate all constant terms onto the right-hand side of the constraint. For example, given a constraint 2x + 1 <= 2, set_normalized_rhs([con], [4]) will create the constraint 2x <= 4, not 2x + 1 <= 4.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, con1, 2x + 1 <= 2)
con1 : 2 x ≤ 1

julia> @constraint(model, con2, 3x + 2 <= 4)
con2 : 3 x ≤ 2

julia> set_normalized_rhs([con1, con2], [4, 5])

julia> con1
con1 : 2 x ≤ 4

julia> con2
con2 : 3 x ≤ 5
source

set_objective

JuMP.set_objectiveFunction
set_objective(model::AbstractModel, sense::MOI.OptimizationSense, func)

The functional equivalent of the @objective macro.

Sets the objective sense and objective function simultaneously, and is equivalent to calling set_objective_sense and set_objective_function separately.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> set_objective(model, MIN_SENSE, x)
source

set_objective_coefficient

JuMP.set_objective_coefficientFunction
set_objective_coefficient(
    model::GenericModel,
    variable::GenericVariableRef,
    coefficient::Real,
)

Set the linear objective coefficient associated with variable to coefficient.

Note: this function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, 2x + 1)
2 x + 1

julia> set_objective_coefficient(model, x, 3)

julia> objective_function(model)
3 x + 1
source
set_objective_coefficient(
    model::GenericModel,
    variables::Vector{<:GenericVariableRef},
    coefficients::Vector{<:Real},
)

Set multiple linear objective coefficients associated with variables to coefficients, in a single call.

Note: this function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @objective(model, Min, 3x + 2y + 1)
3 x + 2 y + 1

julia> set_objective_coefficient(model, [x, y], [5, 4])

julia> objective_function(model)
5 x + 4 y + 1
source
set_objective_coefficient(
    model::GenericModel{T},
    variable_1::GenericVariableRef{T},
    variable_2::GenericVariableRef{T},
    coefficient::Real,
) where {T}

Set the quadratic objective coefficient associated with variable_1 and variable_2 to coefficient.

Note: this function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]

julia> set_objective_coefficient(model, x[1], x[1], 2)

julia> set_objective_coefficient(model, x[1], x[2], 3)

julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
source
set_objective_coefficient(
    model::GenericModel{T},
    variables_1::AbstractVector{<:GenericVariableRef{T}},
    variables_2::AbstractVector{<:GenericVariableRef{T}},
    coefficients::AbstractVector{<:Real},
) where {T}

Set multiple quadratic objective coefficients associated with variables_1 and variables_2 to coefficients, in a single call.

Note: this function will throw an error if a nonlinear objective is set.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, x[1]^2 + x[1] * x[2])
x[1]² + x[1]*x[2]

julia> set_objective_coefficient(model, [x[1], x[1]], [x[1], x[2]], [2, 3])

julia> objective_function(model)
2 x[1]² + 3 x[1]*x[2]
source

set_objective_function

JuMP.set_objective_functionFunction
set_objective_function(model::GenericModel, func::MOI.AbstractFunction)
set_objective_function(model::GenericModel, func::AbstractJuMPScalar)
set_objective_function(model::GenericModel, func::Real)
set_objective_function(model::GenericModel, func::Vector{<:AbstractJuMPScalar})

Sets the objective function of the model to the given function.

See set_objective_sense to set the objective sense.

These are low-level functions; the recommended way to set the objective is with the @objective macro.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @objective(model, Min, x);

julia> objective_function(model)
x

julia> set_objective_function(model, 2 * x + 1)

julia> objective_function(model)
2 x + 1
source

set_objective_sense

JuMP.set_objective_senseFunction
set_objective_sense(model::GenericModel, sense::MOI.OptimizationSense)

Sets the objective sense of the model to the given sense.

See set_objective_function to set the objective function.

These are low-level functions; the recommended way to set the objective is with the @objective macro.

Example

julia> model = Model();

julia> objective_sense(model)
FEASIBILITY_SENSE::OptimizationSense = 2

julia> set_objective_sense(model, MOI.MAX_SENSE)

julia> objective_sense(model)
MAX_SENSE::OptimizationSense = 1
source

set_optimize_hook

JuMP.set_optimize_hookFunction
set_optimize_hook(model::GenericModel, f::Union{Function,Nothing})

Set the function f as the optimize hook for model.

f should have a signature f(model::GenericModel; kwargs...), where the kwargs are those passed to optimize!.

Notes

  • The optimize hook should generally modify the model, or some external state in some way, and then call optimize!(model; ignore_optimize_hook = true) to optimize the problem, bypassing the hook.
  • Use set_optimize_hook(model, nothing) to unset an optimize hook.

Example

julia> model = Model();

julia> function my_hook(model::Model; kwargs...)
           println(kwargs)
           println("Calling with `ignore_optimize_hook = true`")
           optimize!(model; ignore_optimize_hook = true)
           return
       end
my_hook (generic function with 1 method)

julia> set_optimize_hook(model, my_hook)
my_hook (generic function with 1 method)

julia> optimize!(model; test_arg = true)
Base.Pairs{Symbol, Bool, Tuple{Symbol}, @NamedTuple{test_arg::Bool}}(:test_arg => 1)
Calling with `ignore_optimize_hook = true`
ERROR: NoOptimizer()
[...]
source

set_optimizer

JuMP.set_optimizerFunction
set_optimizer(
    model::GenericModel,
    optimizer_factory;
    add_bridges::Bool = true,
)

Creates an empty MathOptInterface.AbstractOptimizer instance by calling optimizer_factory() and sets it as the optimizer of model. Specifically, optimizer_factory must be callable with zero arguments and return an empty MathOptInterface.AbstractOptimizer.

If add_bridges is true, constraints and objectives that are not supported by the optimizer are automatically bridged to an equivalent supported formulation. Passing add_bridges = false can improve performance if the solver natively supports all of the elements in model.

See set_attribute for setting solver-specific parameters of the optimizer.

Example

julia> import HiGHS

julia> model = Model();

julia> set_optimizer(model, () -> HiGHS.Optimizer())

julia> set_optimizer(model, HiGHS.Optimizer; add_bridges = false)
source

set_parameter_value

JuMP.set_parameter_valueFunction
set_parameter_value(x::GenericVariableRef, value)

Update the parameter constraint on the variable x to value.

Errors if x is not a parameter.

See also ParameterRef, is_parameter, parameter_value.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> parameter_value(p)
2.0

julia> set_parameter_value(p, 2.5)

julia> parameter_value(p)
2.5
source

set_silent

JuMP.set_silentFunction
set_silent(model::GenericModel)

Takes precedence over any other attribute controlling verbosity and requires the solver to produce no output.

See also: unset_silent.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_silent(model)

julia> get_attribute(model, MOI.Silent())
true

julia> unset_silent(model)

julia> get_attribute(model, MOI.Silent())
false
source

set_start_value

JuMP.set_start_valueFunction
set_start_value(con_ref::ConstraintRef, value)

Set the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref to value.

To remove a primal start value set it to nothing.

See also start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_start_value(c, [4.0])

julia> start_value(c)
1-element Vector{Float64}:
 4.0

julia> set_start_value(c, nothing)

julia> start_value(c)
source
set_start_value(variable::GenericVariableRef, value::Union{Real,Nothing})

Set the start value (MOI.VariablePrimalStart) of the variable to value.

Pass nothing to unset the start value.

Note: VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".

See also: has_start_value, start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(x, nothing)

julia> has_start_value(x)
false

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
source

set_start_values

JuMP.set_start_valuesFunction
set_start_values(
    model::GenericModel;
    variable_primal_start::Union{Nothing,Function} = value,
    constraint_primal_start::Union{Nothing,Function} = value,
    constraint_dual_start::Union{Nothing,Function} = dual,
    nonlinear_dual_start::Union{Nothing,Function} = nonlinear_dual_start_value,
)

Set the primal and dual starting values in model using the functions provided.

If any keyword argument is nothing, the corresponding start value is skipped.

If the optimizer does not support setting the starting value, the value will be skipped.

variable_primal_start

This function controls the primal starting solution for the variables. It is equivalent to calling set_start_value for each variable, or setting the MOI.VariablePrimalStart attribute.

If it is a function, it must have the form variable_primal_start(x::VariableRef) that maps each variable x to the starting primal value.

The default is value.

constraint_primal_start

This function controls the primal starting solution for the constraints. It is equivalent to calling set_start_value for each constraint, or setting the MOI.ConstraintPrimalStart attribute.

If it is a function, it must have the form constraint_primal_start(ci::ConstraintRef) that maps each constraint ci to the starting primal value.

The default is value.

constraint_dual_start

This function controls the dual starting solution for the constraints. It is equivalent to calling set_dual_start_value for each constraint, or setting the MOI.ConstraintDualStart attribute.

If it is a function, it must have the form constraint_dual_start(ci::ConstraintRef) that maps each constraint ci to the starting dual value.

The default is dual.

nonlinear_dual_start

This function controls the dual starting solution for the nonlinear constraints. It is equivalent to calling set_nonlinear_dual_start_value.

If it is a function, it must have the form nonlinear_dual_start(model::GenericModel) that returns a vector corresponding to the dual start of the constraints.

The default is nonlinear_dual_start_value.
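
Example

The following is a minimal sketch (illustrative only; it assumes Ipopt is installed) that copies the primal solution of a solved model back into the model as a warm start for a subsequent solve:

import Ipopt

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, x)
optimize!(model)

# Use the primal solution as the new variable starts; pass `nothing` to skip
# the constraint primal and dual starts.
set_start_values(
    model;
    constraint_primal_start = nothing,
    constraint_dual_start = nothing,
    nonlinear_dual_start = nothing,
)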

source

set_string_names_on_creation

JuMP.set_string_names_on_creationFunction
set_string_names_on_creation(model::GenericModel, value::Bool)

Set the default argument of the set_string_name keyword in the @variable and @constraint macros to value.

The set_string_name keyword is used to determine whether to assign String names to all variables and constraints in model.

By default, value is true. However, for larger models calling set_string_names_on_creation(model, false) can improve performance at the cost of reducing the readability of printing and solver log messages.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_string_names_on_creation(model)
true

julia> set_string_names_on_creation(model, false)

julia> set_string_names_on_creation(model)
false
source

set_time_limit_sec

JuMP.set_time_limit_secFunction
set_time_limit_sec(model::GenericModel, limit::Float64)

Set the time limit (in seconds) of the solver.

Can be unset using unset_time_limit_sec or with limit set to nothing.

See also: unset_time_limit_sec, time_limit_sec.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
source

set_upper_bound

shadow_price

JuMP.shadow_priceFunction
shadow_price(con_ref::ConstraintRef)

Return the change in the objective from an infinitesimal relaxation of the constraint.

The shadow price is computed from dual and can be queried only when has_duals is true and the objective sense is MIN_SENSE or MAX_SENSE (not FEASIBILITY_SENSE).

See also reduced_cost.

Comparison to dual

The shadow prices differ at most in sign from the dual value depending on the objective sense. The differences are summarized in the table:

             Min   Max
f(x) <= b    +1    -1
f(x) >= b    -1    +1

Notes

  • The function simply translates signs from dual and does not validate the conditions needed to guarantee the sensitivity interpretation of the shadow price. The caller is responsible, for example, for checking whether the solver converged to an optimal primal-dual pair or a proof of infeasibility.
  • The computation is based on the current objective sense of the model. If this has changed since the last solve, the results will be incorrect.
  • Relaxation of equality constraints (and hence the shadow price) is defined based on which sense of the equality constraint is active.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x);

julia> @constraint(model, c, x <= 1)
c : x ≤ 1

julia> @objective(model, Max, 2 * x + 1);

julia> optimize!(model)

julia> has_duals(model)
true

julia> shadow_price(c)
2.0
source

shape

JuMP.shapeFunction
shape(c::AbstractConstraint)::AbstractShape

Return the shape of the constraint c.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x[2] <= 1);

julia> shape(constraint_object(c))
ScalarShape()

julia> d = @constraint(model, x in SOS1());

julia> shape(constraint_object(d))
VectorShape()
source

show_backend_summary

JuMP.show_backend_summaryFunction
show_backend_summary(io::IO, model::GenericModel)

Print a summary of the optimizer backing model.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> show_backend_summary(stdout, model)
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
source

show_constraints_summary

JuMP.show_constraints_summaryFunction
show_constraints_summary(io::IO, model::AbstractModel)

Write to io a summary of the number of constraints.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> @variable(model, x >= 0);

julia> show_constraints_summary(stdout, model)
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 1 constraint
source

show_objective_function_summary

JuMP.show_objective_function_summaryFunction
show_objective_function_summary(io::IO, model::AbstractModel)

Write to io a summary of the objective function type.

Extensions

AbstractModels should implement this method.

Example

julia> model = Model();

julia> show_objective_function_summary(stdout, model)
Objective function type: AffExpr
source

simplex_iterations

JuMP.simplex_iterationsFunction
simplex_iterations(model::GenericModel)

If available, returns the cumulative number of simplex iterations during the most-recent optimization (the MOI.SimplexIterations attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> simplex_iterations(model)
0
source

solution_summary

JuMP.solution_summaryFunction
solution_summary(model::GenericModel; result::Int = 1, verbose::Bool = false)

Return a struct that can be used to print a summary of the solution in result result.

If verbose=true, write out the primal solution for every variable and the dual solution for every constraint, excluding those with empty names.

Example

When called at the REPL, the summary is automatically printed:

julia> model = Model();

julia> solution_summary(model)
* Solver : No optimizer attached.

* Status
  Result count       : 0
  Termination status : OPTIMIZE_NOT_CALLED
  Message from the solver:
  "optimize not called"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : NO_SOLUTION

* Work counters

Use print to force the printing of the summary from inside a function:

julia> model = Model();

julia> function foo(model)
           print(solution_summary(model))
           return
       end
foo (generic function with 1 method)

julia> foo(model)
* Solver : No optimizer attached.

* Status
  Result count       : 0
  Termination status : OPTIMIZE_NOT_CALLED
  Message from the solver:
  "optimize not called"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : NO_SOLUTION

* Work counters
source

solve_time

JuMP.solve_timeFunction
solve_time(model::GenericModel)

If available, returns the solve time in wall-clock seconds reported by the solver (the MOI.SolveTimeSec attribute).

Throws a MOI.GetAttributeNotAllowed error if the attribute is not implemented by the solver.

Example

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> optimize!(model)

julia> solve_time(model)
1.0488089174032211e-5
source

solver_name

JuMP.solver_nameFunction
solver_name(model::GenericModel)

If available, returns the MOI.SolverName property of the underlying optimizer.

Returns "No optimizer attached." in AUTOMATIC or MANUAL modes when no optimizer is attached.

Returns "SolverName() attribute not implemented by the optimizer." if the attribute is not implemented.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> solver_name(model)
"Ipopt"

julia> model = Model();

julia> solver_name(model)
"No optimizer attached."

julia> model = Model(MOI.FileFormats.MPS.Model);

julia> solver_name(model)
"SolverName() attribute not implemented by the optimizer."
source

start_value

JuMP.start_valueFunction
start_value(con_ref::ConstraintRef)

Return the primal start value (MOI.ConstraintPrimalStart) of the constraint con_ref.

If no primal start value has been set, start_value will return nothing.

See also set_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @constraint(model, c, [2x] in Nonnegatives())
c : [2 x] ∈ Nonnegatives()

julia> set_start_value(c, [4.0])

julia> start_value(c)
1-element Vector{Float64}:
 4.0

julia> set_start_value(c, nothing)

julia> start_value(c)
source
start_value(v::GenericVariableRef)

Return the start value (MOI.VariablePrimalStart) of the variable v.

Note: VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".

See also: has_start_value, set_start_value.

Example

julia> model = Model();

julia> @variable(model, x, start = 1.5);

julia> @variable(model, y);

julia> has_start_value(x)
true

julia> has_start_value(y)
false

julia> start_value(x)
1.5

julia> set_start_value(y, 2.0)

julia> has_start_value(y)
true

julia> start_value(y)
2.0
source

termination_status

time_limit_sec

JuMP.time_limit_secFunction
time_limit_sec(model::GenericModel)

Return the time limit (in seconds) of the model.

Returns nothing if unset.

See also: set_time_limit_sec, unset_time_limit_sec.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
source

triangle_vec

JuMP.triangle_vecFunction
triangle_vec(matrix::Matrix)

Return the upper triangle of a matrix concatenated into a vector in the order required by JuMP and MathOptInterface for Triangle sets.

Example

julia> model = Model();

julia> @variable(model, X[1:3, 1:3], Symmetric);

julia> @variable(model, t)
t

julia> @constraint(model, [t; triangle_vec(X)] in MOI.RootDetConeTriangle(3))
[t, X[1,1], X[1,2], X[2,2], X[1,3], X[2,3], X[3,3]] ∈ MathOptInterface.RootDetConeTriangle(3)
source

unfix

JuMP.unfixFunction
unfix(v::GenericVariableRef)

Delete the fixing constraint of a variable.

Error if one does not exist.

See also FixRef, is_fixed, fix_value, fix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> is_fixed(x)
true

julia> unfix(x)

julia> is_fixed(x)
false
source

unregister

JuMP.unregisterFunction
unregister(model::GenericModel, key::Symbol)

Unregister the name key from model so that a new variable, constraint, or expression can be created with the same key.

Note that this will not delete the object model[key]; it will just remove the reference at model[key]. To delete the object, use delete as well.

See also: delete, object_dictionary.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, x)
ERROR: An object of name x is already attached to this model. If this
    is intended, consider using the anonymous construction syntax, for example,
    `x = @variable(model, [1:N], ...)` where the name of the object does
    not appear inside the macro.

    Alternatively, use `unregister(model, :x)` to first unregister
    the existing name from the model. Note that this will not delete the
    object; it will just remove the reference at `model[:x]`.

Stacktrace:
[...]

julia> num_variables(model)
1

julia> unregister(model, :x)

julia> @variable(model, x)
x

julia> num_variables(model)
2
source

unsafe_backend

JuMP.unsafe_backendFunction
unsafe_backend(model::GenericModel)

Return the innermost optimizer associated with the JuMP model model.

This function should only be used by advanced users looking to access low-level solver-specific functionality. It has a high-risk of incorrect usage. We strongly suggest you use the alternative suggested below.

See also: backend.

To obtain the index of a variable or constraint in the unsafe backend, use optimizer_index.

Unsafe behavior

This function is unsafe for two main reasons.

First, the formulation and order of variables and constraints in the unsafe backend may be different to the variables and constraints in model. This can happen because of bridges, or because the solver requires the variables or constraints in a specific order. In addition, the variable or constraint index returned by index at the JuMP level may be different to the index of the corresponding variable or constraint in the unsafe_backend. There is no solution to this. Use the alternative suggested below instead.

Second, the unsafe_backend may be empty, or lack some modifications made to the JuMP model. Thus, before calling unsafe_backend you should first call MOI.Utilities.attach_optimizer to ensure that the backend is synchronized with the JuMP model.

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer)
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> MOI.Utilities.attach_optimizer(model)

julia> inner = unsafe_backend(model)
A HiGHS model with 0 columns and 0 rows.

Moreover, if you modify the JuMP model, the reference you have to the backend (that is, inner in the example above) may be out-dated, and you should call MOI.Utilities.attach_optimizer again.

This function is also unsafe in the reverse direction: if you modify the unsafe backend, for example, by adding a new constraint to inner, the changes may be silently discarded by JuMP when the JuMP model is modified or solved.

Alternative

Instead of unsafe_backend, create a model using direct_model and call backend instead.

For example, instead of:

julia> import HiGHS

julia> model = Model(HiGHS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> MOI.Utilities.attach_optimizer(model)

julia> highs = unsafe_backend(model)
A HiGHS model with 1 columns and 0 rows.

julia> optimizer_index(x)
MOI.VariableIndex(1)

Use:

julia> import HiGHS

julia> model = direct_model(HiGHS.Optimizer());

julia> set_silent(model)

julia> @variable(model, x >= 0)
x

julia> highs = backend(model)  # No need to call `attach_optimizer`.
A HiGHS model with 1 columns and 0 rows.

julia> index(x)
MOI.VariableIndex(1)
source

unset_binary

JuMP.unset_binaryFunction
unset_binary(variable_ref::GenericVariableRef)

Remove the binary constraint on the variable variable_ref.

See also BinaryRef, is_binary, set_binary.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> is_binary(x)
true

julia> unset_binary(x)

julia> is_binary(x)
false
source

unset_integer

JuMP.unset_integerFunction
unset_integer(variable_ref::GenericVariableRef)

Remove the integrality constraint on the variable variable_ref.

Errors if one does not exist.

See also IntegerRef, is_integer, set_integer.

Example

julia> model = Model();

julia> @variable(model, x, Int);

julia> is_integer(x)
true

julia> unset_integer(x)

julia> is_integer(x)
false
source

unset_silent

JuMP.unset_silentFunction
unset_silent(model::GenericModel)

Neutralize the effect of the set_silent function and let the solver attributes control the verbosity.

See also: set_silent.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_silent(model)

julia> get_attribute(model, MOI.Silent())
true

julia> unset_silent(model)

julia> get_attribute(model, MOI.Silent())
false
source

unset_time_limit_sec

JuMP.unset_time_limit_secFunction
unset_time_limit_sec(model::GenericModel)

Unset the time limit of the solver.

See also: set_time_limit_sec, time_limit_sec.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> time_limit_sec(model)

julia> set_time_limit_sec(model, 60.0)

julia> time_limit_sec(model)
60.0

julia> unset_time_limit_sec(model)

julia> time_limit_sec(model)
source

upper_bound

value

JuMP.valueFunction
value(con_ref::ConstraintRef; result::Int = 1)

Return the primal value of constraint con_ref associated with result index result of the most-recent solution returned by the solver.

That is, if con_ref is the reference of a constraint func-in-set, it returns the value of func evaluated at the value of the variables (given by value(::GenericVariableRef)).

Use has_values to check if a result exists before asking for values.

See also: result_count.

Note

For scalar constraints, the constant is moved to the set so it is not taken into account in the primal value of the constraint. For instance, the constraint @constraint(model, 2x + 3y + 1 == 5) is transformed into 2x + 3y-in-MOI.EqualTo(4) so the value returned by this function is the evaluation of 2x + 3y.

source
value(var_value::Function, con_ref::ConstraintRef)

Evaluate the primal value of the constraint con_ref using var_value(v) as the value for each variable v.

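Example

A small sketch (illustrative only) that evaluates the constraint's function at the variable start values. Note that the constant 1 is moved into the set, so only 2x + 3y is evaluated:

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> @variable(model, y, start = 0.0);

julia> @constraint(model, c, 2x + 3y + 1 == 5);

julia> value(start_value, c)
4.0
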
source
value(v::GenericVariableRef; result = 1)

Return the value of variable v associated with result index result of the most-recent solution returned by the solver.

Use has_values to check if a result exists before asking for values.

See also: result_count.

source
value(var_value::Function, v::GenericVariableRef)

Evaluate the value of the variable v as var_value(v).

source
value(var_value::Function, ex::GenericAffExpr)

Evaluate ex using var_value(v) as the value for each variable v.

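Example

A small sketch (illustrative only) that evaluates an affine expression at the variable start values:

julia> model = Model();

julia> @variable(model, x, start = 2.0);

julia> value(start_value, 2x + 1)
5.0
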
source
value(v::GenericAffExpr; result::Int = 1)

Return the value of the GenericAffExpr v associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

source
value(var_value::Function, ex::GenericQuadExpr)

Evaluate ex using var_value(v) as the value for each variable v.

source
value(v::GenericQuadExpr; result::Int = 1)

Return the value of the GenericQuadExpr v associated with result index result of the most-recent solution returned by the solver.

Replaces getvalue for most use cases.

See also: result_count.

source
value(p::NonlinearParameter)

Return the current value stored in the nonlinear parameter p.

Example

julia> model = Model();

julia> @NLparameter(model, p == 10)
p == 10.0

julia> value(p)
10.0
source
value(ex::NonlinearExpression; result::Int = 1)

Return the value of the NonlinearExpression ex associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

source
value(var_value::Function, ex::NonlinearExpression)

Evaluate ex using var_value(v) as the value for each variable v.

source
value(c::NonlinearConstraintRef; result::Int = 1)

Return the value of the NonlinearConstraintRef c associated with result index result of the most-recent solution returned by the solver.

See also: result_count.

source
value(var_value::Function, c::NonlinearConstraintRef)

Evaluate c using var_value(v) as the value for each variable v.

source

value_type

JuMP.value_typeFunction
value_type(::Type{<:Union{AbstractModel,AbstractVariableRef}})

Return the return type of value for variables of that model. It defaults to Float64 if it is not implemented.

Example

julia> value_type(GenericModel{BigFloat})
BigFloat
source

variable_by_name

JuMP.variable_by_nameFunction
variable_by_name(
    model::AbstractModel,
    name::String,
)::Union{AbstractVariableRef,Nothing}

Returns the reference of the variable with name attribute name, or nothing if no variable has this name attribute. Throws an error if several variables have name as their name attribute.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> variable_by_name(model, "x")
x

julia> @variable(model, base_name="x")
x

julia> variable_by_name(model, "x")
ERROR: Multiple variables have the name x.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] get(::MOIU.Model{Float64}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/model.jl:222
 [3] get at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/universalfallback.jl:201 [inlined]
 [4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MOIU.Model{Float64}}}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/cachingoptimizer.jl:490
 [5] variable_by_name(::GenericModel, ::String) at /home/blegat/.julia/dev/JuMP/src/variables.jl:268
 [6] top-level scope at none:0

julia> var = @variable(model, base_name="y")
y

julia> variable_by_name(model, "y")
y

julia> set_name(var, "z")

julia> variable_by_name(model, "y")

julia> variable_by_name(model, "z")
z

julia> @variable(model, u[1:2])
2-element Vector{VariableRef}:
 u[1]
 u[2]

julia> variable_by_name(model, "u[2]")
u[2]
source

variable_ref_type

JuMP.variable_ref_typeFunction
variable_ref_type(::Union{F,Type{F}}) where {F}

A helper function used internally by JuMP and some JuMP extensions. Returns the variable type associated with the model or expression type F.

source

vectorize

JuMP.vectorizeFunction
vectorize(matrix::AbstractMatrix, ::Shape)

Convert the matrix into a vector according to Shape.

source

write_to_file

JuMP.write_to_fileFunction
write_to_file(
    model::GenericModel,
    filename::String;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_AUTOMATIC,
    kwargs...,
)

Write the JuMP model model to filename in the format format.

If the filename ends in .gz, it will be compressed using GZip. If the filename ends in .bz2, it will be compressed using BZip2.

Other kwargs are passed to the Model constructor of the chosen format.

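Example

An illustrative sketch; it writes a file called model.mps (the format is inferred from the file extension) into the current working directory:

julia> model = Model();

julia> @variable(model, x >= 0);

julia> @objective(model, Min, 2 * x + 1);

julia> write_to_file(model, "model.mps")
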
source

AbstractConstraint

JuMP.AbstractConstraintType
abstract type AbstractConstraint

An abstract base type for all constraint types. AbstractConstraints store the function and set directly, unlike ConstraintRefs that are merely references to constraints stored in a model. AbstractConstraints do not need to be attached to a model.

source

AbstractJuMPScalar

JuMP.AbstractJuMPScalarType
AbstractJuMPScalar <: MutableArithmetics.AbstractMutable

Abstract base type for all scalar types

The subtyping of AbstractMutable will allow calls of some Base functions to be redirected to a method in MA that handles type promotion more carefully (for example the promotion in sparse matrix products in SparseArrays usually does not work for JuMP types) and exploits the mutability of AffExpr and QuadExpr.

source

AbstractModel

JuMP.AbstractModelType
AbstractModel

An abstract type that should be subtyped for users creating JuMP extensions.

source

AbstractScalarSet

JuMP.AbstractScalarSetType
AbstractScalarSet

An abstract type for defining new scalar sets in JuMP.

Implement moi_set(::AbstractScalarSet) to convert the type into an MOI set.

See also: moi_set.

source

AbstractShape

AbstractVariable

AbstractVariableRef

JuMP.AbstractVariableRefType
AbstractVariableRef

Variable returned by add_variable. Affine (resp. quadratic) operations with variables of type V<:AbstractVariableRef and coefficients of type T create a GenericAffExpr{T,V} (resp. GenericQuadExpr{T,V}).

source

AbstractVectorSet

JuMP.AbstractVectorSetType
AbstractVectorSet

An abstract type for defining new sets in JuMP.

Implement moi_set(::AbstractVectorSet, dim::Int) to convert the type into an MOI set.

See also: moi_set.

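Example

A minimal sketch of a custom vector set; MyNonnegatives is a hypothetical name used only for illustration, and it simply forwards to the existing MOI.Nonnegatives set:

struct MyNonnegatives <: JuMP.AbstractVectorSet end

# `dim` is the dimension of the function being constrained.
JuMP.moi_set(::MyNonnegatives, dim::Int) = MOI.Nonnegatives(dim)

With this definition, @constraint(model, x in MyNonnegatives()) builds the corresponding vector constraint.
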
source

AffExpr

ArrayShape

JuMP.ArrayShapeType
ArrayShape{N}(dims::NTuple{N,Int}) where {N}

An AbstractShape that represents array-valued constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2, 1:3]);

julia> c = @constraint(model, x >= 0, Nonnegatives())
[x[1,1]  x[1,2]  x[1,3]
 x[2,1]  x[2,2]  x[2,3]] ∈ Nonnegatives()

julia> shape(constraint_object(c))
ArrayShape{2}((2, 3))
source

BinaryRef

JuMP.BinaryRefFunction
BinaryRef(v::GenericVariableRef)

Return a constraint reference to the constraint constraining v to be binary. Errors if one does not exist.

See also is_binary, set_binary, unset_binary.

Example

julia> model = Model();

julia> @variable(model, x, Bin);

julia> BinaryRef(x)
x binary
source

BridgeableConstraint

JuMP.BridgeableConstraintType
BridgeableConstraint(
    constraint::C,
    bridge_type::B;
    coefficient_type::Type{T} = Float64,
) where {C<:AbstractConstraint,B<:Type{<:MOI.Bridges.AbstractBridge},T}

An AbstractConstraint representing a constraint that can be bridged by a bridge of type bridge_type{coefficient_type}.

Adding a BridgeableConstraint to a model is equivalent to:

add_bridge(model, bridge_type; coefficient_type = coefficient_type)
add_constraint(model, constraint)

Example

Given a new scalar set type CustomSet with a bridge CustomBridge that can bridge F-in-CustomSet constraints, when the user does:

model = Model()
@variable(model, x)
@constraint(model, x + 1 in CustomSet())
optimize!(model)

with an optimizer that does not support F-in-CustomSet constraints, the constraint will not be bridged unless they first call add_bridge(model, CustomBridge).

In order to automatically add the CustomBridge to any model to which an F-in-CustomSet is added, add the following method:

function JuMP.build_constraint(
    error_fn::Function,
    func::AbstractJuMPScalar,
    set::CustomSet,
)
    constraint = ScalarConstraint(func, set)
    return BridgeableConstraint(constraint, CustomBridge)
end

Note

JuMP extensions should extend JuMP.build_constraint only if they also defined CustomSet, for three reasons:

  1. It is problematic if multiple extensions overload the same JuMP method.
  2. A missing method will not inform the users that they forgot to load the extension module defining the build_constraint method.
  3. Defining a method where neither the function nor any of the argument types are defined in the package is called type piracy and is discouraged in the Julia style guide.
source

ComplexPlane

JuMP.ComplexPlaneType
ComplexPlane

Complex plane object that can be used to create a complex variable in the @variable macro.

Example

Consider the following example:

julia> model = Model();

julia> @variable(model, x in ComplexPlane())
real(x) + imag(x) im

julia> all_variables(model)
2-element Vector{VariableRef}:
 real(x)
 imag(x)

We see in the output of the last command that two real variables were created. The Julia variable x binds to an affine expression in terms of these two variables that parametrize the complex plane.

source

ComplexVariable

ConstraintNotOwned

JuMP.ConstraintNotOwnedType
struct ConstraintNotOwned{C<:ConstraintRef} <: Exception
    constraint_ref::C
end

An error thrown when the constraint constraint_ref was used in a model different to owner_model(constraint_ref).

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, x >= 0)
c : x ≥ 0

julia> model_new = Model();

julia> MOI.get(model_new, MOI.ConstraintName(), c)
ERROR: ConstraintNotOwned{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.GreaterThan{Float64}}, ScalarShape}}(c : x ≥ 0)
Stacktrace:
[...]
source

ConstraintRef

FixRef

JuMP.FixRefFunction
FixRef(v::GenericVariableRef)

Return a constraint reference to the constraint fixing the value of v.

Errors if one does not exist.

See also is_fixed, fix_value, fix, unfix.

Example

julia> model = Model();

julia> @variable(model, x == 1);

julia> FixRef(x)
x = 1
source

GenericAffExpr

JuMP.GenericAffExprType
mutable struct GenericAffExpr{CoefType,VarType} <: AbstractJuMPScalar
    constant::CoefType
    terms::OrderedDict{VarType,CoefType}
end

An expression type representing an affine expression of the form: $\sum a_i x_i + c$.

Fields

  • .constant: the constant c in the expression.
  • .terms: an OrderedDict, with keys of VarType and values of CoefType describing the sparse vector a.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = x[2] + 3.0 * x[1] + 4.0
x[2] + 3 x[1] + 4

julia> expr.constant
4.0

julia> expr.terms
OrderedCollections.OrderedDict{VariableRef, Float64} with 2 entries:
  x[2] => 1.0
  x[1] => 3.0
source

GenericModel

JuMP.GenericModelType
GenericModel{T}(
    [optimizer_factory;]
    add_bridges::Bool = true,
) where {T<:Real}

Create a new instance of a JuMP model.

If optimizer_factory is provided, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).

If optimizer_factory is not provided, use set_optimizer to set the optimizer before calling optimize!.

If add_bridges is true, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.

Value type T

Passing a type other than Float64 as the value type T is an advanced operation. The value type must match that expected by the chosen optimizer. Consult the optimizer's documentation for details.

If not documented, assume that the optimizer supports only Float64.

Choosing an unsupported value type will throw an MOI.UnsupportedConstraint or an MOI.UnsupportedAttribute error, the timing of which (during the model construction or during a call to optimize!) depends on how the solver is interfaced to JuMP.

Example

julia> model = GenericModel{BigFloat}();

julia> typeof(model)
GenericModel{BigFloat}
source

GenericNonlinearExpr

JuMP.GenericNonlinearExprType
GenericNonlinearExpr{V}(head::Symbol, args::Vector{Any})
GenericNonlinearExpr{V}(head::Symbol, args::Any...)

The scalar-valued nonlinear function head(args...), represented as a symbolic expression tree, with the call operator head and ordered arguments in args.

V is the type of AbstractVariableRef present in the expression, and is used to help dispatch JuMP extensions.

head

The head::Symbol must be an operator supported by the model.

The default list of supported univariate operators is given by MOI.Nonlinear.DEFAULT_UNIVARIATE_OPERATORS, and the default list of supported multivariate operators is given by MOI.Nonlinear.DEFAULT_MULTIVARIATE_OPERATORS.

Additional operators can be added using @operator.

See the full list of operators supported by a MOI.ModelLike by querying the MOI.ListOfSupportedNonlinearOperators attribute.

args

The vector args contains the arguments to the nonlinear function. If the operator is univariate, it must contain one element. Otherwise, it may contain multiple elements.

Given a subtype of AbstractVariableRef, V, for GenericNonlinearExpr{V}, each element must be one of the following:

  • a constant value of type T<:Real
  • a V
  • a GenericAffExpr{T,V}
  • a GenericQuadExpr{T,V}
  • a GenericNonlinearExpr{V}

where T<:Real and T == value_type(V).

Unsupported operators

If the optimizer does not support head, an MOI.UnsupportedNonlinearOperator error will be thrown.

There is no guarantee about when this error will be thrown; it may be thrown when the function is first added to the model, or it may be thrown when optimize! is called.

Example

To represent the function $f(x) = sin(x)^2$, do:

julia> model = Model();

julia> @variable(model, x)
x

julia> f = sin(x)^2
sin(x) ^ 2.0

julia> f = GenericNonlinearExpr{VariableRef}(
           :^,
           GenericNonlinearExpr{VariableRef}(:sin, x),
           2.0,
       )
sin(x) ^ 2.0
source

GenericQuadExpr

JuMP.GenericQuadExprType
mutable struct GenericQuadExpr{CoefType,VarType} <: AbstractJuMPScalar
    aff::GenericAffExpr{CoefType,VarType}
    terms::OrderedDict{UnorderedPair{VarType}, CoefType}
end

An expression type representing a quadratic expression of the form: $\sum q_{i,j} x_i x_j + \sum a_i x_i + c$.

Fields

  • .aff: an GenericAffExpr representing the affine portion of the expression.
  • .terms: an OrderedDict, with keys of UnorderedPair{VarType} and values of CoefType, describing the sparse list of terms q.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1]^2 + x[1] * x[2] + 3.0 * x[1] + 4.0
2 x[1]² + x[1]*x[2] + 3 x[1] + 4

julia> expr.aff
3 x[1] + 4

julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 2 entries:
  UnorderedPair{VariableRef}(x[1], x[1]) => 2.0
  UnorderedPair{VariableRef}(x[1], x[2]) => 1.0
source

GenericReferenceMap

JuMP.GenericReferenceMapType
GenericReferenceMap{T}

Mapping between the variable and constraint references of a model and those of its copy. The reference in the copied model can be obtained by indexing the map with the corresponding reference in the original model.

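Example

A small sketch (illustrative only) using copy_model, which returns a GenericReferenceMap as its second value:

julia> model = Model();

julia> @variable(model, x);

julia> new_model, reference_map = copy_model(model);

julia> x_new = reference_map[x]
x
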
source

GenericVariableRef

JuMP.GenericVariableRefType
GenericVariableRef{T} <: AbstractVariableRef

Holds a reference to the model and the corresponding MOI.VariableIndex.

source

GreaterThanZero

JuMP.GreaterThanZeroType
GreaterThanZero()

A struct used to intercept when >= or ≥ is used in a macro via operator_to_set.

This struct is not the same as Nonnegatives so that we can disambiguate x >= y and x - y in Nonnegatives().

This struct is not intended for general usage, but it may be useful to some JuMP extensions.

Example

julia> operator_to_set(error, Val(:>=))
GreaterThanZero()
source

HermitianMatrixAdjointShape

HermitianMatrixShape

JuMP.HermitianMatrixShapeType
HermitianMatrixShape(
    side_dimension::Int;
    needs_adjoint_dual::Bool = false,
)

The shape object for a Hermitian square matrix of side_dimension rows and columns.

The vectorized form corresponds to MOI.HermitianPositiveSemidefiniteConeTriangle.

needs_adjoint_dual

By default, the dual_shape of HermitianMatrixShape is also HermitianMatrixShape. This is true for cases such as a LinearAlgebra.Hermitian matrix in HermitianPSDCone.

However, JuMP also supports LinearAlgebra.Hermitian matrix in Zeros, which is interpreted as an element-wise equality constraint. By exploiting symmetry, we pass only the upper triangle of the equality constraints. This works for the primal, but it leads to a factor of 2 difference in the off-diagonal dual elements. (The dual value of the (i, j) element in the triangle formulation should be divided by 2 when spread across the (i, j) and (j, i) elements in the square matrix formulation.) If the constraint has this dual inconsistency, set needs_adjoint_dual = true.

source

HermitianMatrixSpace

JuMP.HermitianMatrixSpaceType
HermitianMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be hermitian.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in HermitianMatrixSpace())
2×2 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
 real(Q[1,1])                    real(Q[1,2]) + imag(Q[1,2]) im
 real(Q[1,2]) - imag(Q[1,2]) im  real(Q[2,2])
source

HermitianPSDCone

JuMP.HermitianPSDConeType
HermitianPSDCone

Hermitian positive semidefinite cone object that can be used to create a Hermitian positive semidefinite square matrix in the @variable and @constraint macros.

Example

Consider the following example:

julia> model = Model();

julia> @variable(model, H[1:3, 1:3] in HermitianPSDCone())
3×3 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
 real(H[1,1])                    …  real(H[1,3]) + imag(H[1,3]) im
 real(H[1,2]) - imag(H[1,2]) im     real(H[2,3]) + imag(H[2,3]) im
 real(H[1,3]) - imag(H[1,3]) im     real(H[3,3])

julia> all_variables(model)
9-element Vector{VariableRef}:
 real(H[1,1])
 real(H[1,2])
 real(H[2,2])
 real(H[1,3])
 real(H[2,3])
 real(H[3,3])
 imag(H[1,2])
 imag(H[1,3])
 imag(H[2,3])

julia> all_constraints(model, Vector{VariableRef}, MOI.HermitianPositiveSemidefiniteConeTriangle)
1-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.VectorOfVariables, MathOptInterface.HermitianPositiveSemidefiniteConeTriangle}}}:
 [real(H[1,1]), real(H[1,2]), real(H[2,2]), real(H[1,3]), real(H[2,3]), real(H[3,3]), imag(H[1,2]), imag(H[1,3]), imag(H[2,3])] ∈ MathOptInterface.HermitianPositiveSemidefiniteConeTriangle(3)

We see in the output of the last commands that 9 real variables were created. The matrix H contains affine expressions in terms of these 9 variables that parametrize a Hermitian matrix.

source

IntegerRef

JuMP.IntegerRefFunction
IntegerRef(v::GenericVariableRef)

Return a constraint reference to the constraint constraining v to be integer.

Errors if one does not exist.

See also is_integer, set_integer, unset_integer.

Example

julia> model = Model();

julia> @variable(model, x, Int);

julia> IntegerRef(x)
x integer
source

LPMatrixData

LessThanZero

JuMP.LessThanZeroType
LessThanZero()

A struct used to intercept when <= or ≤ is used in a macro via operator_to_set.

This struct is not the same as Nonpositives so that we can disambiguate x <= y and x - y in Nonpositives().

This struct is not intended for general usage, but it may be useful to some JuMP extensions.

Example

julia> operator_to_set(error, Val(:<=))
LessThanZero()
source

LinearTermIterator

JuMP.LinearTermIteratorType
LinearTermIterator{GAE<:GenericAffExpr}

A struct that implements the iterate protocol in order to iterate over tuples of (coefficient, variable) in the GenericAffExpr.

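Example

A small sketch (illustrative only); linear_terms returns a LinearTermIterator over the (coefficient, variable) pairs of the expression:

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> ex = 2.0 * x[1] + 3.0 * x[2] + 4.0;

julia> for (coefficient, variable) in linear_terms(ex)
           println(coefficient, " * ", variable)
       end
2.0 * x[1]
3.0 * x[2]
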
source

LowerBoundRef

Model

JuMP.ModelType
Model([optimizer_factory;] add_bridges::Bool = true)

Create a new instance of a JuMP model.

If optimizer_factory is provided, the model is initialized with the optimizer returned by MOI.instantiate(optimizer_factory).

If optimizer_factory is not provided, use set_optimizer to set the optimizer before calling optimize!.

If add_bridges is true, JuMP adds a MOI.Bridges.LazyBridgeOptimizer to automatically reformulate the problem into a form supported by the optimizer.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> solver_name(model)
"Ipopt"

julia> import HiGHS

julia> import MultiObjectiveAlgorithms as MOA

julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer); add_bridges = false);
source

ModelMode

JuMP.ModelModeType
ModelMode

An enum to describe the state of the CachingOptimizer inside a JuMP model.

See also: mode.

Values

Possible values are:

  • AUTOMATIC: the moi_backend field holds a CachingOptimizer in AUTOMATIC mode.
  • MANUAL: the moi_backend field holds a CachingOptimizer in MANUAL mode.
  • DIRECT: the moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored. The moi_backend must support add_constraint etc.
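
Example

A small sketch (illustrative only; the printed form assumes the default Julia enum display):

julia> model = Model();

julia> mode(model)
AUTOMATIC::ModelMode = 0
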
source

NLPEvaluator

JuMP.NLPEvaluatorFunction
NLPEvaluator(
    model::Model,
    _differentiation_backend::MOI.Nonlinear.AbstractAutomaticDifferentiation =
        MOI.Nonlinear.SparseReverseMode(),
)

Return an MOI.AbstractNLPEvaluator constructed from model.

Warning

Before using, you must initialize the evaluator using MOI.initialize.

Experimental

These features may change or be removed in any future version of JuMP.

Pass _differentiation_backend to specify the differentiation backend used to compute derivatives.

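Example

A minimal sketch (illustrative only); it uses the legacy @NLobjective macro so that the model contains nonlinear data for the evaluator:

julia> model = Model();

julia> @variable(model, x);

julia> @NLobjective(model, Min, x^2);

julia> evaluator = NLPEvaluator(model);

julia> MOI.initialize(evaluator, [:Grad]);

julia> MOI.eval_objective(evaluator, [2.0])
4.0
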
source

NoOptimizer

JuMP.NoOptimizerType
struct NoOptimizer <: Exception end

An error thrown when no optimizer is set and one is required.

The optimizer can be provided to the Model constructor or by calling set_optimizer.

Example

julia> model = Model();

julia> optimize!(model)
ERROR: NoOptimizer()
Stacktrace:
[...]
source

NonlinearExpr

NonlinearOperator

JuMP.NonlinearOperatorType
NonlinearOperator(func::Function, head::Symbol)

A callable struct (functor) representing a function named head.

When called with AbstractJuMPScalars, the struct returns a GenericNonlinearExpr.

When called with non-JuMP types, the struct returns the evaluation of func(args...).

Unless head is special-cased by the optimizer, the operator must have already been added to the model using add_nonlinear_operator or @operator.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> @operator(model, op_f, 1, f, ∇f, ∇²f)
NonlinearOperator(f, :op_f)

julia> bar = NonlinearOperator(f, :op_f)
NonlinearOperator(f, :op_f)

julia> @objective(model, Min, bar(x))
op_f(x)

julia> bar(2.0)
4.0
source

Nonnegatives

JuMP.NonnegativesType
Nonnegatives()

The JuMP equivalent of the MOI.Nonnegatives set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Nonnegatives())
[x[1], x[2]] ∈ Nonnegatives()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x >= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Nonnegatives()
source

Nonpositives

JuMP.NonpositivesType
Nonpositives()

The JuMP equivalent of the MOI.Nonpositives set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Nonpositives())
[x[1], x[2]] ∈ Nonpositives()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x <= b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Nonpositives()
source

OptimizationSense

OptimizeNotCalled

JuMP.OptimizeNotCalledType
struct OptimizeNotCalled <: Exception end

An error thrown when a result attribute cannot be queried before optimize! is called.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> objective_value(model)
ERROR: OptimizeNotCalled()
Stacktrace:
[...]
source

PSDCone

JuMP.PSDConeType
PSDCone

Positive semidefinite cone object that can be used to constrain a square matrix to be positive semidefinite in the @constraint macro.

If the matrix has type Symmetric, then the column vectorization (the vector obtained by concatenating the columns) of its upper triangular part is constrained to belong to the MOI.PositiveSemidefiniteConeTriangle set; otherwise, its column vectorization is constrained to belong to the MOI.PositiveSemidefiniteConeSquare set.

Example

Non-symmetric case:

julia> model = Model();

julia> @variable(model, x);

julia> a = [x 2x; 2x x];

julia> b = [1 2; 2 4];

julia> cref = @constraint(model, a >= b, PSDCone())
[x - 1    2 x - 2
 2 x - 2  x - 4] ∈ PSDCone()

julia> jump_function(constraint_object(cref))
4-element Vector{AffExpr}:
 x - 1
 2 x - 2
 2 x - 2
 x - 4

julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeSquare(2)

Symmetric case:

julia> using LinearAlgebra # For Symmetric

julia> model = Model();

julia> @variable(model, x);

julia> a = [x 2x; 2x x];

julia> b = [1 2; 2 4];

julia> cref = @constraint(model, Symmetric(a - b) in PSDCone())
[x - 1  2 x - 2
 ⋯      x - 4] ∈ PSDCone()

julia> jump_function(constraint_object(cref))
3-element Vector{AffExpr}:
 x - 1
 2 x - 2
 x - 4

julia> moi_set(constraint_object(cref))
MathOptInterface.PositiveSemidefiniteConeTriangle(2)
source

Parameter

JuMP.ParameterType
Parameter(value)

A short-cut for the MOI.Parameter set.

Example

julia> model = Model();

julia> @variable(model, x in Parameter(2))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Parameter{Float64}(2.0)
source

ParameterRef

JuMP.ParameterRefFunction
ParameterRef(x::GenericVariableRef)

Return a constraint reference to the constraint constraining x to be a parameter.

Errors if one does not exist.

See also is_parameter, set_parameter_value, parameter_value.

Example

julia> model = Model();

julia> @variable(model, p in Parameter(2))
p

julia> ParameterRef(p)
p ∈ MathOptInterface.Parameter{Float64}(2.0)

julia> @variable(model, x);

julia> ParameterRef(x)
ERROR: Variable x is not a parameter.
Stacktrace:
[...]
source

QuadExpr

QuadTermIterator

JuMP.QuadTermIteratorType
QuadTermIterator{GQE<:GenericQuadExpr}

A struct that implements the iterate protocol in order to iterate over tuples of (coefficient, variable, variable) in the GenericQuadExpr.

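Example

A small sketch (illustrative only); quad_terms returns a QuadTermIterator over the (coefficient, variable, variable) triples of the expression:

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> ex = 2.0 * x[1]^2 + x[1] * x[2];

julia> for (coefficient, var_1, var_2) in quad_terms(ex)
           println(coefficient, " * ", var_1, " * ", var_2)
       end
2.0 * x[1] * x[1]
1.0 * x[1] * x[2]
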
source

ReferenceMap

JuMP.ReferenceMapType
GenericReferenceMap{T}

Mapping between the variable and constraint references of a model and those of its copy. The reference in the copied model can be obtained by indexing the map with the corresponding reference in the original model.

source
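
A minimal sketch, assuming the usual workflow in which copy_model returns both the copied model and the reference map:

model = Model()
@variable(model, x)
@constraint(model, c, 2x <= 1)

# reference_map translates references of `model` into references of `new_model`.
new_model, reference_map = copy_model(model)
x_new = reference_map[x]
c_new = reference_map[c]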

ResultStatusCode

JuMP.ResultStatusCodeType
ResultStatusCode

An Enum of possible values for the PrimalStatus and DualStatus attributes.

The values indicate how to interpret the result vector.

Values

Possible values are:

  • NO_SOLUTION: the result vector is empty.
  • FEASIBLE_POINT: the result vector is a feasible point.
  • NEARLY_FEASIBLE_POINT: the result vector is feasible if some constraint tolerances are relaxed.
  • INFEASIBLE_POINT: the result vector is an infeasible point.
  • INFEASIBILITY_CERTIFICATE: the result vector is an infeasibility certificate. If the PrimalStatus is INFEASIBILITY_CERTIFICATE, then the primal result vector is a certificate of dual infeasibility. If the DualStatus is INFEASIBILITY_CERTIFICATE, then the dual result vector is a proof of primal infeasibility.
  • NEARLY_INFEASIBILITY_CERTIFICATE: the result satisfies a relaxed criterion for a certificate of infeasibility.
  • REDUCTION_CERTIFICATE: the result vector is an ill-posed certificate; see this article for details. If the PrimalStatus is REDUCTION_CERTIFICATE, then the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus is REDUCTION_CERTIFICATE, then the dual result vector is a proof that the primal is ill-posed.
  • NEARLY_REDUCTION_CERTIFICATE: the result satisfies a relaxed criterion for an ill-posed certificate.
  • UNKNOWN_RESULT_STATUS: the result vector contains a solution with an unknown interpretation.
  • OTHER_RESULT_STATUS: the result vector contains a solution with an interpretation not covered by one of the statuses defined above.
source

RotatedSecondOrderCone

JuMP.RotatedSecondOrderConeType
RotatedSecondOrderCone

Rotated second order cone object that can be used to constrain the square of the Euclidean norm of a vector x to be less than or equal to $2tu$, where t and u are nonnegative scalars. This is a shortcut for the MOI.RotatedSecondOrderCone.

Example

The following constrains $\|(x-1, x-2)\|^2_2 \le 2tx$ and $t, x \ge 0$:

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, t)
t

julia> @constraint(model, [t, x, x-1, x-2] in RotatedSecondOrderCone())
[t, x, x - 1, x - 2] ∈ MathOptInterface.RotatedSecondOrderCone(4)
source

SOS1

JuMP.SOS1Type
SOS1(weights = Real[])

The SOS1 (Special Ordered Set of Type 1) set constrains a vector x to the set where at most one variable can take a non-zero value, and all other elements are zero.

The weights vector, if specified, induces an ordering of the variables; as such, it should contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to element x[i]. If not provided, the weights vector defaults to weights[i] = i.

This is a shortcut for the MOI.SOS1 set.

Example

julia> model = Model();

julia> @variable(model, x[1:3] in SOS1([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> print(model)
Feasibility
Subject to
 [x[1], x[2], x[3]] ∈ MathOptInterface.SOS1{Float64}([4.1, 3.2, 5.0])
source

SOS2

JuMP.SOS2Type
SOS2(weights = Real[])

The SOS2 (Special Ordered Set of Type 2) set constrains a vector x to the set where at most two variables can take a non-zero value, and all other elements are zero. In addition, the two non-zero values must be consecutive given the ordering of the x vector induced by weights.

The weights vector, if specified, induces an ordering of the variables; as such, it must contain unique values. The weights vector must have the same number of elements as the vector x, and the element weights[i] corresponds to element x[i]. If not provided, the weights vector defaults to weights[i] = i.

This is a shortcut for the MOI.SOS2 set.

Example

julia> model = Model();

julia> @variable(model, x[1:3] in SOS2([4.1, 3.2, 5.0]))
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> print(model)
Feasibility
Subject to
 [x[1], x[2], x[3]] ∈ MathOptInterface.SOS2{Float64}([4.1, 3.2, 5.0])
source

ScalarConstraint

JuMP.ScalarConstraintType
struct ScalarConstraint

The data for a scalar constraint.

See also the documentation on JuMP's representation of constraints for more background.

Fields

  • .func: field contains a JuMP object representing the function
  • .set: field contains the MOI set

Example

A scalar constraint:

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, c, 2x <= 1)
c : 2 x ≤ 1

julia> object = constraint_object(c)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}(2 x, MathOptInterface.LessThan{Float64}(1.0))

julia> typeof(object)
ScalarConstraint{AffExpr, MathOptInterface.LessThan{Float64}}

julia> object.func
2 x

julia> object.set
MathOptInterface.LessThan{Float64}(1.0)
source

ScalarShape

JuMP.ScalarShapeType
ScalarShape()

An AbstractShape that represents scalar constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x[2] <= 1);

julia> shape(constraint_object(c))
ScalarShape()
source

ScalarVariable

SecondOrderCone

JuMP.SecondOrderConeType
SecondOrderCone

Second order cone object that can be used to constrain the Euclidean norm of a vector x to be less than or equal to a nonnegative scalar t. This is a shortcut for the MOI.SecondOrderCone.

Example

The following constrains $\|(x-1, x-2)\|_2 \le t$ and $t \ge 0$:

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, t)
t

julia> @constraint(model, [t, x-1, x-2] in SecondOrderCone())
[t, x - 1, x - 2] ∈ MathOptInterface.SecondOrderCone(3)
source

Semicontinuous

JuMP.SemicontinuousType
Semicontinuous(lower, upper)

A short-cut for the MOI.Semicontinuous set.

This short-cut is useful because it automatically promotes lower and upper to the same type, and converts them into the element type supported by the JuMP model.

Example

julia> model = Model();

julia> @variable(model, x in Semicontinuous(1, 2))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Semicontinuous{Int64}(1, 2)
source

Semiinteger

JuMP.SemiintegerType
Semiinteger(lower, upper)

A short-cut for the MOI.Semiinteger set.

This short-cut is useful because it automatically promotes lower and upper to the same type, and converts them into the element type supported by the JuMP model.

Example

julia> model = Model();

julia> @variable(model, x in Semiinteger(3, 5))
x

julia> print(model)
Feasibility
Subject to
 x ∈ MathOptInterface.Semiinteger{Int64}(3, 5)
source

SensitivityReport

SkewSymmetricMatrixShape

JuMP.SkewSymmetricMatrixShapeType
SkewSymmetricMatrixShape

Shape object for a skew symmetric square matrix of side_dimension rows and columns. The vectorized form contains the entries of the upper-right triangular part of the matrix (without the diagonal) given column by column (or equivalently, the entries of the lower-left triangular part given row by row). The diagonal is zero.

source

SkewSymmetricMatrixSpace

JuMP.SkewSymmetricMatrixSpaceType
SkewSymmetricMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be skew-symmetric.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in SkewSymmetricMatrixSpace())
2×2 Matrix{AffExpr}:
 0        Q[1,2]
 -Q[1,2]  0
source

SkipModelConvertScalarSetWrapper

JuMP.SkipModelConvertScalarSetWrapperType
SkipModelConvertScalarSetWrapper(set::MOI.AbstractScalarSet)

JuMP uses model_convert to automatically promote MOI.AbstractScalarSet sets to the same value_type as the model.

In cases where this is undesirable, wrap the set in SkipModelConvertScalarSetWrapper to pass the set unchanged to the solver.

Warning

This struct is intended for use internally by JuMP extensions. You should not need to use it in regular JuMP code.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @constraint(model, x in MOI.EqualTo(1 // 2))
x = 0.5

julia> @constraint(model, x in SkipModelConvertScalarSetWrapper(MOI.EqualTo(1 // 2)))
x = 1//2
source

SquareMatrixShape

JuMP.SquareMatrixShapeType
SquareMatrixShape

Shape object for a square matrix of side_dimension rows and columns. The vectorized form contains the entries of the matrix given column by column (or equivalently, the entries of its transpose given row by row).

source

SymmetricMatrixAdjointShape

SymmetricMatrixShape

JuMP.SymmetricMatrixShapeType
SymmetricMatrixShape(
    side_dimension::Int;
    needs_adjoint_dual::Bool = false,
)

The shape object for a symmetric square matrix of side_dimension rows and columns.

The vectorized form contains the entries of the upper-right triangular part of the matrix given column by column (or equivalently, the entries of the lower-left triangular part given row by row).

needs_adjoint_dual

By default, the dual_shape of SymmetricMatrixShape is also SymmetricMatrixShape. This is true for cases such as a LinearAlgebra.Symmetric matrix in PSDCone.

However, JuMP also supports LinearAlgebra.Symmetric matrix in Zeros, which is interpreted as an element-wise equality constraint. By exploiting symmetry, we pass only the upper triangle of the equality constraints. This works for the primal, but it leads to a factor of 2 difference in the off-diagonal dual elements. (The dual value of the (i, j) element in the triangle formulation should be divided by 2 when spread across the (i, j) and (j, i) elements in the square matrix formulation.) If the constraint has this dual inconsistency, set needs_adjoint_dual = true.

source

SymmetricMatrixSpace

JuMP.SymmetricMatrixSpaceType
SymmetricMatrixSpace()

Use in the @variable macro to constrain a matrix of variables to be symmetric.

Example

julia> model = Model();

julia> @variable(model, Q[1:2, 1:2] in SymmetricMatrixSpace())
2×2 LinearAlgebra.Symmetric{VariableRef, Matrix{VariableRef}}:
 Q[1,1]  Q[1,2]
 Q[1,2]  Q[2,2]
source

TerminationStatusCode

JuMP.TerminationStatusCodeType
TerminationStatusCode

An Enum of possible values for the TerminationStatus attribute. This attribute is meant to explain the reason why the optimizer stopped executing in the most recent call to optimize!.

Values

Possible values are:

  • OPTIMIZE_NOT_CALLED: The algorithm has not started.
  • OPTIMAL: The algorithm found a globally optimal solution.
  • INFEASIBLE: The algorithm concluded that no feasible solution exists.
  • DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem. If, additionally, a feasible (primal) solution is known to exist, this status typically implies that the problem is unbounded, with some technical exceptions.
  • LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees.
  • LOCALLY_INFEASIBLE: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, without guarantees that no feasible solution exists.
  • INFEASIBLE_OR_UNBOUNDED: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve.
  • ALMOST_OPTIMAL: The algorithm found a globally optimal solution to relaxed tolerances.
  • ALMOST_INFEASIBLE: The algorithm concluded that no feasible solution exists within relaxed tolerances.
  • ALMOST_DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem within relaxed tolerances.
  • ALMOST_LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, or could not find directions for improvement within relaxed tolerances.
  • ITERATION_LIMIT: An iterative algorithm stopped after conducting the maximum number of iterations.
  • TIME_LIMIT: The algorithm stopped after a user-specified computation time.
  • NODE_LIMIT: A branch-and-bound algorithm stopped because it explored a maximum number of nodes in the branch-and-bound tree.
  • SOLUTION_LIMIT: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters.
  • MEMORY_LIMIT: The algorithm stopped because it ran out of memory.
  • OBJECTIVE_LIMIT: The algorithm stopped because it found a solution better than a minimum limit set by the user.
  • NORM_LIMIT: The algorithm stopped because the norm of an iterate became too large.
  • OTHER_LIMIT: The algorithm stopped due to a limit not covered by one of the _LIMIT_ statuses above.
  • SLOW_PROGRESS: The algorithm stopped because it was unable to continue making progress towards the solution.
  • NUMERICAL_ERROR: The algorithm stopped because it encountered unrecoverable numerical error.
  • INVALID_MODEL: The algorithm stopped because the model is invalid.
  • INVALID_OPTION: The algorithm stopped because it was provided an invalid option.
  • INTERRUPTED: The algorithm stopped because of an interrupt signal.
  • OTHER_ERROR: The algorithm stopped because of an error not covered by one of the statuses defined above.
source

UnorderedPair

JuMP.UnorderedPairType
UnorderedPair(a::T, b::T)

A wrapper type used by GenericQuadExpr with fields .a and .b.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> expr = 2.0 * x[1] * x[2]
2 x[1]*x[2]

julia> expr.terms
OrderedCollections.OrderedDict{UnorderedPair{VariableRef}, Float64} with 1 entry:
  UnorderedPair{VariableRef}(x[1], x[2]) => 2.0
source

UpperBoundRef

VariableConstrainedOnCreation

JuMP.VariableConstrainedOnCreationType
VariableConstrainedOnCreation <: AbstractVariable

Variable scalar_variable constrained to belong to set.

Adding this variable can be understood as doing:

function JuMP.add_variable(
    model::GenericModel,
    variable::VariableConstrainedOnCreation,
    name,
)
    var_ref = add_variable(model, variable.scalar_variable, name)
    add_constraint(model, VectorConstraint(var_ref, variable.set))
    return var_ref
end

but adds the variable with MOI.add_constrained_variable(model, variable.set) instead.

source

VariableInfo

JuMP.VariableInfoType
VariableInfo{S,T,U,V}

A struct used by JuMP internally when creating variables. This may also be used by JuMP extensions to create new types of variables.

See also: ScalarVariable.

source
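
A minimal sketch of how an extension might build a variable from a VariableInfo; the positional argument order shown here (has_lb, lower_bound, has_ub, upper_bound, has_fix, fixed_value, has_start, start, binary, integer) is an assumption to check against the JuMP source:

model = Model()
# No lower bound, an upper bound of 1.0, no fixed value, no start value, not
# binary, not integer. NaN is a placeholder where the corresponding flag is false.
info = VariableInfo(false, NaN, true, 1.0, false, NaN, false, NaN, false, false)
x = add_variable(model, ScalarVariable(info), "x")
has_upper_bound(x)   # true
upper_bound(x)       # 1.0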

VariableNotOwned

JuMP.VariableNotOwnedType
struct VariableNotOwned{V<:AbstractVariableRef} <: Exception
    variable::V
end

The variable variable was used in a model different to owner_model(variable).

source
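
A minimal sketch of a situation that raises this exception; the assumption here is that the error is thrown as soon as the foreign variable is used in the second model:

model_1 = Model()
@variable(model_1, x)
model_2 = Model()
# x belongs to model_1, so using it in model_2 throws VariableNotOwned.
@objective(model_2, Min, x)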

VariableRef

JuMP.VariableRefType
GenericVariableRef{T} <: AbstractVariableRef

Holds a reference to the model and the corresponding MOI.VariableIndex.

source
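
A minimal sketch showing the two pieces of information a variable reference carries:

model = Model()
@variable(model, x)
owner_model(x) === model   # true: the reference knows which model owns it
index(x)                   # the underlying MOI.VariableIndex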

VariablesConstrainedOnCreation

JuMP.VariablesConstrainedOnCreationType
VariablesConstrainedOnCreation <: AbstractVariable

Vector of variables scalar_variables constrained to belong to set. Adding this variable can be thought of as doing:

function JuMP.add_variable(
    model::GenericModel,
    variable::VariablesConstrainedOnCreation,
    names,
)
    v_names = vectorize(names, variable.shape)
    var_refs = add_variable.(model, variable.scalar_variables, v_names)
    add_constraint(model, VectorConstraint(var_refs, variable.set))
    return reshape_vector(var_refs, variable.shape)
end

but adds the variables with MOI.add_constrained_variables(model, variable.set) instead. See the MOI documentation for the difference between adding the variables with MOI.add_constrained_variables and adding them with MOI.add_variables and adding the constraint separately.

source

VectorConstraint

JuMP.VectorConstraintType
struct VectorConstraint

The data for a vector constraint.

See also the documentation on JuMP's representation of constraints.

Fields

  • func: field contains a JuMP object representing the function
  • set: field contains the MOI set.
  • shape: field contains an AbstractShape matching the form in which the constraint was constructed (for example, by using matrices or flat vectors).

Example

julia> model = Model();

julia> @variable(model, x[1:3]);

julia> @constraint(model, c, x in SecondOrderCone())
c : [x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)

julia> object = constraint_object(c)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}(VariableRef[x[1], x[2], x[3]], MathOptInterface.SecondOrderCone(3), VectorShape())

julia> typeof(object)
VectorConstraint{VariableRef, MathOptInterface.SecondOrderCone, VectorShape}

julia> object.func
3-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]

julia> object.set
MathOptInterface.SecondOrderCone(3)

julia> object.shape
VectorShape()
source

VectorShape

JuMP.VectorShapeType
VectorShape()

An AbstractShape that represents vector-valued constraints.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> c = @constraint(model, x in SOS1());

julia> shape(constraint_object(c))
VectorShape()
source

Zeros

JuMP.ZerosType
Zeros()

The JuMP equivalent of the MOI.Zeros set, in which the dimension is inferred from the corresponding function.

Example

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> @constraint(model, x in Zeros())
[x[1], x[2]] ∈ Zeros()

julia> A = [1 2; 3 4];

julia> b = [5, 6];

julia> @constraint(model, A * x == b)
[x[1] + 2 x[2] - 5, 3 x[1] + 4 x[2] - 6] ∈ Zeros()
source

ALMOST_DUAL_INFEASIBLE

ALMOST_INFEASIBLE

ALMOST_LOCALLY_SOLVED

JuMP.ALMOST_LOCALLY_SOLVEDConstant
ALMOST_LOCALLY_SOLVED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

ALMOST_LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, or could not find directions for improvement within relaxed tolerances.

source

ALMOST_OPTIMAL

AUTOMATIC

DIRECT

JuMP.DIRECTConstant

moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored. The moi_backend must support add_constraint etc.

source
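
A minimal sketch, assuming an Ipopt installation, of creating and querying a model in DIRECT mode:

import Ipopt
model = direct_model(Ipopt.Optimizer())
mode(model)   # DIRECT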

DUAL_INFEASIBLE

JuMP.DUAL_INFEASIBLEConstant
DUAL_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

DUAL_INFEASIBLE: The algorithm concluded that no dual bound exists for the problem. If, additionally, a feasible (primal) solution is known to exist, this status typically implies that the problem is unbounded, with some technical exceptions.

source

FEASIBILITY_SENSE

FEASIBLE_POINT

INFEASIBILITY_CERTIFICATE

JuMP.INFEASIBILITY_CERTIFICATEConstant
INFEASIBILITY_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

INFEASIBILITY_CERTIFICATE: the result vector is an infeasibility certificate. If the PrimalStatus is INFEASIBILITY_CERTIFICATE, then the primal result vector is a certificate of dual infeasibility. If the DualStatus is INFEASIBILITY_CERTIFICATE, then the dual result vector is a proof of primal infeasibility.

source

INFEASIBLE

INFEASIBLE_OR_UNBOUNDED

JuMP.INFEASIBLE_OR_UNBOUNDEDConstant
INFEASIBLE_OR_UNBOUNDED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

INFEASIBLE_OR_UNBOUNDED: The algorithm stopped because it decided that the problem is infeasible or unbounded; this occasionally happens during MIP presolve.

source

INFEASIBLE_POINT

INTERRUPTED

INVALID_MODEL

INVALID_OPTION

ITERATION_LIMIT

LOCALLY_INFEASIBLE

JuMP.LOCALLY_INFEASIBLEConstant
LOCALLY_INFEASIBLE::TerminationStatusCode

An instance of the TerminationStatusCode enum.

LOCALLY_INFEASIBLE: The algorithm converged to an infeasible point or otherwise completed its search without finding a feasible solution, without guarantees that no feasible solution exists.

source

LOCALLY_SOLVED

JuMP.LOCALLY_SOLVEDConstant
LOCALLY_SOLVED::TerminationStatusCode

An instance of the TerminationStatusCode enum.

LOCALLY_SOLVED: The algorithm converged to a stationary point, local optimal solution, could not find directions for improvement, or otherwise completed its search without global guarantees.

source

MANUAL

JuMP.MANUALConstant

moi_backend field holds a CachingOptimizer in MANUAL mode.

source

MAX_SENSE

MEMORY_LIMIT

MIN_SENSE

NEARLY_FEASIBLE_POINT

NEARLY_INFEASIBILITY_CERTIFICATE

NEARLY_REDUCTION_CERTIFICATE

NODE_LIMIT

JuMP.NODE_LIMITConstant
NODE_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

NODE_LIMIT: A branch-and-bound algorithm stopped because it explored a maximum number of nodes in the branch-and-bound tree.

source

NORM_LIMIT

NO_SOLUTION

NUMERICAL_ERROR

OBJECTIVE_LIMIT

OPTIMAL

OPTIMIZE_NOT_CALLED

OTHER_ERROR

JuMP.OTHER_ERRORConstant
OTHER_ERROR::TerminationStatusCode

An instance of the TerminationStatusCode enum.

OTHER_ERROR: The algorithm stopped because of an error not covered by one of the statuses defined above.

source

OTHER_LIMIT

OTHER_RESULT_STATUS

JuMP.OTHER_RESULT_STATUSConstant
OTHER_RESULT_STATUS::ResultStatusCode

An instance of the ResultStatusCode enum.

OTHER_RESULT_STATUS: the result vector contains a solution with an interpretation not covered by one of the statuses defined above.

source

REDUCTION_CERTIFICATE

JuMP.REDUCTION_CERTIFICATEConstant
REDUCTION_CERTIFICATE::ResultStatusCode

An instance of the ResultStatusCode enum.

REDUCTION_CERTIFICATE: the result vector is an ill-posed certificate; see this article for details. If the PrimalStatus is REDUCTION_CERTIFICATE, then the primal result vector is a proof that the dual problem is ill-posed. If the DualStatus is REDUCTION_CERTIFICATE, then the dual result vector is a proof that the primal is ill-posed.

source

SLOW_PROGRESS

JuMP.SLOW_PROGRESSConstant
SLOW_PROGRESS::TerminationStatusCode

An instance of the TerminationStatusCode enum.

SLOW_PROGRESS: The algorithm stopped because it was unable to continue making progress towards the solution.

source

SOLUTION_LIMIT

JuMP.SOLUTION_LIMITConstant
SOLUTION_LIMIT::TerminationStatusCode

An instance of the TerminationStatusCode enum.

SOLUTION_LIMIT: The algorithm stopped because it found the required number of solutions. This is often used in MIPs to get the solver to return the first feasible solution it encounters.

source

TIME_LIMIT

UNKNOWN_RESULT_STATUS

op_and

JuMP.op_andConstant
op_and(x, y)

A function that falls back to x & y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_and(true, false)
false

julia> op_and(true, x)
true && x
source

op_equal_to

JuMP.op_equal_toConstant
op_equal_to(x, y)

A function that falls back to x == y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_equal_to(2, 2)
true

julia> op_equal_to(x, 2)
x == 2
source

op_greater_than_or_equal_to

JuMP.op_greater_than_or_equal_toConstant
op_greater_than_or_equal_to(x, y)

A function that falls back to x >= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_greater_than_or_equal_to(2, 2)
true

julia> op_greater_than_or_equal_to(x, 2)
x >= 2
source

op_less_than_or_equal_to

JuMP.op_less_than_or_equal_toConstant
op_less_than_or_equal_to(x, y)

A function that falls back to x <= y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_less_than_or_equal_to(2, 2)
true

julia> op_less_than_or_equal_to(x, 2)
x <= 2
source

op_or

JuMP.op_orConstant
op_or(x, y)

A function that falls back to x | y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_or(true, false)
true

julia> op_or(true, x)
true || x
source

op_strictly_greater_than

JuMP.op_strictly_greater_thanConstant
op_strictly_greater_than(x, y)

A function that falls back to x > y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_strictly_greater_than(1, 2)
false

julia> op_strictly_greater_than(x, 2)
x > 2
source

op_strictly_less_than

JuMP.op_strictly_less_thanConstant
op_strictly_less_than(x, y)

A function that falls back to x < y, but when called with JuMP variables or expressions, returns a GenericNonlinearExpr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> op_strictly_less_than(1, 2)
true

julia> op_strictly_less_than(x, 2)
x < 2
source

Base.empty!(::GenericModel)

Base.empty!Method
empty!(model::GenericModel)::GenericModel

Empty the model, that is, remove all variables, constraints and model attributes but not optimizer attributes. Always return the argument.

Note: removes extension data.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> isempty(model)
false

julia> empty!(model)
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> print(model)
Feasibility
Subject to

julia> isempty(model)
true
source

Base.isempty(::GenericModel)

Base.isemptyMethod
isempty(model::GenericModel)

Verifies whether the model is empty, that is, whether the MOI backend is empty and whether the model is in the same state as at its creation, apart from optimizer attributes.

Example

julia> model = Model();

julia> isempty(model)
true

julia> @variable(model, x[1:2]);

julia> isempty(model)
false
source

Base.copy(::AbstractModel)

Base.copyMethod
copy(model::AbstractModel)

Return a copy of the model model. It is similar to copy_model except that it does not return the mapping between the references of model and its copy.

Note

Model copy is not supported in DIRECT mode, that is, when a model is constructed using the direct_model constructor instead of the Model constructor. Moreover, regardless of whether an optimizer was provided at model construction, the new model will have no optimizer attached; that is, an optimizer will have to be provided to the new model in the optimize! call.

Example

In the following example, a model model is constructed with a variable x and a constraint cref. It is then copied into a model new_model with the new references assigned to x_new and cref_new.

julia> model = Model();

julia> @variable(model, x)
x

julia> @constraint(model, cref, x == 2)
cref : x = 2

julia> new_model = copy(model);

julia> x_new = new_model[:x]
x

julia> cref_new = new_model[:cref]
cref : x = 2
source

Base.write(::IO, ::GenericModel; ::MOI.FileFormats.FileFormat)

Base.writeMethod
Base.write(
    io::IO,
    model::GenericModel;
    format::MOI.FileFormats.FileFormat = MOI.FileFormats.FORMAT_MOF,
    kwargs...,
)

Write the JuMP model model to io in the format format.

Other kwargs are passed to the Model constructor of the chosen format.

source
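
A minimal sketch, writing a small model to an in-memory buffer in the LP file format:

model = Model()
@variable(model, x >= 0)
@objective(model, Min, 2x)
io = IOBuffer()
write(io, model; format = MOI.FileFormats.FORMAT_LP)
print(String(take!(io)))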

MOI.Utilities.reset_optimizer(::GenericModel)

MOI.Utilities.drop_optimizer(::GenericModel)

MOI.Utilities.attach_optimizer(::GenericModel)

@NLconstraint

JuMP.@NLconstraintMacro
@NLconstraint(model::GenericModel, expr)

Add a constraint described by the nonlinear expression expr. See also @constraint.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLconstraint with @constraint.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @NLconstraint(model, sin(x) <= 1)
sin(x) - 1.0 ≤ 0

julia> @NLconstraint(model, [i = 1:3], sin(i * x) <= 1 / i)
3-element Vector{NonlinearConstraintRef{ScalarShape}}:
 (sin(1.0 * x) - 1.0 / 1.0) - 0.0 ≤ 0
 (sin(2.0 * x) - 1.0 / 2.0) - 0.0 ≤ 0
 (sin(3.0 * x) - 1.0 / 3.0) - 0.0 ≤ 0
source

@NLconstraints

JuMP.@NLconstraintsMacro
@NLconstraints(model, args...)

Adds multiple nonlinear constraints to model at once, in the same fashion as the @NLconstraint macro.

The model must be the first argument, and multiple constraints can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the constraints that were defined.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLconstraints with @constraints.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, t);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @NLconstraints(model, begin
           t >= sqrt(x^2 + y^2)
           [i = 1:2], z[i] <= log(a[i])
       end)
((t - sqrt(x ^ 2.0 + y ^ 2.0)) - 0.0 ≥ 0, NonlinearConstraintRef{ScalarShape}[(z[1] - log(4.0)) - 0.0 ≤ 0, (z[2] - log(5.0)) - 0.0 ≤ 0])
source

@NLexpression

JuMP.@NLexpressionMacro
@NLexpression(args...)

Efficiently build a nonlinear expression which can then be inserted in other nonlinear constraints and the objective. See also @expression.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLexpression with @expression.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @variable(model, y)
y

julia> @NLexpression(model, my_expr, sin(x)^2 + cos(x^2))
subexpression[1]: sin(x) ^ 2.0 + cos(x ^ 2.0)

julia> @NLconstraint(model, my_expr + y >= 5)
(subexpression[1] + y) - 5.0 ≥ 0

julia> @NLobjective(model, Min, my_expr)

Indexing over sets and anonymous expressions are also supported:

julia> @NLexpression(model, my_expr_1[i=1:3], sin(i * x))
3-element Vector{NonlinearExpression}:
 subexpression[2]: sin(1.0 * x)
 subexpression[3]: sin(2.0 * x)
 subexpression[4]: sin(3.0 * x)

julia> my_expr_2 = @NLexpression(model, log(1 + sum(exp(my_expr_1[i]) for i in 1:2)))
subexpression[5]: log(1.0 + (exp(subexpression[2]) + exp(subexpression[3])))
source

@NLexpressions

JuMP.@NLexpressionsMacro
@NLexpressions(model, args...)

Adds multiple nonlinear expressions to model at once, in the same fashion as the @NLexpression macro.

The model must be the first argument, and multiple expressions can be added on multiple lines wrapped in a begin ... end block.

The macro returns a tuple containing the expressions that were defined.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLexpressions with @expressions.

Example

julia> model = Model();

julia> @variable(model, x);

julia> @variable(model, y);

julia> @variable(model, z[1:2]);

julia> a = [4, 5];

julia> @NLexpressions(model, begin
           my_expr, sqrt(x^2 + y^2)
           my_expr_1[i = 1:2], log(a[i]) - z[i]
       end)
(subexpression[1]: sqrt(x ^ 2.0 + y ^ 2.0), NonlinearExpression[subexpression[2]: log(4.0) - z[1], subexpression[3]: log(5.0) - z[2]])
source

@NLobjective

JuMP.@NLobjectiveMacro
@NLobjective(model, sense, expression)

Add a nonlinear objective to model with optimization sense sense. sense must be Max or Min.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace @NLobjective with @objective.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> @NLobjective(model, Max, 2x + 1 + sin(x))

julia> print(model)
Max 2.0 * x + 1.0 + sin(x)
Subject to
source

@NLparameter

JuMP.@NLparameterMacro
@NLparameter(model, param == value)

Create and return a nonlinear parameter param attached to the model model with initial value set to value. Nonlinear parameters may be used only in nonlinear expressions.

Example

julia> model = Model();

julia> @NLparameter(model, x == 10)
x == 10.0

julia> value(x)
10.0

@NLparameter(model, value = param_value)

Create and return an anonymous nonlinear parameter param attached to the model model with initial value set to param_value. Nonlinear parameters may be used only in nonlinear expressions.

Example

julia> model = Model();

julia> x = @NLparameter(model, value = 10)
parameter[1] == 10.0

julia> value(x)
10.0

@NLparameter(model, param_collection[...] == value_expr)

Create and return a collection of nonlinear parameters param_collection attached to the model model with initial value set to value_expr (may depend on index sets). Uses the same syntax for specifying index sets as @variable.

Example

julia> model = Model();

julia> @NLparameter(model, y[i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
 parameter[1] == 2.0
 parameter[2] == 4.0
 parameter[3] == 6.0

julia> value(y[2])
4.0

@NLparameter(model, [...] == value_expr)

Create and return an anonymous collection of nonlinear parameters attached to the model model with initial value set to value_expr (may depend on index sets). Uses the same syntax for specifying index sets as @variable.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace a call like @NLparameter(model, p == value) with @variable(model, p in Parameter(value)).

Example

julia> model = Model();

julia> y = @NLparameter(model, [i = 1:3] == 2 * i)
3-element Vector{NonlinearParameter}:
 parameter[1] == 2.0
 parameter[2] == 4.0
 parameter[3] == 6.0

julia> value(y[2])
4.0
source

@NLparameters

JuMP.@NLparametersMacro
@NLparameters(model, args...)

Create and return multiple nonlinear parameters attached to model model, in the same fashion as the @NLparameter macro.

The model must be the first argument, and multiple parameters can be added on multiple lines wrapped in a begin ... end block. Distinct parameters need to be placed on separate lines as in the following example.

The macro returns a tuple containing the parameters that were defined.

Compat

This macro is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling. In most cases, you can replace a call like

@NLparameters(model, begin
    p == value
end)

with

@variables(model, begin
    p in Parameter(value)
end)

Example

julia> model = Model();

julia> @NLparameters(model, begin
           x == 10
           b == 156
       end);

julia> value(x)
10.0
source

add_nonlinear_constraint

JuMP.add_nonlinear_constraintFunction
add_nonlinear_constraint(model::Model, expr::Expr)

Add a nonlinear constraint described by the Julia expression expr to model.

This function is most useful if the expression expr is generated programmatically, and you cannot use @NLconstraint.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> add_nonlinear_constraint(model, :($(x) + $(x)^2 <= 1))
(x + x ^ 2.0) - 1.0 ≤ 0
source

add_nonlinear_expression

JuMP.add_nonlinear_expressionFunction
add_nonlinear_expression(model::Model, expr::Expr)

Add a nonlinear expression expr to model.

This function is most useful if the expression expr is generated programmatically, and you cannot use @NLexpression.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.

Example

julia> model = Model();

julia> @variable(model, x);

julia> add_nonlinear_expression(model, :($(x) + $(x)^2))
subexpression[1]: x + x ^ 2.0
source

add_nonlinear_parameter

JuMP.add_nonlinear_parameterFunction
add_nonlinear_parameter(model::Model, value::Real)

Add an anonymous parameter to the model.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

source
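
A minimal sketch; the returned object is a NonlinearParameter whose value can be read with value and updated with set_value:

model = Model()
p = add_nonlinear_parameter(model, 1.0)
value(p)          # 1.0
set_value(p, 2.0)
value(p)          # 2.0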

all_nonlinear_constraints

get_optimizer_attribute

JuMP.get_optimizer_attributeFunction
get_optimizer_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
)

Return the value associated with the solver-specific attribute attr.

If attr is an AbstractString, this is equivalent to get_optimizer_attribute(model, MOI.RawOptimizerAttribute(attr)).

Compat

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using get_attribute instead.

See also: set_optimizer_attribute, set_optimizer_attributes.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> get_optimizer_attribute(model, MOI.Silent())
false
source

nonlinear_constraint_string

JuMP.nonlinear_constraint_stringFunction
nonlinear_constraint_string(
    model::GenericModel,
    mode::MIME,
    c::_NonlinearConstraint,
)

Return a string representation of the nonlinear constraint c belonging to model, given the mode.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

source

nonlinear_dual_start_value

nonlinear_expr_string

JuMP.nonlinear_expr_stringFunction
nonlinear_expr_string(
    model::GenericModel,
    mode::MIME,
    c::MOI.Nonlinear.Expression,
)

Return a string representation of the nonlinear expression c belonging to model, given the mode.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

source

nonlinear_model

JuMP.nonlinear_modelFunction
nonlinear_model(
    model::GenericModel;
    force::Bool = false,
)::Union{MOI.Nonlinear.Model,Nothing}

If model has nonlinear components, return a MOI.Nonlinear.Model, otherwise return nothing.

If force, always return a MOI.Nonlinear.Model, and if one does not exist for the model, create an empty one.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

source
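
A minimal sketch of the two return cases:

model = Model()
@variable(model, x)
nonlinear_model(model)                   # nothing: no legacy nonlinear components yet
@NLconstraint(model, sin(x) <= 1)
nonlinear_model(model)                   # a MOI.Nonlinear.Model holding the constraint
nonlinear_model(Model(); force = true)   # force = true always returns a (possibly empty) MOI.Nonlinear.Model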

num_nonlinear_constraints

register

JuMP.registerFunction
register(
    model::Model,
    op::Symbol,
    dimension::Integer,
    f::Function;
    autodiff::Bool = false,
)

Register the user-defined function f that takes dimension arguments in model as the symbol op.

The function f must support all subtypes of Real as arguments. Do not assume that the inputs are Float64.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • For this method, you must explicitly set autodiff = true, because no user-provided gradient function ∇f is given.
  • Second-derivative information is only computed if dimension == 1.
  • op does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)

julia> register(model, :foo, 1, f; autodiff = true)

julia> @NLobjective(model, Min, foo(x))

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)

julia> register(model, :g, 2, g; autodiff = true)

julia> @NLobjective(model, Min, g(x[1], x[2]))
source

register(
    model::Model,
    s::Symbol,
    dimension::Integer,
    f::Function,
    ∇f::Function;
    autodiff::Bool = false,
)

Register the user-defined function f that takes dimension arguments in model as the symbol s. In addition, provide a gradient function ∇f.

The functions f and ∇f must support all subtypes of Real as arguments. Do not assume that the inputs are Float64.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • If the function f is univariate (that is, dimension == 1), ∇f must return a number which represents the first-order derivative of the function f.
  • If the function f is multi-variate, ∇f must have a signature matching ∇f(g::AbstractVector{T}, args::T...) where {T<:Real}, where the first argument is a vector g that is modified in-place with the gradient.
  • If autodiff = true and dimension == 1, use automatic differentiation to compute the second-order derivative information. If autodiff = false, only first-order derivative information will be used.
  • s does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::T) where {T<:Real} = x^2
f (generic function with 1 method)

julia> ∇f(x::T) where {T<:Real} = 2 * x
∇f (generic function with 1 method)

julia> register(model, :foo, 1, f, ∇f; autodiff = true)

julia> @NLobjective(model, Min, foo(x))

julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> g(x::T, y::T) where {T<:Real} = x * y
g (generic function with 1 method)

julia> function ∇g(g::AbstractVector{T}, x::T, y::T) where {T<:Real}
           g[1] = y
           g[2] = x
           return
       end
∇g (generic function with 1 method)

julia> register(model, :g, 2, g, ∇g)

julia> @NLobjective(model, Min, g(x[1], x[2]))
source

register(
    model::Model,
    s::Symbol,
    dimension::Integer,
    f::Function,
    ∇f::Function,
    ∇²f::Function,
)

Register the user-defined function f that takes dimension arguments in model as the symbol s. In addition, provide a gradient function ∇f and a hessian function ∇²f.

∇f and ∇²f must return numbers corresponding to the first- and second-order derivatives of the function f respectively.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • Because automatic differentiation is not used, you can assume the inputs are all Float64.
  • This method will throw an error if dimension > 1.
  • s does not have to be the same symbol as f, but it is generally more readable if it is.

Example

julia> model = Model();

julia> @variable(model, x)
x

julia> f(x::Float64) = x^2
f (generic function with 1 method)

julia> ∇f(x::Float64) = 2 * x
∇f (generic function with 1 method)

julia> ∇²f(x::Float64) = 2.0
∇²f (generic function with 1 method)

julia> register(model, :foo, 1, f, ∇f, ∇²f)

julia> @NLobjective(model, Min, foo(x))
source

set_nonlinear_dual_start_value

JuMP.set_nonlinear_dual_start_valueFunction
set_nonlinear_dual_start_value(
    model::Model,
    start::Union{Nothing,Vector{Float64}},
)

Set the value of the MOI attribute MOI.NLPBlockDualStart.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

The start vector corresponds to the Lagrangian duals of the nonlinear constraints, in the order given by all_nonlinear_constraints. That is, you must pass a single start vector corresponding to all of the nonlinear constraints in a single function call; you cannot set the dual start value of nonlinear constraints one-by-one. The example below demonstrates how to use all_nonlinear_constraints to create a mapping between the nonlinear constraint references and the start vector.

Pass nothing to unset a previous start.

Example

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> nl1 = @NLconstraint(model, x[1] <= sqrt(x[2]));

julia> nl2 = @NLconstraint(model, x[1] >= exp(x[2]));

julia> start = Dict(nl1 => -1.0, nl2 => 1.0);

julia> start_vector = [start[con] for con in all_nonlinear_constraints(model)]
2-element Vector{Float64}:
 -1.0
  1.0

julia> set_nonlinear_dual_start_value(model, start_vector)

julia> nonlinear_dual_start_value(model)
2-element Vector{Float64}:
 -1.0
  1.0
source

set_nonlinear_objective

JuMP.set_nonlinear_objectiveFunction
set_nonlinear_objective(
    model::Model,
    sense::MOI.OptimizationSense,
    expr::Expr,
)

Set the nonlinear objective of model to the expression expr, with the optimization sense sense.

This function is most useful if the expression expr is generated programmatically, and you cannot use @NLobjective.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Notes

  • You must interpolate the variables directly into the expression expr.
  • You must use MIN_SENSE or MAX_SENSE instead of Min and Max.

Example

julia> model = Model();

julia> @variable(model, x);

julia> set_nonlinear_objective(model, MIN_SENSE, :($(x) + $(x)^2))
source

set_normalized_coefficients

JuMP.set_normalized_coefficientsFunction
set_normalized_coefficients(
    constraint::ConstraintRef{<:AbstractModel,<:MOI.ConstraintIndex{F}},
    variable::AbstractVariableRef,
    new_coefficients::Vector{Tuple{Int64,T}},
) where {T,F<:Union{MOI.VectorAffineFunction{T},MOI.VectorQuadraticFunction{T}}}

A deprecated method that now redirects to set_normalized_coefficient.

source

set_optimizer_attribute

JuMP.set_optimizer_attributeFunction
set_optimizer_attribute(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    attr::Union{AbstractString,MOI.AbstractOptimizerAttribute},
    value,
)

Set the solver-specific attribute attr in model to value.

If attr is an AbstractString, this is equivalent to set_optimizer_attribute(model, MOI.RawOptimizerAttribute(attr), value).

Compat

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using set_attribute instead.

See also: set_optimizer_attributes, get_optimizer_attribute.

Example

julia> model = Model();

julia> set_optimizer_attribute(model, MOI.Silent(), true)
source

set_optimizer_attributes

JuMP.set_optimizer_attributesFunction
set_optimizer_attributes(
    model::Union{GenericModel,MOI.OptimizerWithAttributes},
    pairs::Pair...,
)

Given a list of attribute => value pairs, calls set_optimizer_attribute(model, attribute, value) for each pair.

Compat

This method will remain in all v1.X releases of JuMP, but it may be removed in a future v2.0 release. We recommend using set_attributes instead.

See also: set_optimizer_attribute, get_optimizer_attribute.

Example

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_optimizer_attributes(model, "tol" => 1e-4, "max_iter" => 100)

is equivalent to:

julia> import Ipopt

julia> model = Model(Ipopt.Optimizer);

julia> set_optimizer_attribute(model, "tol", 1e-4)

julia> set_optimizer_attribute(model, "max_iter", 100)
source

set_value

JuMP.set_valueFunction
set_value(p::NonlinearParameter, v::Number)

Store the value v in the nonlinear parameter p.

Compat

This function is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

Example

julia> model = Model();

julia> @NLparameter(model, p == 0)
p == 0.0

julia> set_value(p, 5)
5

julia> value(p)
5.0
source

NonlinearConstraintIndex

NonlinearConstraintRef

NonlinearExpression

NonlinearParameter

JuMP.NonlinearParameterType
NonlinearParameter <: AbstractJuMPScalar

A struct to represent a nonlinear parameter.

Create a parameter using @NLparameter.

Compat

This type is part of the legacy nonlinear interface. Consider using the new nonlinear interface documented in Nonlinear Modeling.

source