Variables
What is a JuMP variable?
The term variable in mathematical optimization has many meanings. Here, we distinguish between the following three types of variables:
- optimization variables, which are the mathematical $x$ in the problem $\max\{f_0(x) | f_i(x) \in S_i\}$.
- Julia variables, which are bindings between a name and a value, for example x = 1. (See the Julia documentation for more.)
- JuMP variables, which are instances of the VariableRef struct defined by JuMP. Each contains a reference to an optimization variable in a model. (Extra for experts: the VariableRef struct is a thin wrapper around a MOI.VariableIndex, and it also contains a reference to the JuMP model.)
To illustrate these three types of variables, consider the following JuMP code (the full syntax is explained below):
julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, x[1:2])
2-element Array{VariableRef,1}:
x[1]
x[2]
This code does three things:
- It adds two optimization variables to model.
- It creates two JuMP variables that act as references to those optimization variables.
- It binds those JuMP variables as a vector with two elements to the Julia variable x.
To reduce confusion, we will attempt, where possible, to always refer to variables with their corresponding prefix.
JuMP variables can have attributes, such as names or an initial primal start value. We illustrate the name attribute in the following example:
julia> @variable(model, y, base_name="decision variable")
decision variable
This code does four things:
- It adds one optimization variable to model.
- It creates one JuMP variable that acts as a reference to that optimization variable.
- It binds the JuMP variable to the Julia variable y.
- It tells JuMP that the name attribute of this JuMP variable is "decision variable". JuMP uses the value of base_name when it has to print the variable as a string.
For example, when we print y at the REPL we get:
julia> y
decision variable
Because y is a Julia variable, we can bind it to a different value. For example, if we write:
julia> y = 1
1
y is no longer a binding to a JuMP variable. This does not mean that the JuMP variable has been destroyed. It still exists and is still a reference to the same optimization variable. The binding can be reset by querying the model for the symbol as it was written in the @variable macro. For example:
julia> model[:y]
decision variable
This act of looking up the JuMP variable by using the symbol is most useful when composing JuMP models across multiple functions, as illustrated by the following example:
function add_component_to_model(model::JuMP.Model)
x = model[:x]
# ... code that uses x
end
function build_model()
model = Model()
@variable(model, x)
add_component_to_model(model)
end
Now that we understand the difference between optimization, JuMP, and Julia variables, we can introduce more of the functionality of the @variable macro.
Variable bounds
We have already seen the basic usage of the @variable macro. The next extension is to add lower- and upper-bounds to each optimization variable. This can be done as follows:
julia> @variable(model, x_free)
x_free
julia> @variable(model, x_lower >= 0)
x_lower
julia> @variable(model, x_upper <= 1)
x_upper
julia> @variable(model, 2 <= x_interval <= 3)
x_interval
julia> @variable(model, x_fixed == 4)
x_fixed
In the above examples, x_free represents an unbounded optimization variable, x_lower represents an optimization variable with a lower bound, and so forth.
When creating a variable with only a lower-bound or an upper-bound, and the value of the bound is not a numeric literal, the name of the variable must appear on the left-hand side. Putting the name on the right-hand side will result in an error. For example:
@variable(model, 1 <= x) # works
a = 1
@variable(model, a <= x) # errors
@variable(model, x >= a) # works
We can query whether an optimization variable has a lower- or upper-bound via the has_lower_bound and has_upper_bound functions. For example:
julia> has_lower_bound(x_free)
false
julia> has_upper_bound(x_upper)
true
If a variable has a lower or upper bound, we can query its value via the lower_bound and upper_bound functions. For example:
julia> lower_bound(x_interval)
2.0
julia> upper_bound(x_interval)
3.0
Querying the value of a bound that does not exist will result in an error.
Instead of using the <= and >= syntax, we can also use the lower_bound and upper_bound keyword arguments. For example:
julia> @variable(model, x, lower_bound=1, upper_bound=2)
x
julia> lower_bound(x)
1.0
Another option is to use the set_lower_bound and set_upper_bound functions. These can also be used to modify an existing variable bound. For example:
julia> @variable(model, x >= 1)
x
julia> lower_bound(x)
1.0
julia> set_lower_bound(x, 2)
julia> lower_bound(x)
2.0
We can delete variable bounds using delete_lower_bound and delete_upper_bound:
julia> @variable(model, 1 <= x <= 2)
x
julia> lower_bound(x)
1.0
julia> delete_lower_bound(x)
julia> has_lower_bound(x)
false
julia> upper_bound(x)
2.0
julia> delete_upper_bound(x)
julia> has_upper_bound(x)
false
In addition to upper and lower bounds, JuMP variables can also be fixed to a value using fix. See also is_fixed, fix_value, and unfix.
julia> @variable(model, x == 1)
x
julia> is_fixed(x)
true
julia> fix_value(x)
1.0
julia> unfix(x)
julia> is_fixed(x)
false
Fixing a variable with existing bounds will throw an error. To delete the bounds prior to fixing, use fix(variable, value; force = true).
julia> @variable(model, x >= 1)
x
julia> fix(x, 2)
ERROR: Unable to fix x to 2 because it has existing variable bounds. Consider calling `JuMP.fix(variable, value; force=true)` which will delete existing bounds before fixing the variable.
julia> fix(x, 2; force = true)
julia> fix_value(x)
2.0
Variable names
The name, i.e. the value of the MOI.VariableName attribute, of a variable can be obtained by JuMP.name(::JuMP.VariableRef) and set by JuMP.set_name(::JuMP.VariableRef, ::String).
JuMP.name — Method.
name(v::VariableRef)::String
Get a variable's name attribute.
JuMP.set_name — Method.
set_name(v::VariableRef, s::AbstractString)
Set a variable's name attribute.
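For example, a brief sketch of these two functions (the variable name var_with_name is ours for illustration; the output shown is what we would expect at the REPL):
julia> @variable(model, var_with_name)
var_with_name
julia> name(var_with_name)
"var_with_name"
julia> set_name(var_with_name, "decision")
julia> name(var_with_name)
"decision"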
The variable can also be retrieved from its name using JuMP.variable_by_name.
JuMP.variable_by_name — Function.
variable_by_name(model::AbstractModel, name::String)::Union{AbstractVariableRef, Nothing}
Returns the reference of the variable with name attribute name, or nothing if no variable has this name attribute. Throws an error if several variables have name as their name attribute.
julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, x)
x
julia> variable_by_name(model, "x")
x
julia> @variable(model, base_name="x")
x
julia> variable_by_name(model, "x")
ERROR: Multiple variables have the name x.
Stacktrace:
[1] error(::String) at ./error.jl:33
[2] get(::MOIU.Model{Float64}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/model.jl:222
[3] get at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/universalfallback.jl:201 [inlined]
[4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MOIU.Model{Float64}}}, ::Type{MathOptInterface.VariableIndex}, ::String) at /home/blegat/.julia/dev/MathOptInterface/src/Utilities/cachingoptimizer.jl:490
[5] variable_by_name(::Model, ::String) at /home/blegat/.julia/dev/JuMP/src/variables.jl:268
[6] top-level scope at none:0
julia> var = @variable(model, base_name="y")
y
julia> variable_by_name(model, "y")
y
julia> set_name(var, "z")
julia> variable_by_name(model, "y")
julia> variable_by_name(model, "z")
z
julia> @variable(model, u[1:2])
2-element Array{VariableRef,1}:
u[1]
u[2]
julia> variable_by_name(model, "u[2]")
u[2]
Variable containers
In the examples above, we have mostly created scalar variables. By scalar, we mean that the Julia variable is bound to exactly one JuMP variable. However, it is often useful to create collections of JuMP variables inside more complicated data structures.
JuMP provides a mechanism for creating three types of these data structures, which we refer to as containers. The three types are Arrays, DenseAxisArrays, and SparseAxisArrays. We explain each of these in the following.
Arrays
We have already seen the creation of an array of JuMP variables with the x[1:2] syntax. This can naturally be extended to create multi-dimensional arrays of JuMP variables. For example:
julia> @variable(model, x[1:2, 1:2])
2×2 Array{VariableRef,2}:
x[1,1] x[1,2]
x[2,1] x[2,2]
Arrays of JuMP variables can be indexed and sliced as follows:
julia> x[1, 2]
x[1,2]
julia> x[2, :]
2-element Array{VariableRef,1}:
x[2,1]
x[2,2]
Variable bounds can depend upon the indices:
julia> @variable(model, x[i=1:2, j=1:2] >= 2i + j)
2×2 Array{VariableRef,2}:
x[1,1] x[1,2]
x[2,1] x[2,2]
julia> lower_bound.(x)
2×2 Array{Float64,2}:
3.0 4.0
5.0 6.0
JuMP will form an Array of JuMP variables when it can determine at compile time that the indices are one-based integer ranges. Therefore x[1:b] will create an Array of JuMP variables, but x[a:b] will not. If JuMP cannot determine that the indices are one-based integer ranges (e.g., in the case of x[a:b]), JuMP will create a DenseAxisArray instead.
DenseAxisArrays
We often want to create arrays where the indices are not one-based integer ranges. For example, we may want to create a variable indexed by the name of a product or a location. The syntax is the same as that above, except with an arbitrary vector as an index as opposed to a one-based range. The biggest difference is that instead of returning an Array of JuMP variables, JuMP will return a DenseAxisArray. For example:
julia> @variable(model, x[1:2, [:A,:B]])
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
Dimension 1, Base.OneTo(2)
Dimension 2, Symbol[:A, :B]
And data, a 2×2 Array{VariableRef,2}:
x[1,A] x[1,B]
x[2,A] x[2,B]
DenseAxisArrays can be indexed and sliced as follows:
julia> x[1, :A]
x[1,A]
julia> x[2, :]
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, Symbol[:A, :B]
And data, a 2-element Array{VariableRef,1}:
x[2,A]
x[2,B]
Similarly to the Array case, bounds can depend upon indices. For example:
julia> @variable(model, x[i=2:3, j=1:2:3] >= 0.5i + j)
2-dimensional DenseAxisArray{VariableRef,2,...} with index sets:
Dimension 1, 2:3
Dimension 2, 1:2:3
And data, a 2×2 Array{VariableRef,2}:
x[2,1] x[2,3]
x[3,1] x[3,3]
julia> lower_bound.(x)
2-dimensional DenseAxisArray{Float64,2,...} with index sets:
Dimension 1, 2:3
Dimension 2, 1:2:3
And data, a 2×2 Array{Float64,2}:
2.0 4.0
2.5 4.5
SparseAxisArrays
The third container type that JuMP natively supports is SparseAxisArray. These arrays are created when the indices do not form a rectangular set. For example, this applies when indices have a dependence upon previous indices (called triangular indexing). JuMP supports this as follows:
julia> @variable(model, x[i=1:2, j=i:2])
JuMP.Containers.SparseAxisArray{VariableRef,2,Tuple{Int64,Int64}} with 3 entries:
[1, 2] = x[1,2]
[2, 2] = x[2,2]
[1, 1] = x[1,1]
We can also conditionally create variables via a JuMP-specific syntax. This syntax appends a comparison check that depends upon the named indices and is separated from the indices by a semi-colon (;). For example:
julia> @variable(model, x[i=1:4; mod(i, 2)==0])
JuMP.Containers.SparseAxisArray{VariableRef,1,Tuple{Int64}} with 2 entries:
[4] = x[4]
[2] = x[2]
Note that with many index dimensions and a large amount of sparsity, variable construction may be unnecessarily slow if the semi-colon syntax is naively applied. When using the semi-colon as a filter, JuMP iterates over all indices and evaluates the conditional for each combination. When this is undesired, the recommended work-around is to work directly with a list of tuples or create a dictionary. Consider the following examples:
N = 10
S = [(1, 1, 1),(N, N, N)]
# Slow. It evaluates conditional N^3 times.
@variable(model, x1[i=1:N, j=1:N, k=1:N; (i, j, k) in S])
# Fast.
@variable(model, x2[S])
# Fast. Manually constructs a dictionary and fills it.
x3 = Dict()
for (i, j, k) in S
x3[i, j, k] = @variable(model)
# Optional, if you care about pretty printing:
set_name(x3[i, j, k], "x[$i,$j,$k]")
end
Forcing the container type
When creating a container of JuMP variables, JuMP will attempt to choose the tightest container type that can store the JuMP variables. Thus, it will prefer to create an Array before a DenseAxisArray and a DenseAxisArray before a SparseAxisArray. However, because this happens at compile time, it does not always make the best choice. To illustrate this, consider the following example:
julia> A = 1:2
1:2
julia> @variable(model, x[A])
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, 1:2
And data, a 2-element Array{VariableRef,1}:
x[1]
x[2]
Since the value (and type) of A is unknown at parsing time, JuMP is unable to infer that A is a one-based integer range. Therefore, JuMP creates a DenseAxisArray, even though it could store these two variables in a standard one-dimensional Array.
We can share our knowledge that it is possible to store these JuMP variables as an array by setting the container keyword:
julia> @variable(model, y[A], container=Array)
2-element Array{VariableRef,1}:
y[1]
y[2]
JuMP now creates a vector of JuMP variables instead of a DenseAxisArray. Note that choosing an invalid container type will throw an error.
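For instance, the following sketch is expected to error, because the index set 2:3 is not one-based and therefore cannot be stored in an Array (the exact error message depends on the JuMP version):
B = 2:3
# Expected to error: an Array container requires one-based integer indices.
@variable(model, z[B], container = Array)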
Integrality utilities
Adding integrality constraints to a model such as @constraint(model, x in MOI.ZeroOne()) and @constraint(model, x in MOI.Integer()) is a common operation. Therefore, JuMP supports two shortcuts for adding such constraints.
Binary (ZeroOne) constraints
Binary optimization variables are constrained to the set $x \in \{0, 1\}$. (The MOI.ZeroOne set in MathOptInterface.) Binary optimization variables can be created in JuMP by passing Bin as an optional positional argument:
julia> @variable(model, x, Bin)
x
We can check if an optimization variable is binary by calling is_binary on the JuMP variable, and binary constraints can be removed with unset_binary.
julia> is_binary(x)
true
julia> unset_binary(x)
julia> is_binary(x)
false
Binary optimization variables can also be created by setting the binary keyword to true.
julia> @variable(model, x, binary=true)
x
Integer constraints
Integer optimization variables are constrained to the set $x \in \mathbb{Z}$. (The MOI.Integer set in MathOptInterface.) Integer optimization variables can be created in JuMP by passing Int as an optional positional argument:
julia> @variable(model, x, Int)
x
Integer optimization variables can also be created by setting the integer keyword to true.
julia> @variable(model, x, integer=true)
x
We can check if an optimization variable is integer by calling is_integer on the JuMP variable, and integer constraints can be removed with unset_integer.
julia> is_integer(x)
true
julia> unset_integer(x)
julia> is_integer(x)
false
Relaxing integrality
The relax_integrality function relaxes all integrality constraints in the model, returning a function that can be called to undo the operation later on.
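For example, a minimal sketch of the intended workflow (see the relax_integrality reference entry below for a complete printed example):
undo_relax = relax_integrality(model)
# ... solve or inspect the relaxed model here ...
undo_relax()  # restores the binary and integrality constraints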
Semidefinite variables
JuMP also supports modeling with semidefinite variables. A square symmetric matrix $X$ is positive semidefinite if all eigenvalues are nonnegative. We can declare a matrix of JuMP variables to be positive semidefinite as follows:
julia> @variable(model, x[1:2, 1:2], PSD)
2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}:
x[1,1] x[1,2]
x[1,2] x[2,2]
or using the syntax for Variables constrained on creation:
julia> @variable(model, x[1:2, 1:2] in PSDCone())
2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}:
x[1,1] x[1,2]
x[1,2] x[2,2]
Note that x must be a square 2-dimensional Array of JuMP variables; it cannot be a DenseAxisArray or a SparseAxisArray. (See Variable containers, above, for more on this.)
You can also impose a weaker constraint that the square matrix is only symmetric (instead of positive semidefinite) as follows:
julia> @variable(model, x[1:2, 1:2], Symmetric)
2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}:
x[1,1] x[1,2]
x[1,2] x[2,2]
Anonymous JuMP variables
In all of the above examples, we have created named JuMP variables. However, it is also possible to create so called anonymous JuMP variables. To create an anonymous JuMP variable, we drop the name of the variable from the macro call. This means dropping the second positional argument if the JuMP variable is a scalar, or dropping the name before the square bracket ([) if a container is being created. For example:
julia> x = @variable(model)
noname
This shows how @variable(model, x) is really short for:
julia> x = model[:x] = @variable(model, base_name="x")
x
An Array of anonymous JuMP variables can be created as follows:
julia> y = @variable(model, [i=1:2])
2-element Array{VariableRef,1}:
noname
noname
If necessary, you can store x in model as follows:
julia> model[:x] = x
The <= and >= short-hand cannot be used to set bounds on anonymous JuMP variables. Instead, you should use the lower_bound and upper_bound keywords.
Passing the Bin and Int variable types is also invalid. Instead, you should use the binary and integer keywords.
Thus, the anonymous variant of @variable(model, x[i=1:2] >= i, Int) is:
julia> x = @variable(model, [i=1:2], base_name="x", lower_bound=i, integer=true)
2-element Array{VariableRef,1}:
x[1]
x[2]
Creating two named JuMP variables with the same name results in an error at runtime. Use anonymous variables as an alternative.
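For example, a sketch of the clash and the work-around (the exact error text depends on the JuMP version):
model = Model()
@variable(model, x)
# @variable(model, x)  # would error: the name x is already registered in model
x2 = @variable(model, base_name = "x")  # anonymous variable that still prints as "x"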
Variables constrained on creation
When using JuMP in Direct mode, it may be necessary to constrain variables on creation instead of adding constraints to free variables afterwards, because the solver may only support variables that are constrained on creation. In Automatic and Manual modes, both ways of adding constraints on variables are equivalent: when the cache is copied to the optimizer, the choice of which variable constraints are copied as variables constrained on creation does not depend on how they were added to the cache.
All uses of the @variable macro documented so far translate to a separate call for variable creation and adding of constraints.
For example, @variable(model, x >= 0, Int), is equivalent to:
@variable(model, x)
set_lower_bound(x, 0.0)
@constraint(model, x in MOI.Integer())
Importantly, the bound and integrality constraints are added after the variable has been created.
However, some solvers require a constraining set at creation time. We say that these variables are constrained on creation.
Use in within @variable to access the special syntax for constraining variables on creation. For example, the following creates a vector of variables constrained on creation to belong to the SecondOrderCone:
julia> @variable(model, y[1:3] in SecondOrderCone())
3-element Array{VariableRef,1}:
y[1]
y[2]
y[3]
For contrast, the more standard approach is as follows:
julia> @variable(model, x[1:3])
3-element Array{VariableRef,1}:
x[1]
x[2]
x[3]
julia> @constraint(model, x in SecondOrderCone())
[x[1], x[2], x[3]] ∈ MathOptInterface.SecondOrderCone(3)
The technical difference between the former and the latter is that the former calls MOI.add_constrained_variables while the latter calls MOI.add_variables and then MOI.add_constraint. This distinction is important only in Direct mode, depending on the solver being used. It's often not possible to delete the SecondOrderCone constraint if it was specified at variable creation time.
The set keyword
An alternate syntax to x in Set is to use the set keyword of @variable. This is most useful when creating anonymous variables:
x = @variable(model, [1:2, 1:2], set = PSDCone())
User-defined containers
In the section Variable containers, we explained how JuMP supports the efficient creation of collections of JuMP variables in three types of containers. However, users are also free to create collections of JuMP variables in their own data structures. For example, the following code creates a dictionary with symmetric matrices as the values:
julia> variables = Dict{Symbol, Array{VariableRef,2}}()
Dict{Symbol,Array{VariableRef,2}} with 0 entries
julia> for key in [:A, :B]
global variables[key] = @variable(model, [1:2, 1:2])
end
julia> variables
Dict{Symbol,Array{VariableRef,2}} with 2 entries:
:A => VariableRef[noname noname; noname noname]
:B => VariableRef[noname noname; noname noname]
Deleting variables
JuMP supports the deletion of optimization variables. To delete variables, we can use the delete method. We can also check whether x is a valid JuMP variable in model using the is_valid method:
julia> @variable(model, x)
x
julia> is_valid(model, x)
true
julia> delete(model, x)
julia> is_valid(model, x)
false
Listing all variables
Use JuMP.all_variables to obtain a list of all variables present in the model. This is useful for performing operations like:
- relaxing all integrality constraints in the model
- setting the starting values for variables to the result of the last solve
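For example, a small sketch of the first use case, removing the integrality constraint from every integer variable by hand (the relax_integrality function described above does this, and more, automatically):
for v in all_variables(model)
    if is_integer(v)
        unset_integer(v)
    end
end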
Start values
There are two ways to provide a primal starting solution (also called MIP-start or a warmstart) for each variable:
- using the start keyword in the @variable macro
- using set_start_value
The starting value of a variable can be queried using start_value. If no start value has been set, start_value will return nothing.
julia> @variable(model, x)
x
julia> start_value(x)
julia> @variable(model, y, start = 1)
y
julia> start_value(y)
1.0
julia> set_start_value(y, 2)
julia> start_value(y)
2.0
Prior to JuMP 0.19, the previous solution to a solve was automatically set as the new starting value. JuMP 0.19 no longer does this automatically. To reproduce the functionality, use:
set_start_value.(all_variables(model), value.(all_variables(model)))
The @variables macro
If you have many @variable calls, JuMP provides the macro @variables that can improve readability:
julia> @variables(model, begin
x
y[i=1:2] >= i, (start = i, base_name = "Y_$i")
z, Bin
end)
julia> print(model)
Feasibility
Subject to
Y_1[1] ≥ 1.0
Y_2[2] ≥ 2.0
z binary
Keyword arguments must be contained within parentheses. (See the example above.)
Reference
JuMP.@variable — Macro.
@variable(model, kw_args...)
Add an anonymous variable to the model model described by the keyword arguments kw_args and return the variable.
@variable(model, expr, args..., kw_args...)
Add a variable to the model model described by the expression expr, the positional arguments args, and the keyword arguments kw_args. The expression expr can be (note that in the following the symbol <= can be used instead of ≤ and the symbol >= can be used instead of ≥):
- of the form varexpr, creating variables described by varexpr;
- of the form varexpr ≤ ub (resp. varexpr ≥ lb), creating variables described by varexpr with upper bounds given by ub (resp. lower bounds given by lb);
- of the form varexpr == value, creating variables described by varexpr with fixed values given by value;
- of the form lb ≤ varexpr ≤ ub or ub ≥ varexpr ≥ lb, creating variables described by varexpr with lower bounds given by lb and upper bounds given by ub; or
- of the form varexpr in set, creating variables described by varexpr constrained to belong to set; see Variables constrained on creation.
The expression varexpr can either be
- of the form varname, creating a scalar real variable of name varname; or
- of the form varname[...] or [...], creating a container of variables (see Containers in macros).
The recognized positional arguments in args are the following:
- Bin: Sets the variable to be binary, i.e. either 0 or 1.
- Int: Sets the variable to be integer, i.e. one of ..., -2, -1, 0, 1, 2, ...
- Symmetric: Only available when creating a square matrix of variables, i.e. when varexpr is of the form varname[1:n,1:n] or varname[i=1:n,j=1:n]. It creates a symmetric matrix of variables, that is, it only creates a new variable for varname[i,j] with i ≤ j and sets varname[j,i] to the same variable as varname[i,j]. It is equivalent to using varexpr in SymMatrixSpace() as expr.
- PSD: The square matrix of variables is both Symmetric and constrained to be positive semidefinite. It is equivalent to using varexpr in PSDCone() as expr.
The recognized keyword arguments in kw_args are the following:
- base_name: Sets the name prefix used to generate variable names. It corresponds to the variable name for a scalar variable; otherwise, the variable names are set to base_name[...] for each index ... of the axes axes.
- lower_bound: Sets the value of the variable lower bound.
- upper_bound: Sets the value of the variable upper bound.
- start: Sets the variable starting value used as initial guess in optimization.
- binary: Sets whether the variable is binary or not.
- integer: Sets whether the variable is integer or not.
- variable_type: See the "Note for extending the variable macro" section below.
- set: Equivalent to using varexpr in value as expr, where value is the value of the keyword argument.
- container: Specify the container type; see Containers in macros.
Examples
The following are equivalent ways of creating a variable x of name x with lower bound 0:
# Specify everything in `expr`
@variable(model, x >= 0)
# Specify the lower bound using a keyword argument
@variable(model, x, lower_bound=0)
# Specify everything in `kw_args`
x = @variable(model, base_name="x", lower_bound=0)
The following are equivalent ways of creating a DenseAxisArray of index set [:a, :b] and with respective upper bounds 2 and 3 and names x[a] and x[b]. The upper bound can either be specified in expr:
ub = Dict(:a => 2, :b => 3)
@variable(model, x[i=keys(ub)] <= ub[i])
# output
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, Symbol[:a, :b]
And data, a 2-element Array{VariableRef,1}:
x[a]
x[b]
or it can be specified with the upper_bound keyword argument:
@variable(model, y[i=keys(ub)], upper_bound=ub[i])
# output
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, Symbol[:a, :b]
And data, a 2-element Array{VariableRef,1}:
y[a]
y[b]
Note for extending the variable macro
The single scalar variable or each scalar variable of the container are created using add_variable(model, build_variable(_error, info, extra_args...; extra_kw_args...)) where
- model is the model passed to the @variable macro;
- _error is an error function with a single String argument showing the @variable call in addition to the error message given as argument;
- info is the VariableInfo struct containing the information gathered in expr, the recognized keyword arguments (except base_name and variable_type), and the recognized positional arguments (except Symmetric and PSD);
- extra_args are the unrecognized positional arguments of args plus the value of the variable_type keyword argument if present. The variable_type keyword argument allows the user to pass a positional argument to build_variable without the need to give a positional argument to @variable. In particular, this allows the user to give a positional argument to the build_variable call when using the anonymous single variable syntax @variable(model, kw_args...); and
- extra_kw_args are the unrecognized keyword arguments of kw_args.
Examples
The following creates a variable x of name x with lower_bound 0, as in the first example above, but does it without using the @variable macro:
info = VariableInfo(true, 0, false, NaN, false, NaN, false, NaN, false, false)
JuMP.add_variable(model, JuMP.build_variable(error, info), "x")
The following creates a DenseAxisArray of index set [:a, :b] and with respective upper bounds 2 and 3 and names x[a] and x[b], as in the second example above, but does it without using the @variable macro:
# Without the `@variable` macro
x = JuMP.Containers.container(i -> begin
        info = VariableInfo(false, NaN, true, ub[i], false, NaN, false, NaN, false, false)
        JuMP.add_variable(model, JuMP.build_variable(error, info), "x[$i]")
    end, JuMP.Containers.vectorized_product(keys(ub)))
# output
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
Dimension 1, Symbol[:a, :b]
And data, a 2-element Array{VariableRef,1}:
x[a]
x[b]
The following are equivalent ways of creating a Matrix of size N x N with custom variables created with a JuMP extension, using the Poly(X) positional argument to specify its variables:
# Using the `@variable` macro
@variable(model, x[1:N,1:N], Symmetric, Poly(X))
# Without the `@variable` macro
x = Matrix{JuMP.variable_type(model, Poly(X))}(undef, N, N)
info = VariableInfo(false, NaN, false, NaN, false, NaN, false, NaN, false, false)
for i in 1:N, j in i:N
x[i,j] = x[j,i] = JuMP.add_variable(model, build_variable(error, info, Poly(X)), "x[$i,$j]")
end
JuMP.@variables — Macro.
@variables(m, args...)
Adds multiple variables to the model at once, in the same fashion as the @variable macro. The model must be the first argument, and multiple variables can be added on multiple lines wrapped in a begin ... end block. For example:
@variables(m, begin
x
y[i = 1:2] >= 0, (start = i)
z, Bin, (start = 0, base_name = "Z")
end)
Keyword arguments must be contained within parentheses (refer to the example above).
JuMP.owner_model — Function.
owner_model(s::AbstractJuMPScalar)
Return the model owning the scalar s.
JuMP.VariableRef — Type.
VariableRef <: AbstractVariableRef
Holds a reference to the model and the corresponding MOI.VariableIndex.
JuMP.all_variables — Function.
all_variables(model::Model)::Vector{VariableRef}
Returns a list of all variables currently in the model. The variables are ordered by creation time.
Example
model = Model()
@variable(model, x)
@variable(model, y)
all_variables(model)
# output
2-element Array{VariableRef,1}:
x
y
JuMP.num_variables — Function.
num_variables(model::Model)::Int64
Returns the number of variables in model.
JuMP.has_lower_bound — Function.
has_lower_bound(v::VariableRef)
Return true if v has a lower bound. If true, the lower bound can be queried with lower_bound. See also LowerBoundRef.
JuMP.lower_bound — Function.
lower_bound(v::VariableRef)
Return the lower bound of a variable. Error if one does not exist. See also has_lower_bound.
JuMP.set_lower_bound — Function.
set_lower_bound(v::VariableRef, lower::Number)
Set the lower bound of a variable. If one does not exist, create a new lower bound constraint. See also delete_lower_bound.
JuMP.delete_lower_bound — Function.
delete_lower_bound(v::VariableRef)
Delete the lower bound constraint of a variable.
JuMP.has_upper_bound — Function.
has_upper_bound(v::VariableRef)
Return true if v has an upper bound. If true, the upper bound can be queried with upper_bound. See also UpperBoundRef.
JuMP.upper_bound — Function.
upper_bound(v::VariableRef)
Return the upper bound of a variable. Error if one does not exist. See also has_upper_bound.
JuMP.set_upper_bound — Function.
set_upper_bound(v::VariableRef, upper::Number)
Set the upper bound of a variable. If one does not exist, create an upper bound constraint. See also delete_upper_bound.
JuMP.delete_upper_bound — Function.
delete_upper_bound(v::VariableRef)
Delete the upper bound constraint of a variable.
JuMP.is_fixed — Function.
is_fixed(v::VariableRef)
Return true if v is a fixed variable. If true, the fixed value can be queried with fix_value.
JuMP.fix_value — Function.
fix_value(v::VariableRef)
Return the value to which a variable is fixed. Error if one does not exist. See also is_fixed.
JuMP.fix — Function.
fix(v::VariableRef, value::Number; force::Bool = false)
Fix a variable to a value. Update the fixing constraint if one exists, otherwise create a new one. See also unfix.
If the variable already has variable bounds and force=false, calling fix will throw an error. If force=true, existing variable bounds will be deleted, and the fixing constraint will be added. Note a variable will have no bounds after a call to unfix.
JuMP.unfix — Function.
unfix(v::VariableRef)
Delete the fixing constraint of a variable.
JuMP.is_integer — Function.
is_integer(v::VariableRef)
Return true if v is constrained to be integer. See also IntegerRef.
JuMP.set_integer — Function.
set_integer(variable_ref::VariableRef)
Add an integrality constraint on the variable variable_ref. See also unset_integer.
JuMP.unset_integer — Function.
unset_integer(variable_ref::VariableRef)
Remove the integrality constraint on the variable variable_ref.
JuMP.IntegerRef — Function.
IntegerRef(v::VariableRef)
Return a constraint reference to the constraint constraining v to be integer. Errors if one does not exist.
JuMP.is_binary — Function.
is_binary(v::VariableRef)
Return true if v is constrained to be binary. See also BinaryRef.
JuMP.set_binary — Function.
set_binary(v::VariableRef)
Add a constraint on the variable v that it must take values in the set $\{0,1\}$. See also unset_binary.
JuMP.unset_binary — Function.
unset_binary(variable_ref::VariableRef)
Remove the binary constraint on the variable variable_ref.
JuMP.BinaryRef — Function.
BinaryRef(v::VariableRef)
Return a constraint reference to the constraint constraining v to be binary. Errors if one does not exist.
JuMP.relax_integrality — Function.
relax_integrality(model::Model)
Modifies model to "relax" all binary and integrality constraints on variables. Specifically,
- Binary constraints are deleted, and variable bounds are tightened if necessary to ensure the variable is constrained to the interval $[0, 1]$.
- Integrality constraints are deleted without modifying variable bounds.
- An error is thrown if semi-continuous or semi-integer constraints are present (support may be added for these in the future).
- All other constraints are ignored (left in place). This includes discrete constraints like SOS and indicator constraints.
Returns a function that can be called without any arguments to restore the original model. The behavior of this function is undefined if additional changes are made to the affected variables in the meantime.
Example
julia> model = Model();
julia> @variable(model, x, Bin);
julia> @variable(model, 1 <= y <= 10, Int);
julia> @objective(model, Min, x + y);
julia> undo_relax = relax_integrality(model);
julia> print(model)
Min x + y
Subject to
x ≥ 0.0
y ≥ 1.0
x ≤ 1.0
y ≤ 10.0
julia> undo_relax()
julia> print(model)
Min x + y
Subject to
y ≥ 1.0
y ≤ 10.0
y integer
x binary
JuMP.index — Method.
index(v::VariableRef)::MOI.VariableIndex
Return the index of the variable that corresponds to v in the MOI backend.
JuMP.optimizer_index — Method.
optimizer_index(v::VariableRef)::MOI.VariableIndex
Return the index of the variable that corresponds to v in the optimizer model. It throws NoOptimizer if no optimizer is set and throws an ErrorException if the optimizer is set but is not attached.
JuMP.set_start_value — Function.
set_start_value(variable::VariableRef, value::Number)
Set the start value (MOI attribute VariablePrimalStart) of the variable v to value. See also start_value.
Note: VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".
JuMP.start_value — Function.
start_value(v::VariableRef)
Return the start value (MOI attribute VariablePrimalStart) of the variable v. See also set_start_value.
Note: VariablePrimalStarts are sometimes called "MIP-starts" or "warmstarts".
JuMP.reduced_cost — Function.
reduced_cost(x::VariableRef)::Float64
Return the reduced cost associated with variable x.
Equivalent to querying the shadow price of the active variable bound (if one exists and is active).
See also: shadow_price.
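A minimal sketch of how reduced_cost might be used after a solve (this assumes the GLPK solver package is installed; the value in the comment is what we would expect for this toy problem, not verified output):
using JuMP, GLPK
model = Model(GLPK.Optimizer)
@variable(model, x <= 1)
@objective(model, Max, 2x)
optimize!(model)
reduced_cost(x)  # expected to be 2.0: the active bound x <= 1 has shadow price 2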