# Solutions

More information can be found in the Solutions section of the manual.

## Basic utilities

`JuMP.optimize!`

— Function

```
optimize!(
    model::Model;
    ignore_optimize_hook = (model.optimize_hook === nothing),
    _differentiation_backend::MOI.Nonlinear.AbstractAutomaticDifferentiation =
        MOI.Nonlinear.SparseReverseMode(),
    kwargs...,
)
```

Optimize the model.

If an optimizer has not been set yet (see `set_optimizer`), a `NoOptimizer` error is thrown.

If `ignore_optimize_hook == true`, the optimize hook is ignored and the model is solved as if the hook was not set. Keyword arguments `kwargs` are passed to the `optimize_hook`. An error is thrown if `optimize_hook` is `nothing` and keyword arguments are provided.

**Experimental features**

These features may change or be removed in any future version of JuMP.

Pass `_differentiation_backend` to set the `MOI.Nonlinear.AbstractAutomaticDifferentiation` backend used to compute derivatives of nonlinear programs.

If you require only `:ExprGraph`, it is more efficient to pass `_differentiation_backend = MOI.Nonlinear.ExprGraphOnly()`.
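As a sketch of the typical workflow (the HiGHS solver is an assumption here; any installed solver package works the same way):

```julia
using JuMP, HiGHS

# Provide the optimizer at construction time...
model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@objective(model, Min, 2x)
optimize!(model)

# ...or attach it afterwards with set_optimizer. Calling optimize!
# before either step throws a NoOptimizer error.
model2 = Model()
set_optimizer(model2, HiGHS.Optimizer)
```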

`JuMP.NoOptimizer`

— Type`struct NoOptimizer <: Exception end`

No optimizer is set. The optimizer can be provided to the `Model` constructor or by calling `set_optimizer`.

`JuMP.OptimizeNotCalled`

— Type`struct OptimizeNotCalled <: Exception end`

A result attribute cannot be queried before `optimize!` is called.

`JuMP.solution_summary`

— Function`solution_summary(model::Model; result::Int = 1, verbose::Bool = false)`

Return a struct that can be used to print a summary of the solution in result `result`.

If `verbose=true`, write out the primal solution for every variable and the dual solution for every constraint, excluding those with empty names.

**Examples**

When called at the REPL, the summary is automatically printed:

```
julia> solution_summary(model)
[...]
```

Use `print` to force the printing of the summary from inside a function:

```
function foo(model)
    print(solution_summary(model))
    return
end
```

## Termination status

`JuMP.termination_status`

— Function`termination_status(model::Model)`

Return a `MOI.TerminationStatusCode` describing why the solver stopped (i.e., the `MOI.TerminationStatus` attribute).

`JuMP.raw_status`

— Function`raw_status(model::Model)`

Return the reason why the solver stopped in its own words (i.e., the MathOptInterface model attribute `RawStatusString`).

`JuMP.result_count`

— Function`result_count(model::Model)`

Return the number of results available to query after a call to `optimize!`.
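These status queries are commonly combined into a defensive check before any results are read. The helper below is an illustrative sketch, not part of the JuMP API:

```julia
using JuMP

# Illustrative helper: verify the solve outcome before querying results.
function assert_solved(model)
    status = termination_status(model)
    if status == MOI.OPTIMAL
        return  # proven optimal; safe to query values
    elseif has_values(model)
        # Solver stopped early (e.g., time limit) but returned a point.
        @warn "Solver stopped early" status raw_status(model)
    else
        error("No solution available: $(raw_status(model))")
    end
end
```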

## Primal solutions

`JuMP.primal_status`

— Function`primal_status(model::Model; result::Int = 1)`

Return a `MOI.ResultStatusCode` describing the status of the most recent primal solution of the solver (i.e., the `MOI.PrimalStatus` attribute) associated with the result index `result`.

See also: `result_count`.

`JuMP.has_values`

— Function`has_values(model::Model; result::Int = 1)`

Return `true` if the solver has a primal solution in result index `result` available to query, otherwise return `false`.

See also `value` and `result_count`.

`JuMP.value`

— Function`value(con_ref::ConstraintRef; result::Int = 1)`

Return the primal value of constraint `con_ref` associated with result index `result` of the most-recent solution returned by the solver.

That is, if `con_ref` is the reference of a constraint `func`-in-`set`, it returns the value of `func` evaluated at the value of the variables (given by `value(::VariableRef)`).

Use `has_values` to check if a result exists before asking for values.

See also: `result_count`.

**Note**

For scalar constraints, the constant is moved to the `set` so it is not taken into account in the primal value of the constraint. For instance, the constraint `@constraint(model, 2x + 3y + 1 == 5)` is transformed into `2x + 3y`-in-`MOI.EqualTo(4)`, so the value returned by this function is the evaluation of `2x + 3y`.

`value(var_value::Function, con_ref::ConstraintRef)`

Evaluate the primal value of the constraint `con_ref` using `var_value(v)` as the value for each variable `v`.

`value(v::VariableRef; result = 1)`

Return the value of variable `v` associated with result index `result` of the most-recent solution returned by the solver.

Use `has_values` to check if a result exists before asking for values.

See also: `result_count`.

`value(var_value::Function, v::VariableRef)`

Evaluate the value of the variable `v` as `var_value(v)`.

`value(var_value::Function, ex::GenericAffExpr)`

Evaluate `ex` using `var_value(v)` as the value for each variable `v`.
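The functional form needs no solver: it evaluates the expression at whatever point the supplied function describes. For example:

```julia
using JuMP

model = Model()
@variable(model, x)
@variable(model, y)
ex = 2x + 3y + 1

# Evaluate the affine expression at x = 2, y = 1 without calling optimize!:
point = Dict(x => 2.0, y => 1.0)
value(v -> point[v], ex)  # 2 * 2.0 + 3 * 1.0 + 1 = 8.0
```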

`value(v::GenericAffExpr; result::Int = 1)`

Return the value of the `GenericAffExpr` `v` associated with result index `result` of the most-recent solution returned by the solver.

See also: `result_count`.

`value(var_value::Function, ex::GenericQuadExpr)`

Evaluate `ex` using `var_value(v)` as the value for each variable `v`.

`value(v::GenericQuadExpr; result::Int = 1)`

Return the value of the `GenericQuadExpr` `v` associated with result index `result` of the most-recent solution returned by the solver.

Replaces `getvalue` for most use cases.

See also: `result_count`.

`value(p::NonlinearParameter)`

Return the current value stored in the nonlinear parameter `p`.

**Example**

```
model = Model()
@NLparameter(model, p == 10)
value(p)
# output
10.0
```

`value(ex::NonlinearExpression; result::Int = 1)`

Return the value of the `NonlinearExpression` `ex` associated with result index `result` of the most-recent solution returned by the solver.

Replaces `getvalue` for most use cases.

See also: `result_count`.

`value(var_value::Function, ex::NonlinearExpression)`

Evaluate `ex` using `var_value(v)` as the value for each variable `v`.

`value(c::NonlinearConstraintRef; result::Int = 1)`

Return the value of the `NonlinearConstraintRef` `c` associated with result index `result` of the most-recent solution returned by the solver.

See also: `result_count`.

`value(var_value::Function, c::NonlinearConstraintRef)`

Evaluate `c` using `var_value(v)` as the value for each variable `v`.
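The same functional form also evaluates nonlinear expressions without a solver, for example:

```julia
using JuMP

model = Model()
@variable(model, x)
ex = @NLexpression(model, sin(x)^2 + cos(x)^2)

# Evaluate at x = 0.5; the identity gives 1.0 for any point.
value(v -> 0.5, ex)  # ≈ 1.0
```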

## Dual solutions

`JuMP.dual_status`

— Function`dual_status(model::Model; result::Int = 1)`

Return a `MOI.ResultStatusCode` describing the status of the most recent dual solution of the solver (i.e., the `MOI.DualStatus` attribute) associated with the result index `result`.

See also: `result_count`.

`JuMP.has_duals`

— Function`has_duals(model::Model; result::Int = 1)`

Return `true` if the solver has a dual solution in result index `result` available to query, otherwise return `false`.

See also `dual`, `shadow_price`, and `result_count`.

`JuMP.dual`

— Function`dual(con_ref::ConstraintRef; result::Int = 1)`

Return the dual value of constraint `con_ref` associated with result index `result` of the most-recent solution returned by the solver.

Use `has_duals` to check if a result exists before asking for values.

See also: `result_count`, `shadow_price`.

`dual(c::NonlinearConstraintRef)`

Return the dual of the nonlinear constraint `c`.

`JuMP.shadow_price`

— Function`shadow_price(con_ref::ConstraintRef)`

Return the change in the objective from an infinitesimal relaxation of the constraint.

This value is computed from `dual` and can be queried only when `has_duals` is `true` and the objective sense is `MIN_SENSE` or `MAX_SENSE` (not `FEASIBILITY_SENSE`). For linear constraints, the shadow prices differ at most in sign from the `dual` value depending on the objective sense.

See also `reduced_cost`.

**Notes**

- The function simply translates signs from `dual` and does not validate the conditions needed to guarantee the sensitivity interpretation of the shadow price. The caller is responsible, e.g., for checking whether the solver converged to an optimal primal-dual pair or a proof of infeasibility.
- The computation is based on the current objective sense of the model. If this has changed since the last solve, the results will be incorrect.
- Relaxation of equality constraints (and hence the shadow price) is defined based on which sense of the equality constraint is active.
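A small sketch of the sign relationship (HiGHS is an assumed solver; the values follow from minimizing `3x` subject to `x >= 2`, so relaxing the constraint lowers the optimal objective):

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, x >= 0)
@constraint(model, con, x >= 2)
@objective(model, Min, 3x)
optimize!(model)

dual(con)          # 3.0: nonnegative dual on a >= constraint when minimizing
shadow_price(con)  # -3.0: relaxing x >= 2 decreases the minimal objective
```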

`JuMP.reduced_cost`

— Function`reduced_cost(x::VariableRef)::Float64`

Return the reduced cost associated with variable `x`.

Equivalent to querying the shadow price of the active variable bound (if one exists and is active).

See also: `shadow_price`.

## Basic attributes

`JuMP.objective_value`

— Function`objective_value(model::Model; result::Int = 1)`

Return the objective value associated with result index `result` of the most-recent solution returned by the solver.

For scalar-valued objectives, this function returns a `Float64`. For vector-valued objectives, it returns a `Vector{Float64}`.

See also: `result_count`.

`JuMP.objective_bound`

— Function`objective_bound(model::Model)`

Return the best known bound on the optimal objective value after a call to `optimize!(model)`.

For scalar-valued objectives, this function returns a `Float64`. For vector-valued objectives, it returns a `Vector{Float64}`.

In the case of a vector-valued objective, this returns the *ideal point*, that is, the point obtained if each objective was optimized independently.

`JuMP.dual_objective_value`

— Function`dual_objective_value(model::Model; result::Int = 1)`

Return the value of the objective of the dual problem associated with result index `result` of the most-recent solution returned by the solver.

Throws `MOI.UnsupportedAttribute{MOI.DualObjectiveValue}` if the solver does not support this attribute.

See also: `result_count`.

`JuMP.solve_time`

— Function`solve_time(model::Model)`

If available, returns the solve time reported by the solver. If the attribute is not implemented, throws an `ArgumentError` of the form "ModelLike of type `Solver.Optimizer` does not support accessing the attribute MathOptInterface.SolveTimeSec()".

`JuMP.relative_gap`

— Function`relative_gap(model::Model)`

Return the final relative optimality gap after a call to `optimize!(model)`. The exact value depends on the implementation of `MathOptInterface.RelativeGap()` by the particular solver used for optimization.

`JuMP.simplex_iterations`

— Function`simplex_iterations(model::Model)`

Gets the cumulative number of simplex iterations during the most-recent optimization.

Solvers must implement `MOI.SimplexIterations()` to use this function.

`JuMP.barrier_iterations`

— Function`barrier_iterations(model::Model)`

Gets the cumulative number of barrier iterations during the most recent optimization.

Solvers must implement `MOI.BarrierIterations()` to use this function.

`JuMP.node_count`

— Function`node_count(model::Model)`

Gets the total number of branch-and-bound nodes explored during the most recent optimization in a Mixed Integer Program.

Solvers must implement `MOI.NodeCount()` to use this function.

## Conflicts

`JuMP.compute_conflict!`

— Function`compute_conflict!(model::Model)`

Compute a conflict if the model is infeasible. If an optimizer has not been set yet (see `set_optimizer`), a `NoOptimizer` error is thrown.

The status of the conflict can be checked with the `MOI.ConflictStatus` model attribute. Then, the status for each constraint can be queried with the `MOI.ConstraintConflictStatus` attribute.

`JuMP.copy_conflict`

— Function`copy_conflict(model::Model)`

Return a copy of the current conflict for the model `model` and a `ReferenceMap` that can be used to obtain the variable and constraint reference of the new model corresponding to a given `model`'s reference.

This is a convenience function that provides a filtering function for `copy_model`.

**Note**

Model copy is not supported in `DIRECT` mode, i.e., when a model is constructed using the `direct_model` constructor instead of the `Model` constructor. Moreover, independently of whether an optimizer was provided at model construction, the new model will have no optimizer, i.e., an optimizer will have to be provided to the new model in the `optimize!` call.

**Examples**

In the following example, a model `model` is constructed with a variable `x` and two constraints `cref` and `cref2`. This model has no solution, as the two constraints are mutually exclusive. The solver is asked to compute a conflict with `compute_conflict!`. The parts of `model` participating in the conflict are then copied into a model `new_model`.

```
# You must use a solver that supports conflict refining/IIS
# computation, like CPLEX or Gurobi.
model = Model()
@variable(model, x)
@constraint(model, cref, x >= 2)
@constraint(model, cref2, x <= 1)
compute_conflict!(model)
if MOI.get(model, MOI.ConflictStatus()) != MOI.CONFLICT_FOUND
    error("No conflict could be found for an infeasible model.")
end
new_model, reference_map = copy_conflict(model)
```

## Sensitivity

`JuMP.lp_sensitivity_report`

— Function`lp_sensitivity_report(model::Model; atol::Float64 = 1e-8)::SensitivityReport`

Given a linear program `model` with a current optimal basis, return a `SensitivityReport` object, which maps:

- Every variable reference to a tuple `(d_lo, d_hi)::Tuple{Float64,Float64}`, explaining how much the objective coefficient of the corresponding variable can change by, such that the original basis remains optimal.
- Every constraint reference to a tuple `(d_lo, d_hi)::Tuple{Float64,Float64}`, explaining how much the right-hand side of the corresponding constraint can change by, such that the basis remains optimal.

Both tuples are relative, rather than absolute. So given an objective coefficient of `1.0` and a tuple `(-0.5, 0.5)`, the objective coefficient can range between `1.0 - 0.5` and `1.0 + 0.5`.

`atol` is the primal/dual optimality tolerance, and should match the tolerance of the solver used to compute the basis.

Note: interval constraints are NOT supported.

**Example**

```
model = Model(HiGHS.Optimizer)
@variable(model, -1 <= x <= 2)
@objective(model, Min, x)
optimize!(model)
report = lp_sensitivity_report(model; atol = 1e-7)
dx_lo, dx_hi = report[x]
println(
    "The objective coefficient of `x` can decrease by $dx_lo or " *
    "increase by $dx_hi.",
)
c = LowerBoundRef(x)
dRHS_lo, dRHS_hi = report[c]
println(
    "The lower bound of `x` can decrease by $dRHS_lo or increase " *
    "by $dRHS_hi.",
)
```

`JuMP.SensitivityReport`

— Type`SensitivityReport`

## Feasibility

`JuMP.primal_feasibility_report`

— Function

```
primal_feasibility_report(
    model::Model,
    point::AbstractDict{VariableRef,Float64} = _last_primal_solution(model),
    atol::Float64 = 0.0,
    skip_missing::Bool = false,
)::Dict{Any,Float64}
```

Given a dictionary `point`, which maps variables to primal values, return a dictionary whose keys are the constraints with an infeasibility greater than the supplied tolerance `atol`. The value corresponding to each key is the respective infeasibility. Infeasibility is defined as the distance between the primal value of the constraint (see `MOI.ConstraintPrimal`) and the nearest point by Euclidean distance in the corresponding set.

**Notes**

- If `skip_missing = true`, constraints containing variables that are not in `point` will be ignored.
- If `skip_missing = false` and a partial primal solution is provided, an error will be thrown.
- If no point is provided, the primal solution from the last time the model was solved is used.

**Examples**

```
julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1);

julia> primal_feasibility_report(model, Dict(x => 0.2))
Dict{Any,Float64} with 1 entry:
  x ≥ 0.5 => 0.3
```

```
primal_feasibility_report(
    point::Function,
    model::Model;
    atol::Float64 = 0.0,
    skip_missing::Bool = false,
)
```

A form of `primal_feasibility_report` where a function is passed as the first argument instead of a dictionary as the second argument.

**Examples**

```
julia> model = Model();

julia> @variable(model, 0.5 <= x <= 1);

julia> primal_feasibility_report(model) do v
           return value(v)
       end
Dict{Any,Float64} with 1 entry:
  x ≥ 0.5 => 0.3
```