# Reference

## DiffOpt.BackwardInVariablePrimal (Type)

```julia
BackwardInVariablePrimal <: MOI.AbstractVariableAttribute
```

An MOI.AbstractVariableAttribute to set input data for backward differentiation, that is, problem solution.

For instance, to set the tangent of the variable of index vi, do the following:

```julia
MOI.set(model, DiffOpt.BackwardInVariablePrimal(), vi, value)
```
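As an illustration of where this attribute sits in a full reverse-differentiation pass, here is a minimal sketch. It assumes GLPK as the inner solver and uses DiffOpt.backward, following the pre-1.0 DiffOpt API described on this page; the problem data is illustrative.

```julia
using DiffOpt, GLPK
import MathOptInterface
const MOI = MathOptInterface

# min x  s.t.  x >= 1
model = DiffOpt.diff_optimizer(GLPK.Optimizer)
x = MOI.add_variable(model)
MOI.add_constraint(model, 1.0 * MOI.SingleVariable(x), MOI.GreaterThan(1.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(model, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(),
        1.0 * MOI.SingleVariable(x))
MOI.optimize!(model)

# Seed the backward pass: tangent 1.0 for the solution value of x ...
MOI.set(model, DiffOpt.BackwardInVariablePrimal(), x, 1.0)
# ... and propagate it back to the problem data.
DiffOpt.backward(model)
```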
## DiffOpt.BackwardOutConstraint (Type)

```julia
BackwardOutConstraint <: MOI.AbstractConstraintAttribute
```

An MOI.AbstractConstraintAttribute to get output data from backward differentiation, that is, problem input data.

For instance, if the following returns x + 2y + 5, the tangent has coordinate 1 for the coefficient of x, coordinate 2 for the coefficient of y, and 5 for the function constant. If the constraint is of the form func == constant or func <= constant, the tangent for the constant on the right-hand side is -5.

```julia
MOI.get(model, DiffOpt.BackwardOutConstraint(), ci)
```
## DiffOpt.BackwardOutObjective (Type)

```julia
BackwardOutObjective <: MOI.AbstractModelAttribute
```

An MOI.AbstractModelAttribute to get output data from backward differentiation, that is, problem input data.

For instance, to get the tangent of the objective function corresponding to the tangent given to BackwardInVariablePrimal, do the following:

```julia
func = MOI.get(model, DiffOpt.BackwardOutObjective())
```

Then, to get the sensitivity of the linear term with variable x, do

```julia
JuMP.coefficient(func, x)
```

To get the sensitivity with respect to the quadratic term with variables x and y, do either

```julia
JuMP.coefficient(func, x, y)
```

or

```julia
DiffOpt.quad_sym_half(func, x, y)
```

**Warning**: these two calls are not equivalent when x == y; see quad_sym_half for details on the difference between the two functions.
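As a sketch of how these sensitivities are read off in practice (assuming a model that has already been solved and differentiated with DiffOpt.backward, with JuMP variables x and y):

```julia
func = MOI.get(model, DiffOpt.BackwardOutObjective())
dq_x  = JuMP.coefficient(func, x)           # sensitivity of the linear term in x
dQ_xy = DiffOpt.quad_sym_half(func, x, y)   # sensitivity of the quadratic term in x, y
```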
## DiffOpt.ForwardInConstraint (Type)

```julia
ForwardInConstraint <: MOI.AbstractConstraintAttribute
```

An MOI.AbstractConstraintAttribute to set input data for forward differentiation, that is, problem input data.

For instance, if the scalar constraint of index ci contains θ * (x + 2y) <= 5θ, then to compute the derivative with respect to θ, the following should be set:

```julia
fx = MOI.SingleVariable(x)
fy = MOI.SingleVariable(y)
MOI.set(model, DiffOpt.ForwardInConstraint(), ci, 1.0 * fx + 2.0 * fy - 5.0)
```

Note the -5: ForwardInConstraint sets the tangent of the ConstraintFunction, so we consider the expression θ * (x + 2y - 5).
## DiffOpt.ForwardInObjective (Type)

```julia
ForwardInObjective <: MOI.AbstractModelAttribute
```

An MOI.AbstractModelAttribute to set input data for forward differentiation, that is, problem input data. The possible values are any MOI.AbstractScalarFunction. A MOI.ScalarQuadraticFunction can only be used in linearly constrained quadratic models.

For instance, if the objective contains θ * (x + 2y), then to compute the derivative with respect to θ, the following should be set:

```julia
fx = MOI.SingleVariable(x)
fy = MOI.SingleVariable(y)
MOI.set(model, DiffOpt.ForwardInObjective(), 1.0 * fx + 2.0 * fy)
```

where x and y are the relevant MOI.VariableIndex.
## DiffOpt.ForwardOutVariablePrimal (Type)

```julia
ForwardOutVariablePrimal <: MOI.AbstractVariableAttribute
```

An MOI.AbstractVariableAttribute to get output data from forward differentiation, that is, problem solution.

For instance, to get the tangent of the variable of index vi corresponding to the tangents given to ForwardInObjective and ForwardInConstraint, do the following:

```julia
MOI.get(model, DiffOpt.ForwardOutVariablePrimal(), vi)
```
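Putting the three Forward* attributes together, a minimal sketch of a forward pass (assuming a solved model with variables x and y, and DiffOpt.forward from the pre-1.0 API):

```julia
fx = MOI.SingleVariable(x)
fy = MOI.SingleVariable(y)
# Perturb the objective in the direction x + 2y ...
MOI.set(model, DiffOpt.ForwardInObjective(), 1.0 * fx + 2.0 * fy)
# ... differentiate, then read off the induced tangent of each variable.
DiffOpt.forward(model)
dx = MOI.get(model, DiffOpt.ForwardOutVariablePrimal(), x)
dy = MOI.get(model, DiffOpt.ForwardOutVariablePrimal(), y)
```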
## DiffOpt.IndexMappedFunction (Type)

```julia
IndexMappedFunction{F<:MOI.AbstractFunction} <: AbstractLazyScalarFunction
```

Lazily represents the function MOI.Utilities.map_indices(index_map, DiffOpt.standard_form(func)).
## DiffOpt.MOItoJuMP (Type)

```julia
MOItoJuMP{F<:MOI.AbstractScalarFunction} <: JuMP.AbstractJuMPScalar
```

Lazily represents the function JuMP.jump_function(model, DiffOpt.standard_form(func)).
## DiffOpt.MatrixScalarQuadraticFunction (Type)

```julia
struct MatrixScalarQuadraticFunction{T, VT, MT} <: MOI.AbstractScalarFunction
    affine::VectorScalarAffineFunction{T,VT}
    terms::MT
end
```

Represents the function x' * terms * x / 2 + affine as an MOI.AbstractScalarFunction, where x[i] = MOI.VariableIndex(i). Use standard_form to convert it to a MOI.ScalarQuadraticFunction{T}.
## DiffOpt.ProgramClass (Type)

```julia
ProgramClass <: MOI.AbstractOptimizerAttribute
```

Determines which program class to use from ProgramClassCode. The default is AUTOMATIC.

One important advantage of setting the class explicitly is that it allows the necessary bridges to be used. If the class is AUTOMATIC, DiffOpt.Optimizer reports that it supports the objectives and constraints of both the QP and CP classes; for instance, it reports support for both a quadratic objective and conic constraints. However, at the differentiation stage we would then be unable to differentiate, since QP does not support conic constraints and CP does not support a quadratic objective. If instead ProgramClass is set to CONIC, DiffOpt.Optimizer reports that it does not support a quadratic objective, so the objective is bridged to second-order cone constraints and CP can be used to differentiate.
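For instance, to force the conic path described above (a one-line sketch; model is assumed to be a DiffOpt.Optimizer, and the attribute is assumed to take a ProgramClassCode value):

```julia
# Report no quadratic-objective support so that the objective is bridged to
# second-order cone constraints and CP is used for differentiation.
MOI.set(model, DiffOpt.ProgramClass(), DiffOpt.CONIC)
```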
## DiffOpt.ProgramClassCode (Type)

```julia
@enum ProgramClassCode QUADRATIC CONIC AUTOMATIC
```

Program class used by DiffOpt. DiffOpt implements differentiation of two different program classes:

1. Quadratic Program (QP): quadratic objective and affine constraints;
2. Conic Program (CP): linear objective and conic constraints.

The third value, AUTOMATIC, means that the class is selected automatically from the problem data: if any constraint is conic, CP is used; otherwise, QP is used. See ProgramClass.
## DiffOpt.Dπ (Method)

```julia
Dπ(v::Vector{Float64}, model, conic_form::MatOI.GeometricConicForm, index_map::MOIU.IndexMap)
```

Given a model, its conic_form, and the index_map from the indices of model to the indices of conic_form, find the gradient of the projection of the vector v, of length equal to the number of rows in the conic form, onto the Cartesian product of the cones corresponding to these rows. For more information, refer to https://github.com/matbesancon/MathOptSetDistances.jl.
## DiffOpt._backward_conic (Method)

```julia
_backward_conic(model::Optimizer, dx::Vector{Float64}, dy::Vector{Float64}, ds::Vector{Float64})
```

Computes the product of the transpose of the derivative (Jacobian) of the conic program at the parameters A, b, c with the perturbations dx, dy, ds. This is similar to backward.

For theoretical background, refer to Section 3 of "Differentiating Through a Cone Program", https://arxiv.org/abs/1904.09043.
## DiffOpt._backward_quad (Method)

```julia
_backward_quad(model::Optimizer)
```

Differentiates the optimal solution z and returns the product of the Jacobian matrices (dz / dQ, dz / dq, etc.) with the backward pass vector dl / dz.

The method computes the product of:

1. the Jacobian of the problem solution z* with respect to the problem parameters, set with BackwardInVariablePrimal;
2. a backward pass vector dl / dz, where l can be a loss function.

Note that this method does not return the actual Jacobians.
## DiffOpt._forward_conic (Method)

```julia
_forward_conic(model::Optimizer)
```

Computes the product of the derivative (Jacobian) of the conic program at the parameters A, b, c with the perturbations dA, db, dc. This is similar to forward.

For theoretical background, refer to Section 3 of "Differentiating Through a Cone Program", https://arxiv.org/abs/1904.09043.
## DiffOpt.create_LHS_matrix (Function)

```julia
create_LHS_matrix(z, λ, Q, G, h, A=nothing)
```

Inverse matrix specified on the RHS of equation (7) in https://arxiv.org/pdf/1703.00443.pdf.

Helper method used when calling _backward_quad.
## DiffOpt.diff_optimizer (Method)

```julia
diff_optimizer(optimizer_constructor)::Optimizer
```

Creates a DiffOpt.Optimizer, which is an MOI layer with an internal optimizer and other utility methods. Results (primal, dual, and slack values) are obtained by querying the internal optimizer instantiated using optimizer_constructor. These values are required for finding Jacobians with respect to problem data.

One can define a differentiable model using any solver of choice. Example:

```julia
julia> using DiffOpt, GLPK

julia> model = diff_optimizer(GLPK.Optimizer)

julia> _backward_quad(model)  # for convex quadratic models
```
## DiffOpt.get_problem_data (Method)

```julia
get_problem_data(model::MOI.AbstractOptimizer)
```

Returns the problem parameters as matrices, along with other program information such as the number of constraints and variables.
## DiffOpt.map_rows (Method)

```julia
map_rows(f::Function, model, conic_form::MatOI.GeometricConicForm, index_map::MOIU.IndexMap, map_mode::Union{Nested{T}, Flattened{T}})
```

Given a model, its conic_form, the index_map from the indices of model to the indices of conic_form, and a map_mode of type Nested (resp. Flattened), return a Vector{T} of length equal to the number of cones (resp. rows) in the conic form, where the value for each cone (resp. row) is f(ci, r), with ci the corresponding constraint index in model and r a UnitRange of the corresponding rows in the conic form.
## DiffOpt.quad_sym_half (Function)

```julia
quad_sym_half(func, vi1::MOI.VariableIndex, vi2::MOI.VariableIndex)
```

Return Q[i,j] = Q[j,i], where the quadratic terms of func are represented by x' Q x / 2 for a symmetric matrix Q with x[i] = vi1 and x[j] = vi2. Note that while this equals JuMP.coefficient(func, vi1, vi2) when vi1 != vi2, in the case vi1 == vi2 it is instead equal to 2JuMP.coefficient(func, vi1, vi2).
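To make the vi1 == vi2 case concrete, here is a small sketch. The values in the comments follow from the definitions above (for x², the symmetric matrix Q in x' Q x / 2 has Q[i,i] = 2); func is an assumed function whose quadratic part is x².

```julia
# Suppose the quadratic part of func is x^2, i.e. x' Q x / 2 with Q[i,i] = 2.
JuMP.coefficient(func, x, x)       # coefficient of x*x in the expression: 1
DiffOpt.quad_sym_half(func, x, x)  # the matrix entry Q[i,i]: 2
```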
## DiffOpt.standard_form (Function)

```julia
standard_form(func::AbstractLazyScalarFunction)
```

Converts func to a standard MOI scalar function.

```julia
standard_form(func::MOItoJuMP)
```

Converts func to a standard JuMP scalar function.
## DiffOpt.π (Method)

```julia
π(v::Vector{Float64}, model::MOI.ModelLike, conic_form::MatOI.GeometricConicForm, index_map::MOIU.IndexMap)
```

Given a model, its conic_form, and the index_map from the indices of model to the indices of conic_form, find the projection of the vector v, of length equal to the number of rows in the conic form, onto the Cartesian product of the cones corresponding to these rows. For more information, refer to https://github.com/matbesancon/MathOptSetDistances.jl.