# Reference

`DiffOpt.AbstractLazyScalarFunction`

`DiffOpt.AbstractModel`

`DiffOpt.ConicProgram.Model`

`DiffOpt.ForwardConstraintFunction`

`DiffOpt.ForwardObjectiveFunction`

`DiffOpt.ForwardVariablePrimal`

`DiffOpt.IndexMappedFunction`

`DiffOpt.MOItoJuMP`

`DiffOpt.MatrixScalarQuadraticFunction`

`DiffOpt.MatrixVectorAffineFunction`

`DiffOpt.ModelConstructor`

`DiffOpt.ProductOfSets`

`DiffOpt.QuadraticProgram.Model`

`DiffOpt.ReverseConstraintFunction`

`DiffOpt.ReverseObjectiveFunction`

`DiffOpt.ReverseVariablePrimal`

`DiffOpt.VectorScalarAffineFunction`

`DiffOpt.Dπ`

`DiffOpt.QuadraticProgram.create_LHS_matrix`

`DiffOpt.add_all_model_constructors`

`DiffOpt.add_model_constructor`

`DiffOpt.diff_optimizer`

`DiffOpt.forward_differentiate!`

`DiffOpt.map_rows`

`DiffOpt.quad_sym_half`

`DiffOpt.reverse_differentiate!`

`DiffOpt.standard_form`

`DiffOpt.π`

`DiffOpt.AbstractLazyScalarFunction`

— Type `abstract type AbstractLazyScalarFunction <: MOI.AbstractScalarFunction end`

Subtype of `MOI.AbstractScalarFunction` that is not a standard MOI scalar function but can be converted to one using `standard_form`.

The function can also be inspected lazily using `JuMP.coefficient` or `quad_sym_half`.

`DiffOpt.AbstractModel`

— Type `abstract type AbstractModel <: MOI.ModelLike end`

Model supporting `forward_differentiate!` and `reverse_differentiate!`.

`DiffOpt.ForwardConstraintFunction`

— Type `ForwardConstraintFunction <: MOI.AbstractConstraintAttribute`

A `MOI.AbstractConstraintAttribute` to set input data for forward differentiation, that is, problem input data.

For instance, if the scalar constraint of index `ci` contains `θ * (x + 2y) <= 5θ`, then, for the purpose of computing the derivative with respect to `θ`, the following should be set:

`MOI.set(model, DiffOpt.ForwardConstraintFunction(), ci, 1.0 * x + 2.0 * y - 5.0)`

Note that we use `-5` because `ForwardConstraintFunction` sets the tangent of the constraint function, so we consider the expression `θ * (x + 2y - 5)`.

`DiffOpt.ForwardObjectiveFunction`

— Type `ForwardObjectiveFunction <: MOI.AbstractModelAttribute`

A `MOI.AbstractModelAttribute` to set input data for forward differentiation, that is, problem input data. The possible values are any `MOI.AbstractScalarFunction`. A `MOI.ScalarQuadraticFunction` can only be used in linearly constrained quadratic models.

For instance, if the objective contains `θ * (x + 2y)`, then, for the purpose of computing the derivative with respect to `θ`, the following should be set:

`MOI.set(model, DiffOpt.ForwardObjectiveFunction(), 1.0 * x + 2.0 * y)`

where `x` and `y` are the relevant `MOI.VariableIndex`es.

`DiffOpt.ForwardVariablePrimal`

— Type `ForwardVariablePrimal <: MOI.AbstractVariableAttribute`

A `MOI.AbstractVariableAttribute` to get output data from forward differentiation, that is, the problem solution.

For instance, to get the tangent of the variable of index `vi` corresponding to the tangents given to `ForwardObjectiveFunction` and `ForwardConstraintFunction`, do the following:

`MOI.get(model, DiffOpt.ForwardVariablePrimal(), vi)`
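
A minimal sketch of the full forward pipeline, on a hypothetical toy problem (the choice of `HiGHS` as inner solver is an arbitrary assumption; any MOI-compatible solver works):

```julia
import DiffOpt
import HiGHS
import MathOptInterface as MOI

# Toy problem: minimize x subject to x >= θ, with θ = 1, so x* = θ.
model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
MOI.set(model, MOI.Silent(), true)
x = MOI.add_variable(model)
ci = MOI.add_constraint(model, 1.0 * x, MOI.GreaterThan(1.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(model, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(), 1.0 * x)
MOI.optimize!(model)

# The constraint reads x - θ >= 0, so its tangent with respect to θ is the
# constant -1 (compare the `θ * (x + 2y - 5)` note above).
MOI.set(model, DiffOpt.ForwardConstraintFunction(), ci, 0.0 * x - 1.0)
DiffOpt.forward_differentiate!(model)

# Since x*(θ) = θ, the tangent of x should be 1.
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
```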

`DiffOpt.IndexMappedFunction`

— Type `IndexMappedFunction{F<:MOI.AbstractFunction} <: AbstractLazyScalarFunction`

Lazily represents the function `MOI.Utilities.map_indices(index_map, DiffOpt.standard_form(func))`.

`DiffOpt.MOItoJuMP`

— Type `MOItoJuMP{F<:MOI.AbstractScalarFunction} <: JuMP.AbstractJuMPScalar`

Lazily represents the function `JuMP.jump_function(model, DiffOpt.standard_form(func))`.

`DiffOpt.MatrixScalarQuadraticFunction`

— Type

```
struct MatrixScalarQuadraticFunction{T, VT, MT} <: MOI.AbstractScalarFunction
    affine::VectorScalarAffineFunction{T,VT}
    terms::MT
end
```

Represents the function `x' * terms * x / 2 + affine` as an `MOI.AbstractScalarFunction` where `x[i] = MOI.VariableIndex(i)`. Use `standard_form` to convert it to a `MOI.ScalarQuadraticFunction{T}`.

`DiffOpt.MatrixVectorAffineFunction`

— Type `MatrixVectorAffineFunction{T, VT} <: MOI.AbstractVectorFunction`

Represents the function `terms * x + constant` as an `MOI.AbstractVectorFunction` where `x[i] = MOI.VariableIndex(i)`. Use `standard_form` to convert it to a `MOI.VectorAffineFunction{T}`.

`DiffOpt.ModelConstructor`

— Type `ModelConstructor <: MOI.AbstractOptimizerAttribute`

Determines which subtype of `DiffOpt.AbstractModel` to use for differentiation. When set to `nothing`, the first constructor out of `model.model_constructors` that supports the problem is used.

`DiffOpt.ProductOfSets`

— Type `ProductOfSets{T} <: MOI.Utilities.OrderedProductOfSets{T}`

The `MOI.Utilities.@product_of_sets` macro requires the list of sets to be known at compile time. In DiffOpt, however, the list depends on which sets the user chooses, since DiffOpt supports any set that implements the required functions of MathOptSetDistances. With this type, the list of sets can be given at run time.

`DiffOpt.ReverseConstraintFunction`

— Type `ReverseConstraintFunction`

An `MOI.AbstractConstraintAttribute` to get output data from reverse differentiation, that is, problem input data.

For instance, if the following returns `x + 2y + 5`, it means that the tangent has coordinate `1` for the coefficient of `x`, coordinate `2` for the coefficient of `y`, and `5` for the function constant. If the constraint is of the form `func == constant` or `func <= constant`, the tangent for the constant on the right-hand side is `-5`.

`MOI.get(model, DiffOpt.ReverseConstraintFunction(), ci)`

`DiffOpt.ReverseObjectiveFunction`

— Type `ReverseObjectiveFunction <: MOI.AbstractModelAttribute`

A `MOI.AbstractModelAttribute` to get output data from reverse differentiation, that is, problem input data.

For instance, to get the tangent of the objective function corresponding to the tangent given to `ReverseVariablePrimal`, do the following:

`func = MOI.get(model, DiffOpt.ReverseObjectiveFunction())`

Then, to get the sensitivity of the linear term with variable `x`, do

`JuMP.coefficient(func, x)`

To get the sensitivity with respect to the quadratic term with variables `x` and `y`, do either

`JuMP.coefficient(func, x, y)`

or

`DiffOpt.quad_sym_half(func, x, y)`

These two lines are **not** equivalent in case `x == y`; see `quad_sym_half` for details on the difference between these two functions.

`DiffOpt.ReverseVariablePrimal`

— Type `ReverseVariablePrimal <: MOI.AbstractVariableAttribute`

A `MOI.AbstractVariableAttribute` to set input data for reverse differentiation, that is, the problem solution.

For instance, to set the tangent of the variable of index `vi` to `value`, do the following:

`MOI.set(model, DiffOpt.ReverseVariablePrimal(), vi, value)`
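
A minimal sketch of the reverse pipeline on a hypothetical toy problem (the choice of `HiGHS` as inner solver and the specific numbers are assumptions of this sketch, not part of the API):

```julia
import DiffOpt
import HiGHS
import MathOptInterface as MOI

# Toy problem: minimize x subject to a*x >= b with (a, b) = (1, 1), so x* = b/a = 1.
model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
MOI.set(model, MOI.Silent(), true)
x = MOI.add_variable(model)
ci = MOI.add_constraint(model, 1.0 * x, MOI.GreaterThan(1.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(model, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(), 1.0 * x)
MOI.optimize!(model)

# Seed the reverse pass with dl/dx = 1.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
DiffOpt.reverse_differentiate!(model)

# Query the tangent of the constraint function; since x*(a, b) = b/a,
# the sensitivity of the coefficient a should be dl/da = -b/a^2 = -1.
grad = MOI.get(model, DiffOpt.ReverseConstraintFunction(), ci)
f = DiffOpt.standard_form(grad)  # a MOI.ScalarAffineFunction{Float64}
```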

`DiffOpt.VectorScalarAffineFunction`

— Type `VectorScalarAffineFunction{T, VT} <: MOI.AbstractScalarFunction`

Represents the function `x ⋅ terms + constant` as an `MOI.AbstractScalarFunction` where `x[i] = MOI.VariableIndex(i)`. Use `standard_form` to convert it to a `MOI.ScalarAffineFunction{T}`.

`DiffOpt.Dπ`

— Method `Dπ(v::Vector{Float64}, model, cones::ProductOfSets)`

Given a `model` and its `cones`, compute the derivative of the projection of the vector `v` (of length equal to the number of rows in the conic form) onto the Cartesian product of the cones corresponding to these rows. For more information, refer to https://github.com/matbesancon/MathOptSetDistances.jl.

`DiffOpt.add_all_model_constructors`

— Method `add_all_model_constructors(model)`

Add all constructors of `AbstractModel` defined in this package to `model` with `add_model_constructor`.

`DiffOpt.add_model_constructor`

— Method `add_model_constructor(optimizer::Optimizer, model_constructor)`

Add the constructor of `AbstractModel` for `optimizer` to choose from when trying to differentiate.

`DiffOpt.diff_optimizer`

— Method `diff_optimizer(optimizer_constructor)::Optimizer`

Creates a `DiffOpt.Optimizer`, which is an MOI layer with an internal optimizer and other utility methods. Results (primal, dual, and slack values) are obtained by querying the internal optimizer instantiated with the `optimizer_constructor`. These values are required for finding the Jacobians with respect to problem data.

One can define a differentiable model using any solver of choice. Example:

```
julia> import DiffOpt, HiGHS

julia> import MathOptInterface as MOI

julia> model = DiffOpt.diff_optimizer(HiGHS.Optimizer)

julia> x = MOI.add_variable(model)

julia> MOI.add_constraint(model, ...)
```

`DiffOpt.forward_differentiate!`

— Function `forward_differentiate!(model::Optimizer)`

Wrapper method for the forward pass. This method takes as input a currently solved problem and differentials with respect to problem data set with the `ForwardObjectiveFunction` and `ForwardConstraintFunction` attributes. The output solution differentials can be queried with the attribute `ForwardVariablePrimal`.

`DiffOpt.map_rows`

— Method `map_rows(f::Function, model, cones::ProductOfSets, map_mode::Union{Nested{T}, Flattened{T}})`

Given a `model`, its `cones`, and a `map_mode` of type `Nested` (resp. `Flattened`), return a `Vector{T}` of length equal to the number of cones (resp. rows) in the conic form, where the value at the index (resp. rows) corresponding to each cone is `f(ci, r)`, where `ci` is the corresponding constraint index in `model` and `r` is a `UnitRange` of the corresponding rows in the conic form.

`DiffOpt.quad_sym_half`

— Function `quad_sym_half(func, vi1::MOI.VariableIndex, vi2::MOI.VariableIndex)`

Return `Q[i,j] = Q[j,i]`, where the quadratic terms of `func` are represented by `x' Q x / 2` for a symmetric matrix `Q` with `x[i] = vi1` and `x[j] = vi2`. Note that while this is equal to `JuMP.coefficient(func, vi1, vi2)` if `vi1 != vi2`, in the case `vi1 == vi2` it is instead equal to `2JuMP.coefficient(func, vi1, vi2)`.
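
The factor-of-two convention can be checked numerically in plain Julia, with no DiffOpt involved (`Q` below is an arbitrary illustrative matrix):

```julia
# In the x' Q x / 2 convention with symmetric Q, the polynomial is
#   f(x) = Q[1,1]/2 * x1^2 + Q[1,2] * x1*x2 + Q[2,2]/2 * x2^2,
# so the monomial coefficient of x1^2 (what JuMP.coefficient reports)
# is Q[1,1]/2, while quad_sym_half reports Q[1,1] itself.
Q = [2.0 1.0; 1.0 4.0]
f(x) = x' * Q * x / 2

# Recover the monomial coefficients by evaluation:
coeff_x1_sq = f([1.0, 0.0])                                   # coefficient of x1^2
coeff_x1_x2 = f([1.0, 1.0]) - f([1.0, 0.0]) - f([0.0, 1.0])   # coefficient of x1*x2
# coeff_x1_sq == 1.0 == Q[1,1]/2, so the quad_sym_half-style value Q[1,1]
# is twice the monomial coefficient; off-diagonal, coeff_x1_x2 == Q[1,2].
```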

`DiffOpt.reverse_differentiate!`

— Function `reverse_differentiate!(model::MOI.ModelLike)`

Wrapper method for the backward pass / reverse differentiation. This method takes as input a currently solved problem and differentials with respect to the solution set with the `ReverseVariablePrimal` attribute. The output problem-data differentials can be queried with the attributes `ReverseObjectiveFunction` and `ReverseConstraintFunction`.

`DiffOpt.standard_form`

— Function `standard_form(func::AbstractLazyScalarFunction)`

Converts `func` to a standard MOI scalar function.

`standard_form(func::MOItoJuMP)`

Converts `func` to a standard JuMP scalar function.

`DiffOpt.π`

— Method `π(v::Vector{Float64}, model::MOI.ModelLike, cones::ProductOfSets)`

Given a `model` and its `cones`, find the projection of the vector `v` (of length equal to the number of rows in the conic form) onto the Cartesian product of the cones corresponding to these rows. For more information, refer to https://github.com/matbesancon/MathOptSetDistances.jl.

`DiffOpt.QuadraticProgram.Model`

— Type `DiffOpt.QuadraticProgram.Model <: DiffOpt.AbstractModel`

Model to differentiate quadratic programs.

For reverse differentiation, it differentiates the optimal solution `z` and returns the product of the Jacobian matrices (`dz / dQ`, `dz / dq`, etc.) with the backward-pass vector `dl / dz`.

The method computes the product of

- the Jacobian of the problem solution `z*` with respect to the problem parameters, set with `DiffOpt.ReverseVariablePrimal`, and
- a backward-pass vector `dl / dz`, where `l` can be a loss function.

Note that this method *does not return* the actual Jacobians.

For more information, refer to equations (7) and (8) of https://arxiv.org/pdf/1703.00443.pdf.

`DiffOpt.QuadraticProgram.create_LHS_matrix`

— Function `create_LHS_matrix(z, λ, Q, G, h, A=nothing)`

Builds the matrix whose inverse appears on the right-hand side of equation (7) in https://arxiv.org/pdf/1703.00443.pdf.

Helper method used when calling `reverse_differentiate!`.
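
For orientation, in the notation of that paper (its equation (6); the exact row scaling used internally by `create_LHS_matrix` may differ), the KKT-based linear system being inverted has the form

```latex
\begin{bmatrix}
  Q & G^\top & A^\top \\
  \operatorname{diag}(\lambda^*)\, G & \operatorname{diag}(G z^* - h) & 0 \\
  A & 0 & 0
\end{bmatrix}
\begin{bmatrix} \mathrm{d}z \\ \mathrm{d}\lambda \\ \mathrm{d}\nu \end{bmatrix}
= \text{(right-hand side of equation (7))}
```

where `z*`, `λ*`, `ν*` are the optimal primal and dual solutions, `G`, `h` define the inequality constraints, and `A` the (optional) equality constraints.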

`DiffOpt.ConicProgram.Model`

— Type `DiffOpt.ConicProgram.Model <: DiffOpt.AbstractModel`

Model to differentiate conic programs.

The forward differentiation computes the product of the derivative (Jacobian) at the conic program parameters `A`, `b`, `c` with the perturbations `dA`, `db`, `dc`.

The reverse differentiation computes the product of the transpose of the derivative (Jacobian) at the conic program parameters `A`, `b`, `c` with the perturbations `dx`, `dy`, `ds`.

For theoretical background, refer to Section 3 of Differentiating Through a Cone Program, https://arxiv.org/abs/1904.09043.