Reference
DiffOpt.AbstractLazyScalarFunction — Type

    abstract type AbstractLazyScalarFunction <: MOI.AbstractScalarFunction end

Subtype of MOI.AbstractScalarFunction that is not a standard MOI scalar function but can be converted to one using standard_form.
The function can also be inspected lazily using JuMP.coefficient or quad_sym_half.
DiffOpt.BackwardInVariablePrimal — Type

    BackwardInVariablePrimal <: MOI.AbstractVariableAttribute

A MOI.AbstractVariableAttribute to set input data for backward differentiation, that is, the problem solution.
For instance, to set the tangent of the variable of index vi, do the following:
    MOI.set(model, DiffOpt.BackwardInVariablePrimal(), vi, value)

DiffOpt.BackwardOutConstraint — Type

    BackwardOutConstraint

An MOI.AbstractConstraintAttribute to get output data from backward differentiation, that is, problem input data.
For instance, if the following returns x + 2y + 5, it means that the tangent has coordinate 1 for the coefficient of x, coordinate 2 for the coefficient of y and 5 for the function constant. If the constraint is of the form func == constant or func <= constant, the tangent for the constant on the right-hand side is -5.
    MOI.get(model, DiffOpt.BackwardOutConstraint(), ci)

DiffOpt.BackwardOutObjective — Type

    BackwardOutObjective <: MOI.AbstractModelAttribute

A MOI.AbstractModelAttribute to get output data from backward differentiation, that is, problem input data.
For instance, to get the tangent of the objective function corresponding to the tangent given to BackwardInVariablePrimal, do the following:
    func = MOI.get(model, DiffOpt.BackwardOutObjective())

Then, to get the sensitivity of the linear term with variable x, do
    JuMP.coefficient(func, x)

To get the sensitivity with respect to the quadratic term with variables x and y, do either
    JuMP.coefficient(func, x, y)

or
    DiffOpt.quad_sym_half(func, x, y)

These two lines are not equivalent when x == y; see quad_sym_half for details on the difference between these two functions.
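To make the diagonal convention concrete, here is a plain-Julia sketch with no DiffOpt calls; the quadratic f and the matrix Q below are illustrative assumptions, not API objects:

```julia
# For f(x, y) = 3x^2 + 2xy, write f = z' * Q * z / 2 with z = [x, y]
# and a symmetric matrix Q:
Q = [6.0 2.0; 2.0 0.0]

# quad_sym_half(f, x, x) corresponds to Q[1, 1] = 6.0, whereas
# JuMP.coefficient(f, x, x) returns the coefficient of x^2, i.e. 3.0,
# so on the diagonal quad_sym_half is twice JuMP.coefficient:
quad_sym_half_xx = Q[1, 1]
coefficient_xx = Q[1, 1] / 2

# Off the diagonal the two agree: both give the coefficient of x*y.
quad_sym_half_xy = Q[1, 2]
coefficient_xy = Q[1, 2]
```

This is why the docstrings above distinguish the two accessors only in the case x == y.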
DiffOpt.ForwardInConstraint — Type

    ForwardInConstraint <: MOI.AbstractConstraintAttribute

A MOI.AbstractConstraintAttribute to set input data for forward differentiation, that is, problem input data.
For instance, if the scalar constraint of index ci contains θ * (x + 2y) <= 5θ, for the purpose of computing the derivative with respect to θ, the following should be set:
    MOI.set(model, DiffOpt.ForwardInConstraint(), ci, 1.0 * x + 2.0 * y - 5.0)

Note that we use -5 because ForwardInConstraint sets the tangent of the ConstraintFunction, so we consider the expression θ * (x + 2y - 5).
DiffOpt.ForwardInObjective — Type

    ForwardInObjective <: MOI.AbstractModelAttribute

A MOI.AbstractModelAttribute to set input data for forward differentiation, that is, problem input data. The possible values are any MOI.AbstractScalarFunction. A MOI.ScalarQuadraticFunction can only be used in linearly constrained quadratic models.
For instance, if the objective contains θ * (x + 2y), for the purpose of computing the derivative with respect to θ, the following should be set:
    MOI.set(model, DiffOpt.ForwardInObjective(), 1.0 * x + 2.0 * y)

where x and y are the relevant MOI.VariableIndex.
DiffOpt.ForwardOutVariablePrimal — Type

    ForwardOutVariablePrimal <: MOI.AbstractVariableAttribute

A MOI.AbstractVariableAttribute to get output data from forward differentiation, that is, the problem solution.
For instance, to get the tangent of the variable of index vi corresponding to the tangents given to ForwardInObjective and ForwardInConstraint, do the following:
    MOI.get(model, DiffOpt.ForwardOutVariablePrimal(), vi)

DiffOpt.IndexMappedFunction — Type

    IndexMappedFunction{F<:MOI.AbstractFunction} <: AbstractLazyScalarFunction

Lazily represents the function MOI.Utilities.map_indices(index_map, DiffOpt.standard_form(func)).
DiffOpt.MOItoJuMP — Type

    MOItoJuMP{F<:MOI.AbstractScalarFunction} <: JuMP.AbstractJuMPScalar

Lazily represents the function JuMP.jump_function(model, DiffOpt.standard_form(func)).
DiffOpt.MatrixScalarQuadraticFunction — Type

    struct MatrixScalarQuadraticFunction{T, VT, MT} <: MOI.AbstractScalarFunction
        affine::VectorScalarAffineFunction{T,VT}
        terms::MT
    end

Represents the function x' * terms * x / 2 + affine as an MOI.AbstractScalarFunction where x[i] = MOI.VariableIndex(i). Use standard_form to convert it to a MOI.ScalarQuadraticFunction{T}.
DiffOpt.MatrixVectorAffineFunction — Type

    MatrixVectorAffineFunction{T, VT} <: MOI.AbstractVectorFunction

Represents the function terms * x + constant as an MOI.AbstractVectorFunction where x[i] = MOI.VariableIndex(i). Use standard_form to convert it to a MOI.VectorAffineFunction{T}.
DiffOpt.ProductOfSets — Type

    ProductOfSets{T} <: MOI.Utilities.OrderedProductOfSets{T}

The MOI.Utilities.@product_of_sets macro requires the list of sets to be known at compile time. In DiffOpt, however, the list depends on which sets the user is going to use, since DiffOpt supports any set as long as it implements the required functions of MathOptSetDistances. For this type, the list of sets can be given at run time.
DiffOpt.ProgramClass — Type

    ProgramClass <: MOI.AbstractOptimizerAttribute

Determines which program class to use from ProgramClassCode. The default is AUTOMATIC.
One important advantage of setting the class explicitly is that it allows the necessary bridges to be used. If the class is AUTOMATIC, then DiffOpt.Optimizer reports that it supports both objectives and constraints of the QP and CP classes. For instance, it reports that it supports both a quadratic objective and conic constraints. However, at the differentiation stage, we won't be able to differentiate, since QP does not support conic constraints and CP does not support a quadratic objective. On the other hand, if ProgramClass is set to CONIC, then DiffOpt.Optimizer reports that it does not support a quadratic objective, hence the objective will be bridged to second-order cone constraints and we will be able to use CP to differentiate.
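A minimal sketch of setting the class explicitly, using the attribute and enum names documented on this page; the choice of SCS as the conic solver is an assumption:

```julia
import MathOptInterface as MOI
import DiffOpt, SCS  # SCS is an assumed conic solver choice

model = DiffOpt.diff_optimizer(SCS.Optimizer)
# Force the conic (CP) differentiation path: a quadratic objective will
# then be bridged to second-order cone constraints instead of blocking
# differentiation at the AUTOMATIC class-selection stage.
MOI.set(model, DiffOpt.ProgramClass(), DiffOpt.CONIC)
```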
DiffOpt.ProgramClassCode — Type

    @enum ProgramClassCode QUADRATIC CONIC AUTOMATIC

Program class used by DiffOpt. DiffOpt implements differentiation of two different program classes:
- Quadratic Program (QP): quadratic objective and linear constraints and
- Conic Program (CP): linear objective and conic constraints.
AUTOMATIC means that the class is selected automatically given the problem data: if any constraint is conic, CP is used; otherwise, QP is used. See ProgramClass.
DiffOpt.ProgramClassUsed — Type

    ProgramClassUsed <: MOI.AbstractOptimizerAttribute

Program class actually used; same as ProgramClass, except that it does not return AUTOMATIC but the class automatically chosen instead. This attribute is read-only: it cannot be set; set ProgramClass instead.
DiffOpt.VectorScalarAffineFunction — Type

    VectorScalarAffineFunction{T, VT} <: MOI.AbstractScalarFunction

Represents the function x ⋅ terms + constant as an MOI.AbstractScalarFunction where x[i] = MOI.VariableIndex(i). Use standard_form to convert it to a MOI.ScalarAffineFunction{T}.
DiffOpt.Dπ — Method

    Dπ(v::Vector{Float64}, model, cones::ProductOfSets)

Given a model and its cones, compute the gradient of the projection of the vector v, of length equal to the number of rows in the conic form, onto the Cartesian product of the cones corresponding to these rows. For more info, refer to https://github.com/matbesancon/MathOptSetDistances.jl
DiffOpt.backward — Method

    backward(model::ConicDiff)

Method to compute the product of the transpose of the derivative (Jacobian) at the conic program parameters A, b, c with the perturbations dx, dy, ds. This is similar to backward.

For theoretical background, refer to Section 3 of Differentiating Through a Cone Program, https://arxiv.org/abs/1904.09043
DiffOpt.backward — Method

    backward(model::Optimizer)

Wrapper method for the backward pass. This method will consider as input a currently solved problem and differentials with respect to the solution, set with the BackwardInVariablePrimal attribute. The output problem data differentials can be queried with the attributes BackwardOutObjective and BackwardOutConstraint.
DiffOpt.backward — Method

    backward(model::QPDiff)

Method to differentiate the optimal solution z and return the product of the jacobian matrices (dz / dQ, dz / dq, etc.) with the backward pass vector dl / dz.

The method computes the product of
- the jacobian of the problem solution z* with respect to the problem parameters, set with the BackwardInVariablePrimal, and
- a backward pass vector dl / dz, where l can be a loss function.

Note that this method does not return the actual jacobians.

For more info, refer to eqn (7) and eqn (8) of https://arxiv.org/pdf/1703.00443.pdf
DiffOpt.create_LHS_matrix — Function

    create_LHS_matrix(z, λ, Q, G, h, A=nothing)

Inverse matrix specified on the RHS of eqn (7) in https://arxiv.org/pdf/1703.00443.pdf.

Helper method used when calling _backward_quad.
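For intuition, here is a hedged sketch of the KKT-based block structure behind eqn (7), obtained by differentiating the KKT conditions of min ½ z'Qz + q'z s.t. Gz ≤ h, Az = b. The function name kkt_lhs is hypothetical, and the exact block layout may differ from what create_LHS_matrix returns:

```julia
using LinearAlgebra

# Differentiating stationarity, complementary slackness, and primal
# feasibility at a solution (z, λ, ν) yields a linear system whose
# left-hand side has this block structure:
function kkt_lhs(z, λ, Q, G, h, A)
    m, p = length(λ), size(A, 1)
    return [
        Q                G'                    A'
        Diagonal(λ) * G  Diagonal(G * z .- h)  zeros(m, p)
        A                zeros(p, m)           zeros(p, p)
    ]
end
```

With n variables, m inequality rows, and p equality rows, the result is a square (n + m + p) matrix; solving against the right-hand-side differentials of the parameters gives (dz, dλ, dν).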
DiffOpt.diff_optimizer — Method

    diff_optimizer(optimizer_constructor)::Optimizer

Creates a DiffOpt.Optimizer, which is an MOI layer with an internal optimizer and other utility methods. Results (primal, dual and slack values) are obtained by querying the internal optimizer instantiated using the optimizer_constructor. These values are required to find jacobians with respect to problem data.

One can define a differentiable model using any solver of choice. Example:

    julia> import DiffOpt, HiGHS

    julia> model = DiffOpt.diff_optimizer(HiGHS.Optimizer)

    julia> x = MOI.add_variable(model)

    julia> MOI.add_constraint(model, ...)

    julia> _backward_quad(model)  # for convex quadratic models

    julia> _backward_conic(model) # for convex conic models

DiffOpt.forward — Method

    forward(model::ConicDiff)

Method to compute the product of the derivative (Jacobian) at the conic program parameters A, b, c with the perturbations dA, db, dc. This is similar to forward.
For theoretical background, refer to Section 3 of Differentiating Through a Cone Program, https://arxiv.org/abs/1904.09043
DiffOpt.forward — Method

    forward(model::Optimizer)

Wrapper method for the forward pass. This method will consider as input a currently solved problem and differentials with respect to problem data, set with the ForwardInObjective and ForwardInConstraint attributes. The output solution differentials can be queried with the attribute ForwardOutVariablePrimal.
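Putting the attributes together, here is a minimal end-to-end forward-pass sketch for the θ * (x + 2y) <= 5θ example from ForwardInConstraint, assuming HiGHS is installed; the bounds and objective are illustrative choices made so that the problem is bounded:

```julia
import MathOptInterface as MOI
import DiffOpt, HiGHS

model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
x = MOI.add_variable(model)
y = MOI.add_variable(model)
MOI.add_constraint(model, x, MOI.GreaterThan(0.0))
MOI.add_constraint(model, y, MOI.GreaterThan(0.0))
# At θ = 1 the constraint reads x + 2y <= 5:
ci = MOI.add_constraint(model, 1.0 * x + 2.0 * y, MOI.LessThan(5.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MAX_SENSE)
MOI.set(model, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(),
        1.0 * x + 1.0 * y)
MOI.optimize!(model)

# Tangent of the constraint function with respect to θ, i.e. x + 2y - 5:
MOI.set(model, DiffOpt.ForwardInConstraint(), ci, 1.0 * x + 2.0 * y - 5.0)
DiffOpt.forward(model)
dx = MOI.get(model, DiffOpt.ForwardOutVariablePrimal(), x)
```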
DiffOpt.forward — Method

    forward(model::QPDiff)

DiffOpt.map_rows — Method

    map_rows(f::Function, model, cones::ProductOfSets, map_mode::Union{Nested{T}, Flattened{T}})

Given a model, its cones, and a map_mode of type Nested (resp. Flattened), return a Vector{T} of length equal to the number of cones (resp. rows) in the conic form, where the value for the index (resp. indices) corresponding to each cone is f(ci, r), where ci is the corresponding constraint index in model and r is a UnitRange of the corresponding rows in the conic form.
DiffOpt.quad_sym_half — Function

    quad_sym_half(func, vi1::MOI.VariableIndex, vi2::MOI.VariableIndex)

Return Q[i,j] = Q[j,i], where the quadratic terms of func are represented by x' Q x / 2 for a symmetric matrix Q, where x[i] = vi1 and x[j] = vi2. Note that while this is equal to JuMP.coefficient(func, vi1, vi2) if vi1 != vi2, in the case vi1 == vi2 it is instead equal to 2JuMP.coefficient(func, vi1, vi2).
DiffOpt.standard_form — Function

    standard_form(func::AbstractLazyScalarFunction)

Converts func to a standard MOI scalar function.
    standard_form(func::MOItoJuMP)

Converts func to a standard JuMP scalar function.
DiffOpt.π — Method

    π(v::Vector{Float64}, model::MOI.ModelLike, cones::ProductOfSets)

Given a model and its cones, compute the projection of the vector v, of length equal to the number of rows in the conic form, onto the Cartesian product of the cones corresponding to these rows. For more info, refer to https://github.com/matbesancon/MathOptSetDistances.jl
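As a concrete special case, for the nonnegative orthant the projection and its derivative are simple componentwise operations. This plain-Julia sketch (no DiffOpt or MathOptSetDistances calls; proj and dproj are illustrative names) shows what π and Dπ compute for that cone:

```julia
# π onto R₊ⁿ: componentwise clamp at zero.
proj(v) = max.(v, 0.0)

# Dπ is diagonal with entry 1 where v[i] > 0 and 0 where v[i] < 0
# (at v[i] == 0 the projection is not differentiable; any subgradient
# in [0, 1] can be used). This returns the diagonal as a vector.
dproj(v) = [Float64(vi > 0) for vi in v]

v = [1.5, -2.0, 0.25]
proj(v)   # [1.5, 0.0, 0.25]
dproj(v)  # [1.0, 0.0, 1.0]
```

DiffOpt delegates the general case, with arbitrary products of cones, to MathOptSetDistances.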