# DiffOpt.jl
DiffOpt.jl is a package for differentiating convex optimization programs with respect to the program parameters. DiffOpt currently supports linear, quadratic, and conic programs.
## License

DiffOpt.jl is licensed under the MIT License.
## Installation

Install DiffOpt using `Pkg.add`:

```julia
import Pkg
Pkg.add("DiffOpt")
```
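Alternatively, install it from the package manager REPL mode (type `]` at the `julia>` prompt):

```julia
pkg> add DiffOpt
```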
## Documentation
The documentation for DiffOpt.jl includes a detailed description of the theory behind the package, along with examples, tutorials, and an API reference.
## Use with JuMP

### DiffOpt-JuMP API with Parameters

```julia
using JuMP, DiffOpt, HiGHS
model = Model(
() -> DiffOpt.diff_optimizer(
HiGHS.Optimizer;
with_parametric_opt_interface = true,
),
)
set_silent(model)
p_val = 4.0
pc_val = 2.0
@variable(model, x)
@variable(model, p in Parameter(p_val))
@variable(model, pc in Parameter(pc_val))
@constraint(model, cons, pc * x >= 3 * p)
@objective(model, Min, 2x)
optimize!(model)
@show value(x) ≈ 3 * p_val / pc_val
# the function is
# x(p, pc) = 3p / pc
# hence,
# dx/dp = 3 / pc
# dx/dpc = -3p / pc^2
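# at the current values (p = 4, pc = 2):
# dx/dp = 3 / 2 = 1.5
# dx/dpc = -3 * 4 / 2^2 = -3.0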
# First, try forward mode AD
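# Forward mode computes a Jacobian-vector product: we choose a direction in
# parameter space and read back the corresponding directional derivative of
# the solution from DiffOpt.ForwardVariablePrimal.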
# differentiate w.r.t. p
direction_p = 3.0
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(direction_p))
DiffOpt.forward_differentiate!(model)
@show MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) ≈ direction_p * 3 / pc_val
# update p and pc
p_val = 2.0
pc_val = 6.0
set_parameter_value(p, p_val)
set_parameter_value(pc, pc_val)
# re-optimize
optimize!(model)
# check solution
@show value(x) ≈ 3 * p_val / pc_val
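# at the new values (p = 2, pc = 6):
# dx/dp = 3 / 6 = 0.5
# dx/dpc = -3 * 2 / 6^2 = -1 / 6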
# stop differentiating with respect to p
DiffOpt.empty_input_sensitivities!(model)
# differentiate w.r.t. pc
direction_pc = 10.0
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(pc), Parameter(direction_pc))
DiffOpt.forward_differentiate!(model)
@show abs(MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) -
    (-direction_pc * 3 * p_val / pc_val^2)) < 1e-5
# always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)
# Now, reverse mode AD
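# Reverse mode computes a vector-Jacobian product: we choose a direction in
# solution space and read back the sensitivities with respect to all
# parameters in a single pass.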
direction_x = 10.0
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, direction_x)
DiffOpt.reverse_differentiate!(model)
@show MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)).value ≈ direction_x * 3 / pc_val
@show abs(MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(pc)).value -
    (-direction_x * 3 * p_val / pc_val^2)) < 1e-5
```
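Note that forward mode required one differentiation pass per parameter (first `p`, then `pc`), whereas reverse mode recovered the sensitivities of both parameters in a single pass. In general, forward mode is preferable when differentiating with respect to few parameters, and reverse mode when there are many parameters but few quantities of interest.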
### Low-level DiffOpt-JuMP API

A brief example:

```julia
using JuMP, DiffOpt, HiGHS
# Create a model using the wrapper
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
# Define your model and solve it
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)
# Choose the problem parameters to differentiate with respect to, and set their
# perturbations.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
# Differentiate the model
DiffOpt.reverse_differentiate!(model)
# fetch the gradients
grad_exp = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons)  # -3 x - 1
constant(grad_exp)        # -1
coefficient(grad_exp, x)  # -3
```
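The low-level API also supports forward mode via the `DiffOpt.ForwardConstraintFunction` attribute, which takes a perturbation of the constraint function. The snippet below is a minimal sketch, assuming the perturbation can be passed as a JuMP affine expression (mirroring how `ReverseConstraintFunction` is read back as one above); the expected value follows from the reverse pass, which reported sensitivities of -3 with respect to the coefficient of `x` and -1 with respect to the constant term:

```julia
# clear previously set sensitivities before switching modes
DiffOpt.empty_input_sensitivities!(model)
# perturb the coefficient of x and the constant term of `cons` by 1.0 each
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 1.0 * x + 1.0)
DiffOpt.forward_differentiate!(model)
# directional derivative: (-3) * 1.0 + (-1) * 1.0 = -4.0
MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)  # expected ≈ -4.0
```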
## Citing DiffOpt.jl
If you find DiffOpt.jl useful in your work, we kindly request that you cite the following paper:
```bibtex
@article{besancon2023diffopt,
    title={Flexible Differentiable Optimization via Model Transformations},
    author={Besançon, Mathieu and Dias Garcia, Joaquim and Legat, Beno{\^\i}t and Sharma, Akshay},
    journal={INFORMS Journal on Computing},
    year={2023},
    volume={36},
    number={2},
    pages={456--478},
    doi={10.1287/ijoc.2022.0283},
    publisher={INFORMS}
}
```
A preprint of this paper is freely available.
## GSOC2020

DiffOpt began as a NumFOCUS-sponsored Google Summer of Code (2020) project.