# Usage

Create a differentiable model from existing optimizers:

```julia
using JuMP
import DiffOpt
import SCS

model = DiffOpt.diff_optimizer(SCS.Optimizer)
```

Update and solve the model:

```julia
x = MOI.add_variables(model, 2)

# an illustrative constraint; its index `c` is used in the reverse pass below
c = MOI.add_constraint(
    model,
    MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(1.0, x), 0.0),
    MOI.LessThan(1.0),
)

MOI.optimize!(model)
```

Finally, differentiate the model (the primal and dual solutions, specifically) to obtain the product of the Jacobians with respect to the problem parameters and a backward pass vector. DiffOpt currently supports two backends for differentiating a model:

1. To differentiate a convex quadratic program

```math
\begin{align*}
& \min_{x \in \mathbb{R}^n} & \frac{1}{2} x^T Q x + q^T x & \\
& \text{s.t.} & A x = b \qquad & b \in \mathbb{R}^m \\
& & G x \leq h \qquad & h \in \mathbb{R}^p
\end{align*}
```

we can use the `reverse_differentiate!` method:

```julia
# seed the reverse pass with a backward vector dL/dx, then pull the
# gradients back onto the objective and constraint data
MOI.set.(model, DiffOpt.ReverseVariablePrimal(), x, ones(2))
DiffOpt.reverse_differentiate!(model)
grad_obj = MOI.get(model, DiffOpt.ReverseObjectiveFunction())
grad_con = MOI.get.(model, DiffOpt.ReverseConstraintFunction(), c)
```
2. To differentiate a convex conic program

```math
\begin{align*}
& \min_{x \in \mathbb{R}^n} & c^T x \\
& \text{s.t.} & A x + s = b \\
& & b \in \mathbb{R}^m \\
& & s \in \mathcal{K}
\end{align*}
```
we can use the `forward_differentiate!` method with perturbations in the problem data `A`, `b`, and `c`:
```julia
import LinearAlgebra: ⋅

# set a forward perturbation of the objective, then run the forward pass
MOI.set(model, DiffOpt.ForwardObjectiveFunction(), ones(2) ⋅ x)
DiffOpt.forward_differentiate!(model)

grad_x = MOI.get.(model, DiffOpt.ForwardVariablePrimal(), x)
```
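The pieces above can be combined into a single runnable script. A minimal sketch using DiffOpt's JuMP integration follows; the quadratic objective and the constraint are illustrative choices, not taken from the snippets above:

```julia
using JuMP
import DiffOpt
import SCS

# wrap the differentiable optimizer in a JuMP model
model = Model(() -> DiffOpt.diff_optimizer(SCS.Optimizer))
set_silent(model)

@variable(model, x[1:2])
@constraint(model, c, x[1] + x[2] <= 1.0)        # illustrative constraint
@objective(model, Min, (x[1] - 1.0)^2 + x[2]^2)  # illustrative QP objective

optimize!(model)

# reverse mode: seed dL/dx with ones, pull gradients back onto `c`'s data
MOI.set.(model, DiffOpt.ReverseVariablePrimal(), x, ones(2))
DiffOpt.reverse_differentiate!(model)
grad_con = MOI.get(model, DiffOpt.ReverseConstraintFunction(), c)
```

Using JuMP rather than raw MOI calls changes nothing about the differentiation itself; the `DiffOpt` attributes are set and queried through the same `MOI.set`/`MOI.get` interface.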