# Percival.jl - An augmented Lagrangian solver
Percival is an implementation of the augmented Lagrangian solver described in

> S. Arreckx, A. Lambe, J. R. R. A. Martins, & D. Orban (2016).
> A Matrix-Free Augmented Lagrangian Algorithm with Application to Large-Scale Structural Design Optimization.
> Optimization and Engineering, 17, 359–384. doi:10.1007/s11081-015-9287-9

with the internal subproblem solver `tron` from JSOSolvers.jl. To use Percival, you have to pass it an NLPModel.
## How to Cite

If you use Percival.jl in your work, please cite it using the format given in CITATION.bib.
## Install

Use `]` to enter `pkg>` mode in the Julia REPL, then

```shell
pkg> add Percival
```
## Examples

Consider the following 2-dimensional optimization problem with an equality constraint:
\[\begin{equation} \min_{(x_1,x_2)} \quad (x_1 - 1)^2 + 100 (x_2 - x_1^2)^2 \quad \text{s.to} \quad x_1^2 + x_2^2 = 1. \end{equation}\]
You can solve a JuMP model by using NLPModelsJuMP.jl to convert it.
```julia
using JuMP, NLPModelsJuMP, Percival

model = Model(NLPModelsJuMP.Optimizer)
set_attribute(model, "solver", Percival.PercivalSolver)
@variable(model, x[i = 1:2], start = [-1.2; 1.0][i])
@objective(model, Min, (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2)
@constraint(model, x[1]^2 + x[2]^2 == 1)
optimize!(model)
solution_summary(model)
```
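Beyond `solution_summary`, the standard JuMP accessors can be used to query the solved model. A minimal sketch, assuming `model` and `x` are the objects built in the example above:

```julia
using JuMP

# Termination status reported by the solver (LOCALLY_SOLVED on success).
println(termination_status(model))

# Approximate minimizer; it should lie on the unit circle x₁² + x₂² = 1.
xsol = value.(x)
println(xsol)

# Objective value at the returned point.
println(objective_value(model))
```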
`percival` accepts as input any instance of `AbstractNLPModel`; for instance, you can use automatic differentiation via ADNLPModels.jl to solve the same problem.
```julia
using ADNLPModels, Percival

nlp = ADNLPModel(
  x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2,  # objective
  [-1.2; 1.0],                                  # initial guess
  x -> [x[1]^2 + x[2]^2],                       # constraint function
  [1.0],                                        # constraint lower bound
  [1.0],                                        # constraint upper bound (equal bounds ⇒ equality)
)
output = percival(nlp, verbose = 1)
```
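The value returned by `percival` is a `GenericExecutionStats` from SolverCore.jl. A short sketch of inspecting it, assuming `output` is the result of the call above:

```julia
# Inspect the solver output (fields as defined in SolverCore.GenericExecutionStats).
println(output.status)     # e.g. :first_order when a stationary point is found
println(output.solution)   # approximate minimizer
println(output.objective)  # objective value at the solution
println(output.iter)       # number of iterations
```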
## Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and feature requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion on the topic.

If you have a question that is not suited for a bug report, feel free to start a discussion here. This forum covers this repository and the whole JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.