This is hopefully a summary of a good chat that I had with @Robbybp today.
Motivation
We have a standard Ipopt NLP:

$$\min_{x} \; f(x) \quad \text{s.t.} \quad g_L \le g(x) \le g_U, \quad x_L \le x \le x_U$$

that we want to add a vector-valued nonlinear function to:

$$y = F(x)$$
Vector-valued user-defined functions crop up all the time on the forum, especially in the optimal control community. Currently we tell them to follow https://jump.dev/JuMP.jl/dev/tutorials/nonlinear/tips_and_tricks/, which is pretty hacky.
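The workaround in that tutorial splits `F` into one scalar operator per output and memoizes the shared forward pass so `F` is not recomputed once per output. A rough sketch of the pattern (names and the toy `F` are illustrative, not from any particular forum post):

```julia
# Memoize a vector-valued function so that the per-output scalar wrappers
# JuMP requires do not recompute F once per output (the tips_and_tricks
# pattern; names are illustrative).
function memoize(foo::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing
    function foo_i(i, x...)
        if x !== last_x
            last_x, last_f = x, foo(x...)
        end
        return last_f[i]
    end
    return [(x...) -> foo_i(i, x...) for i in 1:n_outputs]
end

F(x1, x2) = [x1^2 + x2^2, x1 * x2]  # toy two-output function
F_1, F_2 = memoize(F, 2)

# In a JuMP model, each wrapper is then registered separately, e.g.
#   @operator(model, op_F1, 2, F_1)
#   @operator(model, op_F2, 2, F_2)
F_1(1.0, 2.0), F_2(1.0, 2.0)  # -> (5.0, 2.0); second call reuses F(1.0, 2.0)
```

The second call hits the cache because both wrappers share the same closed-over state, which is exactly the part that makes this feel hacky.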
We currently support this in MathOptAI: https://github.com/lanl-ansi/MathOptAI.jl/blob/53121ff5fbdae73f4750272d2545000bf5c67e97/src/predictors/GrayBox.jl#L7-L140
This works, but the downside is that when computing the Hessian-of-the-Lagrangian we compute one (dense) Hessian for each row in the output of y. The Hessian evaluation is a bottleneck in @Robbybp's work.

He has a solution that abuses the internal API of Ipopt.jl to tack the y = F(x) constraints onto the end of the NLP evaluator, and there is a fast way in PyTorch to compute $u' F_{xx}(x)$, which is used in `eval_hessian_lagrangian`.
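To see why the contraction helps: $u' F_{xx}(x) = \sum_i u_i \nabla^2 F_i(x)$ is exactly the Hessian of the scalar map $x \mapsto u' F(x)$, so it takes one Hessian pass instead of one dense Hessian per output. A quick finite-difference sanity check (toy `F`; nothing here is from the actual implementation):

```julia
using LinearAlgebra

# Toy vector-valued function F : R^2 -> R^2 (illustrative only).
F(x) = [x[1]^2 * x[2], x[1] * exp(x[2])]

# Central finite-difference Hessian of a scalar function f at x.
function fd_hessian(f, x; h = 1e-4)
    n = length(x)
    H = zeros(n, n)
    for i in 1:n, j in 1:n
        e_i, e_j = zeros(n), zeros(n)
        e_i[i], e_j[j] = h, h
        H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j) -
                   f(x - e_i + e_j) + f(x - e_i - e_j)) / (4h^2)
    end
    return H
end

x = [1.0, 2.0]
u = [3.0, 4.0]

# Row-by-row approach: one dense Hessian per output of F, contracted with u.
H_rows = [fd_hessian(z -> F(z)[i], x) for i in 1:2]
H_slow = sum(u[i] * H_rows[i] for i in 1:2)

# Contracted approach: a single Hessian of the scalar function z -> u' F(z).
H_fast = fd_hessian(z -> dot(u, F(z)), x)

@assert isapprox(H_slow, H_fast; atol = 1e-5)
```

In PyTorch the same trick is a vector-Hessian product of `u @ F(x)`, which is where the speedup in the hack comes from.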
But it'd be nice if there was first-class support in Ipopt.

Solution
Define a new set in Ipopt.jl.

Make Ipopt support $y = F(x)$ constraints posed with this set.
This would tack on the appropriate calls to the code in `Ipopt.jl/src/MOI_wrapper.jl`, lines 919 to 925 (commit `6bde96f`).
Here's the API I have in mind:
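As a rough illustration of what a set-based gray-box API could look like, every name below is hypothetical and stands in for whatever the actual proposal uses; the point is just that the set bundles the callbacks Ipopt needs, so the solver can append $F(x) - y = 0$ rows to its evaluator and call the contracted Hessian callback once per Hessian evaluation:

```julia
# Hypothetical sketch (names illustrative, NOT the actual proposed API):
# a set bundling the callbacks Ipopt needs for y = F(x).
struct GrayBoxSet
    n::Int                             # input dimension of x
    m::Int                             # output dimension of y = F(x)
    eval_f::Function                   # x -> F(x)::Vector{Float64}
    eval_jacobian::Function            # x -> F_x(x)::Matrix{Float64}
    eval_hessian_lagrangian::Function  # (x, u) -> u' * F_xx(x)::Matrix{Float64}
end

# Toy instance: F(x) = [x[1]^2, x[1] * x[2]].
set = GrayBoxSet(
    2, 2,
    x -> [x[1]^2, x[1] * x[2]],
    x -> [2x[1] 0.0; x[2] x[1]],
    (x, u) -> [2u[1] u[2]; u[2] 0.0],  # u1 * H1 + u2 * H2, contracted up front
)

set.eval_f([1.0, 3.0])  # -> [1.0, 3.0]
```

With something like this, the user supplies one `eval_hessian_lagrangian` callback (which PyTorch can evaluate fast) instead of Ipopt assembling one dense Hessian per row of `F`.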
Related
There is an open issue in MOI to add vector-valued nonlinear expressions: jump-dev/MathOptInterface.jl#2402. The main thrust is vector-input, but vector-output is also desirable.
If this works, I think it would solve most of the complaints about vector-valued nonlinear functions, and allow us to punt on the implementation a little longer.
Alternatives
We might consider

$$F(x) = 0$$

this would implicitly allow

$$g(y) = F(x)$$

but it would mean the standard "I have a user-defined function" in JuMP is more complicated.
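The "implicitly allow" step can be made concrete: given support for generic $H(z) = 0$ vector constraints, $g(y) = F(x)$ is recovered by stacking $z = [x; y]$ and defining $H(z) = F(x) - g(y)$. A toy sketch (both functions illustrative):

```julia
# If a solver supports generic "H(z) = 0" vector constraints, then
# g(y) = F(x) is expressible by stacking z = [x; y] and setting
# H(z) = F(x) - g(y).
F(x) = [x[1]^2 + x[2], x[1] * x[2]]  # toy F : R^2 -> R^2
g(y) = [2y[1], y[2]^2]               # toy g : R^2 -> R^2
H(z) = F(z[1:2]) - g(z[3:4])

# x = [1, 2] gives F(x) = [3, 2]; y = [1.5, sqrt(2)] gives g(y) = [3, 2].
z = [1.0, 2.0, 1.5, sqrt(2.0)]
H(z)  # ≈ [0.0, 0.0]
```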
Thanks for the writeup! I verified that my Ipopt hack, my "PyTorch Lagrangian", and MathOptAI all work well together, so next week I'll try to put together a GrayBoxOpt.jl package that demos my approach.