# DiffOpt.jl

[DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) is a package for differentiating convex and non-convex optimization programs ([JuMP.jl](https://github.com/jump-dev/JuMP.jl) or [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl) models) with respect to program parameters. Note that this package does not contain any solver.

This package has two major backends, available via the `reverse_differentiate!` and `forward_differentiate!` methods, to differentiate models (quadratic or conic) with optimal solutions.

!!! note
    Currently supports *linear programs* (LP), *convex quadratic programs* (QP), *convex conic programs* (SDP, SOCP, exponential cone constraints only), and *general nonlinear programs* (NLP).
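
As a rough sketch of how the two backends are driven, consider a toy linear program solved through `DiffOpt.diff_optimizer`. The program itself, the use of HiGHS as the inner solver, and the attribute names (`DiffOpt.ReverseVariablePrimal`, `DiffOpt.ReverseConstraintFunction`) are illustrative assumptions taken from the DiffOpt documentation rather than something fixed by this page:

```julia
using JuMP, DiffOpt, HiGHS

# Toy LP: minimize 2x subject to x >= 3, solved through a differentiable optimizer.
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
set_silent(model)
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)

# Reverse mode: seed a sensitivity of 1.0 on the optimal value of x ...
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
DiffOpt.reverse_differentiate!(model)

# ... and read back the implied sensitivity with respect to the constraint `cons`.
dcons = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons)
```

The forward backend follows the same pattern with `DiffOpt.forward_differentiate!` and the corresponding `Forward` attributes.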

DiffOpt can be installed through the Julia package manager:
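For example, assuming the package is registered under the name `DiffOpt` in the General registry:

```julia
import Pkg

# Install DiffOpt (a solver such as HiGHS or Ipopt must be installed separately).
Pkg.add("DiffOpt")
```
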
## Why are Differentiable optimization problems important?

Differentiable optimization is a promising field of constrained optimization and has many potential applications in game theory, control theory, and machine learning (specifically deep learning; see [this video](https://www.youtube.com/watch?v=NrcaNnEXkT8) for more).

Recent work has shown how to differentiate specific subclasses of constrained optimization problems, but several applications remain unexplored (see section 8 of this [really good thesis](https://github.com/bamos/thesis)). With the help of automatic differentiation, differentiable optimization can have a significant impact on creating end-to-end differentiable systems to model neural networks, stochastic processes, or games.

3. To differentiate a general nonlinear program, one has to use the API for Parameterized JuMP models. For example, consider the following nonlinear program:

```julia
using JuMP, DiffOpt, Ipopt

# Wrap the nonlinear solver in DiffOpt's differentiable optimizer.
model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
```
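
Continuing that model, a minimal sketch of the parameter-based workflow might look as follows. The concrete program (a single parameter `p` and objective `(x - p)^2`) and the attributes used to seed and query derivatives (`DiffOpt.ForwardConstraintSet`, `DiffOpt.ForwardVariablePrimal`) are assumptions based on the parameter API in the DiffOpt documentation; check the current API reference before relying on them:

```julia
using JuMP, DiffOpt, Ipopt

model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
set_silent(model)

# Hypothetical parameterized program: minimize (x - p)^2 for a JuMP parameter p.
@variable(model, p in Parameter(2.0))
@variable(model, x)
@objective(model, Min, (x - p)^2)
optimize!(model)

# Forward mode (attribute names assumed): perturb p by 1.0 ...
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(1.0))
DiffOpt.forward_differentiate!(model)

# ... and read the induced change in the optimal x (here dx/dp = 1).
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
```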