Improve documentation of algorithm options #554

Open: wants to merge 13 commits into main

3 changes: 3 additions & 0 deletions .gitignore
@@ -1,3 +1,6 @@
+# AI
+CLAUDE.md
+
 # Byte-compiled / optimized / DLL files
 __pycache__/
 *.py[cod]

7 changes: 3 additions & 4 deletions docs/rtd_environment.yml
@@ -4,16 +4,15 @@ channels:
   - conda-forge
   - nodefaults
 dependencies:
-  - python=3.11
+  - python=3.12
   - typing-extensions
   - pip
   - setuptools_scm
   - toml
-  - sphinx
+  - sphinx>=8.2.3
   - sphinxcontrib-bibtex
   - sphinx-copybutton
   - sphinx-design
-  - sphinx-panels
   - ipython
   - ipython_genutils
   - myst-nb
@@ -31,7 +30,7 @@ dependencies:
   - annotated-types
   - pygmo>=2.19.0
   - pip:
-      - ../
+      - -e ../
       - kaleido>=1.0
       - Py-BOBYQA
       - DFO-LS

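Two notable changes in this file: `sphinx-panels`, which is deprecated in favor of `sphinx-design`, is dropped (mirroring the extension-list change in conf.py below), and the local package is now installed with pip's `-e` flag, i.e. in editable mode, presumably so that autodoc imports the checkout directly and docstring edits show up in the rendered docs without a reinstall.
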
66 changes: 23 additions & 43 deletions docs/source/algorithms.md
@@ -3,62 +3,42 @@
 # Optimizers
 
 Check out {ref}`how-to-select-algorithms` to see how to select an algorithm and specify
-`algo_options` when using `maximize` or `minimize`.
+`algo_options` when using `maximize` or `minimize`. The default algorithm options are
+discussed in {ref}`algo_options` and their type hints are documented in {ref}`typing`.
 
-## Optimizers from scipy
+## Optimizers from SciPy
 
 (scipy-algorithms)=
 
-optimagic supports most `scipy` algorithms and scipy is automatically installed when you
-install optimagic.
+optimagic supports most [SciPy](https://scipy.org/) algorithms and SciPy is
+automatically installed when you install optimagic.
 
 ```{eval-rst}
 .. dropdown:: scipy_lbfgsb
 
-    .. code-block::
-
-        "scipy_lbfgsb"
-
-    Minimize a scalar function of one or more variables using the L-BFGS-B algorithm.
-
-    The optimizer is taken from scipy, which calls the Fortran code written by the
-    original authors of the algorithm. The Fortran code includes the corrections
-    and improvements that were introduced in a follow up paper.
-
-    lbfgsb is a limited memory version of the original bfgs algorithm, that deals with
-    lower and upper bounds via an active set approach.
-
-    The lbfgsb algorithm is well suited for differentiable scalar optimization problems
-    with up to several hundred parameters.
-
-    It is a quasi-newton line search algorithm. At each trial point it evaluates the
-    criterion function and its gradient to find a search direction. It then approximates
-    the hessian using the stored history of gradients and uses the hessian to calculate
-    a candidate step size. Then it uses a gradient based line search algorithm to
-    determine the actual step length. Since the algorithm always evaluates the gradient
-    and criterion function jointly, the user should provide a
-    ``criterion_and_derivative`` function that exploits the synergies in the
-    calculation of criterion and gradient.
-
-    The lbfgsb algorithm is almost perfectly scale invariant. Thus, it is not necessary
-    to scale the parameters.
-
-    - **convergence.ftol_rel** (float): Stop when the relative improvement
-      between two iterations is smaller than this. More formally, this is expressed as
-
-      .. math::
-
-        \frac{(f^k - f^{k+1})}{\max{{|f^k|, |f^{k+1}|, 1}}} \leq
-        \text{relative_criterion_tolerance}
-
-    - **convergence.gtol_abs** (float): Stop if all elements of the projected
-      gradient are smaller than this.
-    - **stopping.maxfun** (int): If the maximum number of function
-      evaluation is reached, the optimization stops but we do not count this as convergence.
-    - **stopping.maxiter** (int): If the maximum number of iterations is reached,
-      the optimization stops, but we do not count this as convergence.
-    - **limited_memory_storage_length** (int): Maximum number of saved gradients used to approximate the hessian matrix.
+    **How to use this algorithm:**
+
+    .. code-block::
+
+        import optimagic as om
+        om.minimize(
+            ...,
+            algorithm=om.algos.scipy_lbfgsb(stopping_maxiter=1_000, ...)
+        )
+
+    or
+
+    .. code-block::
+
+        om.minimize(
+            ...,
+            algorithm="scipy_lbfgsb",
+            algo_options={"stopping_maxiter": 1_000, ...}
+        )
+
+    **Description and available options:**
+
+    .. autoclass:: optimagic.optimizers.scipy_optimizers.ScipyLBFGSB
 
 ```

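To see the documented pattern end to end, here is a minimal runnable sketch of the two equivalent invocation styles the new dropdown shows. The sphere objective and start parameters are illustrative, not part of the PR.

```python
import numpy as np
import optimagic as om


def sphere(x):
    return x @ x  # smooth and differentiable, a good fit for L-BFGS-B


# Typed interface: options are keyword arguments of the algorithm class.
res = om.minimize(
    fun=sphere,
    params=np.arange(5.0),
    algorithm=om.algos.scipy_lbfgsb(stopping_maxiter=1_000),
)

# Equivalent string interface: options go into algo_options.
res = om.minimize(
    fun=sphere,
    params=np.arange(5.0),
    algorithm="scipy_lbfgsb",
    algo_options={"stopping_maxiter": 1_000},
)
print(res.params)  # approximately a vector of zeros
```
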
30 changes: 25 additions & 5 deletions docs/source/conf.py
@@ -48,7 +48,6 @@
     "sphinx_copybutton",
     "myst_nb",
     "sphinxcontrib.bibtex",
-    "sphinx_panels",
     "sphinx_design",
     "sphinxcontrib.mermaid",
 ]
@@ -67,6 +66,27 @@
 bibtex_bibfiles = ["refs.bib"]
 
 autodoc_member_order = "bysource"
+autodoc_class_signature = "separated"
+autodoc_default_options = {
+    "exclude-members": "__init__",
+    "members": True,
+    "undoc-members": True,
+    "member-order": "bysource",
+    "class-doc-from": "class",
+}
+autodoc_preserve_defaults = True
+autodoc_type_aliases = {
+    "PositiveInt": "optimagic.typing.PositiveInt",
+    "NonNegativeInt": "optimagic.typing.NonNegativeInt",
+    "PositiveFloat": "optimagic.typing.PositiveFloat",
+    "NonNegativeFloat": "optimagic.typing.NonNegativeFloat",
+    "NegativeFloat": "optimagic.typing.NegativeFloat",
+    "GtOneFloat": "optimagic.typing.GtOneFloat",
+    "YesNoBool": "optimagic.typing.YesNoBool",
+    "DirectionLiteral": "optimagic.typing.DirectionLiteral",
+    "BatchEvaluatorLiteral": "optimagic.typing.BatchEvaluatorLiteral",
+    "ErrorHandlingLiteral": "optimagic.typing.ErrorHandlingLiteral",
+}
 
 autodoc_mock_imports = [
     "bokeh",
@@ -86,8 +106,8 @@
 ]
 
 extlinks = {
-    "ghuser": ("https://github.com/%s", "@"),
-    "gh": ("https://github.com/optimagic-dev/optimagic/pulls/%s", "#"),
+    "ghuser": ("https://github.com/%s", "%s"),
+    "gh": ("https://github.com/optimagic-dev/optimagic/pull/%s", "%s"),
 }
 
 intersphinx_mapping = get_intersphinx_mapping(
@@ -126,7 +146,7 @@
 #
 # This is also used if you do content translation via gettext catalogs.
 # Usually you set "language" from the command line for these cases.
-language = None
+language = "en"
 
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
@@ -145,7 +165,7 @@
 todo_emit_warnings = True
 
 # -- Options for myst-nb ----------------------------------------
-nb_execution_mode = "force"
+nb_execution_mode = "force"  # "off", "force", "cache", "auto"
 nb_execution_allow_errors = False
 nb_merge_streams = True

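For reviewers unfamiliar with these autodoc knobs, here is a sketch of what they buy for the new `.. autoclass::`-based optimizer pages. The module and class below are hypothetical (only `optimagic.typing` and the alias names come from the mapping above), and note that `autodoc_type_aliases` only takes effect in modules with postponed evaluation of annotations.

```python
# Hypothetical optimizer module rendered with ".. autoclass::".
from __future__ import annotations  # needed for autodoc_type_aliases

from dataclasses import dataclass

from optimagic.typing import NonNegativeFloat, PositiveInt


@dataclass(frozen=True)
class ExampleOptimizer:
    """Options are documented from the class fields and their type hints."""

    # Rendered as "PositiveInt" rather than the expanded annotation, and
    # autodoc_preserve_defaults keeps the default as the literal 1_000
    # instead of its evaluated repr.
    stopping_maxiter: PositiveInt = 1_000
    convergence_ftol_rel: NonNegativeFloat = 1e-8
```

The `extlinks` update follows the Sphinx 4+ convention that the caption is a `%s` format string rather than a literal prefix, and `pulls/%s` becomes `pull/%s`, the canonical pull-request URL.
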
4 changes: 3 additions & 1 deletion docs/source/explanation/internal_optimizers.md
@@ -98,7 +98,9 @@ To make switching between different algorithm as simple as possible, we align th
 of commonly used convergence and stopping criteria. We also align the default values for
 stopping and convergence criteria as much as possible.
 
-You can find the harmonized names and value [here](algo_options_docs).
+```{eval-rst}
+You can find the harmonized names and value here: :ref:`algo_options`.
+```
 
 To align the names of other tuning parameters as much as possible with what is already
 there, simple have a look at the optimizers we already wrapped. For example, if you are

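As a concrete illustration of the harmonization, the same options dictionary can be reused across wrapped optimizers. This is a sketch; it assumes both algorithms expose a `stopping_maxiter` option under the harmonized name.

```python
import numpy as np
import optimagic as om

# Only the algorithm name changes; the harmonized option name is shared.
common_options = {"stopping_maxiter": 1_000}

for algo in ["scipy_lbfgsb", "scipy_neldermead"]:
    res = om.minimize(
        fun=lambda x: x @ x,
        params=np.arange(3.0),
        algorithm=algo,
        algo_options=common_options,
    )
    print(algo, res.fun)
```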