
Add optimizer from bayesian-optimization #602


Merged
merged 27 commits into main from bayesian_optimizer on Jul 18, 2025

Commits (27)
3561d03
initial bayesian-optimization wrapper
spline2hg Jun 5, 2025
a008bf1
fix: import error
spline2hg Jun 6, 2025
f9d64ac
add docs
spline2hg Jun 6, 2025
e86ccae
fix: pin bayesian-optimization to >=2.0.4
spline2hg Jun 8, 2025
824a153
fix: mypy error
spline2hg Jun 8, 2025
a5600cc
fix: bayes_opt imports
spline2hg Jun 9, 2025
331fb9e
fix: check for failed optimization
spline2hg Jun 9, 2025
d0c6a7a
Define aliases for acquisition function
spline2hg Jun 9, 2025
d2d2e9f
fix review changes
spline2hg Jun 23, 2025
5c0e448
add tests
spline2hg Jun 23, 2025
04de5a7
fix: _extract_params_from_kwargs and reformat tests
spline2hg Jun 28, 2025
4347526
Add type fallbacks for optional bayesian-optimization imports
spline2hg Jul 5, 2025
19ef242
fix: optimizer type
spline2hg Jul 5, 2025
4b419d6
Add UnitIntervalFloat
spline2hg Jul 5, 2025
8bffc41
Merge branch 'main' into bayesian_optimizer
timmens Jul 6, 2025
9d41915
fix: mypy version
spline2hg Jul 6, 2025
84c09d5
Merge branch 'main' into bayesian_optimizer
spline2hg Jul 7, 2025
1c67c84
fix: mypy
spline2hg Jul 7, 2025
a0a7a0c
Merge branch 'main' into bayesian_optimizer
janosg Jul 9, 2025
fd4194f
fix: handle optional bayes_opt imports
spline2hg Jul 12, 2025
7b5f279
Merge branch 'optimagic-dev:main' into bayesian_optimizer
spline2hg Jul 12, 2025
410ed7d
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jul 12, 2025
b4147a1
refactor: apply TYPE_CHECKING pattern to iminuit
spline2hg Jul 15, 2025
f74383e
Merge remote-tracking branch 'origin/bayesian_optimizer' into bayesia…
spline2hg Jul 15, 2025
af215cd
Merge branch 'main' into bayesian_optimizer
janosg Jul 16, 2025
4c03edf
Merge branch 'main' into bayesian_optimizer
janosg Jul 16, 2025
9b0d845
add needs_bounds=True and supports_infinite_bounds=False to bayes_opt
spline2hg Jul 17, 2025
1 change: 1 addition & 0 deletions .tools/envs/testenv-linux.yml
@@ -29,6 +29,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
1 change: 1 addition & 0 deletions .tools/envs/testenv-numpy.yml
@@ -27,6 +27,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
1 change: 1 addition & 0 deletions .tools/envs/testenv-others.yml
@@ -27,6 +27,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
1 change: 1 addition & 0 deletions .tools/envs/testenv-pandas.yml
@@ -27,6 +27,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
1 change: 1 addition & 0 deletions .tools/envs/testenv-plotly.yml
@@ -27,6 +27,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
81 changes: 81 additions & 0 deletions docs/source/algorithms.md
@@ -4043,6 +4043,87 @@ these optimizers, you need to have
initialization for speed. Default is False.
```

## Bayesian Optimization

We wrap the
[BayesianOptimization](https://github.com/bayesian-optimization/BayesianOptimization)
package. To use it, you need to have
[bayesian-optimization](https://pypi.org/project/bayesian-optimization/) installed.

```{eval-rst}
.. dropdown:: bayes_opt

.. code-block::

"bayes_opt"

Minimize a scalar function using Bayesian Optimization with Gaussian Processes.

This optimizer wraps the BayesianOptimization package (:cite:`Nogueira2014`), which
implements Bayesian optimization using Gaussian processes to build probabilistic
models of the objective function. Bayesian optimization is particularly effective
for expensive black-box functions where gradient information is not available.

The algorithm requires finite bounds for all parameters.

The bayes_opt wrapper preserves the default parameter values from the underlying
BayesianOptimization package where appropriate.

bayes_opt supports the following options:

- **init_points** (PositiveInt): Number of random exploration points to evaluate before
starting optimization. Default is 5.

- **n_iter** (PositiveInt): Number of Bayesian optimization iterations to perform after
the initial random exploration. Default is 25.

- **verbose** (Literal[0, 1, 2]): Verbosity level from 0 (silent) to 2 (most verbose). Default is 0.

- **kappa** (NonNegativeFloat): Parameter that balances the exploration-exploitation
trade-off for the Upper Confidence Bound acquisition function. Higher values favor
exploration. Only used if the acquisition function is set to "ucb" or
"upper_confidence_bound" and no pre-configured AcquisitionFunction instance is passed.
Default is 2.576.

- **xi** (PositiveFloat): Parameter that balances the exploration-exploitation trade-off
for the Expected Improvement and Probability of Improvement acquisition functions.
Higher values favor exploration. Only used if the acquisition function is set to "ei",
"expected_improvement", "poi", or "probability_of_improvement" and no pre-configured
AcquisitionFunction instance is passed. Default is 0.01.

- **exploration_decay** (float | None): Rate at which exploration decays over time.
Default is None (no decay).

- **exploration_decay_delay** (NonNegativeInt | None): Delay, in iterations, before
exploration decay is applied. If None, decay is applied from the start. Default is None.

- **random_state** (int | None): Random seed for reproducible results. Default is None.

- **acquisition_function** (str | AcquisitionFunction | Type[AcquisitionFunction] | None): Strategy for selecting
the next evaluation point. Options include:
- "ucb" or "upper_confidence_bound": Upper Confidence Bound
- "ei" or "expected_improvement": Expected Improvement
- "poi" or "probability_of_improvement": Probability of Improvement
Default is None (uses package default).

- **allow_duplicate_points** (bool): Whether to allow re-evaluation of the same point.
Default is False.

- **enable_sdr** (bool): Enable Sequential Domain Reduction, which progressively
narrows the search space around promising regions. Default is False.

- **sdr_gamma_osc** (float): Oscillation parameter for SDR. Default is 0.7.

- **sdr_gamma_pan** (float): Panning parameter for SDR. Default is 1.0.

- **sdr_eta** (float): Zooming parameter for SDR. Default is 0.9.

- **sdr_minimum_window** (NonNegativeFloat): Minimum window size for SDR. Default is 0.0.

- **alpha** (float): Noise parameter for the Gaussian Process. Default is 1e-6.

- **n_restarts** (int): Number of times to restart the optimizer. Default is 1.
```
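
As a quick usage sketch (assuming optimagic's standard `minimize` interface; the
option names follow the list above, and the bounds and option values are purely
illustrative):

```python
import numpy as np
import optimagic as om


def sphere(x):
    # Smooth toy objective; the optimizer treats it as a black box.
    return np.sum(x**2)


res = om.minimize(
    fun=sphere,
    params=np.array([1.0, -0.5]),
    algorithm="bayes_opt",
    # Finite bounds are required for all parameters.
    bounds=om.Bounds(lower=np.full(2, -2.0), upper=np.full(2, 2.0)),
    algo_options={
        "init_points": 10,  # random exploration points before the GP model is used
        "n_iter": 50,  # Bayesian optimization iterations after the random phase
        "acquisition_function": "ei",  # Expected Improvement
        "random_state": 0,  # fixed seed for reproducible runs
    },
)
print(res.params)
```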

## References

```{eval-rst}
27 changes: 27 additions & 0 deletions docs/source/refs.bib
@@ -927,4 +927,31 @@ @InProceedings{Zambrano2013
doi = {10.1109/CEC.2013.6557848},
}

@Misc{Nogueira2014,
  author = {Fernando Nogueira},
  title  = {{Bayesian Optimization}: Open source constrained global optimization tool for {Python}},
  year   = {2014--},
  url    = {https://github.com/bayesian-optimization/BayesianOptimization},
}

@Article{Stander2002,
  author  = {Stander, Nielen and Craig, Kenneth},
  title   = {On the robustness of a simple domain reduction scheme for simulation-based optimization},
  journal = {Engineering Computations: International Journal for Computer-Aided Engineering and Software},
  volume  = {19},
  year    = {2002},
  month   = jun,
  doi     = {10.1108/02644400210430190},
}

@InProceedings{gardner2014bayesian,
  author    = {Gardner, Jacob R. and Kusner, Matt J. and Xu, Zhixiang Eddie and Weinberger, Kilian Q. and Cunningham, John P.},
  title     = {Bayesian optimization with inequality constraints},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning (ICML)},
  pages     = {937--945},
  year      = {2014},
}

@Comment{jabref-meta: databaseType:bibtex;}
1 change: 1 addition & 0 deletions environment.yml
@@ -39,6 +39,7 @@ dependencies:
   - annotated-types # dev, tests
   - iminuit # dev, tests
   - pip: # dev, tests, docs
+      - bayesian-optimization>=2.0.4 # dev, tests
       - nevergrad # dev, tests
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests
1 change: 0 additions & 1 deletion pyproject.toml
@@ -16,7 +16,6 @@ dependencies = [
     "sqlalchemy>=1.3",
     "annotated-types",
     "typing-extensions",
-    "iminuit",
 ]
 dynamic = ["version"]
 keywords = [
17 changes: 17 additions & 0 deletions src/optimagic/algorithms.py
@@ -12,6 +12,7 @@
 from typing import Type, cast

 from optimagic.optimization.algorithm import Algorithm
+from optimagic.optimizers.bayesian_optimizer import BayesOpt
 from optimagic.optimizers.bhhh import BHHH
 from optimagic.optimizers.fides import Fides
 from optimagic.optimizers.iminuit_migrad import IminuitMigrad
@@ -366,6 +367,7 @@ def Scalar(self) -> BoundedGlobalGradientFreeNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGlobalGradientFreeScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -1032,6 +1034,7 @@ def Local(self) -> GradientBasedLocalNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGlobalGradientFreeAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -1096,6 +1099,7 @@ def Scalar(self) -> GlobalGradientFreeNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class GlobalGradientFreeScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -1305,6 +1309,7 @@ def Scalar(self) -> BoundedGradientFreeNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGradientFreeScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_bobyqa: Type[NloptBOBYQA] = NloptBOBYQA
@@ -1529,6 +1534,7 @@ def Scalar(self) -> BoundedGlobalNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGlobalScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -2141,6 +2147,7 @@ def Local(self) -> GradientBasedLikelihoodLocalAlgorithms:

 @dataclass(frozen=True)
 class GlobalGradientFreeAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -2227,6 +2234,7 @@ def Scalar(self) -> GradientFreeLocalScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGradientFreeAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
@@ -2324,6 +2332,7 @@ def Scalar(self) -> GradientFreeNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class GradientFreeScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     neldermead_parallel: Type[NelderMeadParallel] = NelderMeadParallel
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
@@ -2447,6 +2456,7 @@ def Scalar(self) -> GradientFreeParallelScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedGlobalAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -2529,6 +2539,7 @@ def Scalar(self) -> GlobalNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class GlobalScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -2843,6 +2854,7 @@ def Scalar(self) -> BoundedNonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     fides: Type[Fides] = Fides
     iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
@@ -3155,6 +3167,7 @@ def Scalar(self) -> GradientBasedScalarAlgorithms:

 @dataclass(frozen=True)
 class GradientFreeAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     neldermead_parallel: Type[NelderMeadParallel] = NelderMeadParallel
@@ -3229,6 +3242,7 @@ def Scalar(self) -> GradientFreeScalarAlgorithms:

 @dataclass(frozen=True)
 class GlobalAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     nevergrad_pso: Type[NevergradPSO] = NevergradPSO
     nlopt_crs2_lm: Type[NloptCRS2LM] = NloptCRS2LM
     nlopt_direct: Type[NloptDirect] = NloptDirect
@@ -3358,6 +3372,7 @@ def Scalar(self) -> LocalScalarAlgorithms:

 @dataclass(frozen=True)
 class BoundedAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     fides: Type[Fides] = Fides
     iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
@@ -3495,6 +3510,7 @@ def Scalar(self) -> NonlinearConstrainedScalarAlgorithms:

 @dataclass(frozen=True)
 class ScalarAlgorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     fides: Type[Fides] = Fides
     iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
@@ -3671,6 +3687,7 @@ def Scalar(self) -> ParallelScalarAlgorithms:

 @dataclass(frozen=True)
 class Algorithms(AlgoSelection):
+    bayes_opt: Type[BayesOpt] = BayesOpt
     bhhh: Type[BHHH] = BHHH
     fides: Type[Fides] = Fides
     iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
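
These registrations are what make `bayes_opt` discoverable through optimagic's
categorized algorithm-selection interface. A minimal sketch, assuming the public
`om.algos` entry point that these dataclasses back (the exact attribute chain is
illustrative):

```python
import optimagic as om

# Each attribute access narrows the candidate set; bayes_opt appears in every
# category that matches its properties (bounded, global, gradient-free, scalar).
selection = om.algos.Bounded.Global.GradientFree.Scalar
algo = selection.bayes_opt  # the BayesOpt algorithm class

# The class can be passed to minimize() in place of the string name "bayes_opt".
print(algo.__name__)
```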
8 changes: 8 additions & 0 deletions src/optimagic/config.py
@@ -108,6 +108,14 @@
 IS_NEVERGRAD_INSTALLED = True


+try:
+    from bayes_opt import BayesianOptimization  # noqa: F401
+except ImportError:
+    IS_BAYESOPT_INSTALLED = False
+else:
+    IS_BAYESOPT_INSTALLED = True
+
+
 # ======================================================================================
 # Check if pandas version is newer or equal to version 2.1.0
 # ======================================================================================
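
For context, a sketch of how such an `IS_*_INSTALLED` flag is typically consumed.
The guard function here is hypothetical, and `NotInstalledError` is assumed to be
optimagic's exception for missing optional dependencies:

```python
from optimagic.config import IS_BAYESOPT_INSTALLED
from optimagic.exceptions import NotInstalledError


def _check_bayes_opt_available():
    # Hypothetical guard: fail fast with an actionable message instead of
    # letting a bare ImportError surface deep inside a solve.
    if not IS_BAYESOPT_INSTALLED:
        raise NotInstalledError(
            "The 'bayes_opt' algorithm requires the bayesian-optimization "
            "package. Install it with: pip install 'bayesian-optimization>=2.0.4'"
        )
```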