Commit b5a5c96

Merge pull request #308 from xopt-org/turbo-documentation
Turbo documentation
2 parents 040e714 + 28c7989

File tree

6 files changed: +716 -15 lines

docs/examples/multi_objective_bayes_opt/mobo.ipynb

Lines changed: 3 additions & 0 deletions
@@ -54,6 +54,7 @@
 "N_MC_SAMPLES = 1 if SMOKE_TEST else 128\n",
 "NUM_RESTARTS = 1 if SMOKE_TEST else 20\n",
 "N_STEPS = 1 if SMOKE_TEST else 30\n",
+"MAX_ITER = 1 if SMOKE_TEST else 200\n",
 "\n",
 "evaluator = Evaluator(function=evaluate_TNK)\n",
 "print(tnk_vocs.dict())"
@@ -75,8 +76,10 @@
 "generator = MOBOGenerator(vocs=tnk_vocs, reference_point={\"y1\": 1.5, \"y2\": 1.5})\n",
 "generator.n_monte_carlo_samples = N_MC_SAMPLES\n",
 "generator.numerical_optimizer.n_restarts = NUM_RESTARTS\n",
+"generator.numerical_optimizer.max_iter = MAX_ITER\n",
 "generator.gp_constructor.use_low_noise_prior = True\n",
 "\n",
+"\n",
 "X = Xopt(generator=generator, evaluator=evaluator, vocs=tnk_vocs)\n",
 "X.evaluate_data(pd.DataFrame({\"x1\": [1.0, 0.75], \"x2\": [0.75, 1.0]}))\n",
 "\n",
Lines changed: 232 additions & 0 deletions
@@ -0,0 +1,232 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Basics of Trust Region Controllers in Xopt"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Trust Region Bayesian Optimization (TuRBO) is an advanced optimization algorithm designed for solving high-dimensional black-box optimization problems. It combines the strengths of Bayesian Optimization (BO) with trust region methods to improve scalability and efficiency.\n",
+    "\n",
+    "### Key Features:\n",
+    "1. **Trust Regions**:\n",
+    "    - TuRBO uses local trust regions to focus the search in promising areas of the parameter space.\n",
+    "    - Each trust region is a bounded subspace where the optimization is performed, and its size is dynamically adjusted based on the success of the optimization.\n",
+    "\n",
+    "2. **Bayesian Surrogate Model**:\n",
+    "    - A Gaussian Process (GP) or another surrogate model is used to approximate the objective function.\n",
+    "    - The surrogate model predicts the objective function to guide the search, and also sets the size of the trust region.\n",
+    "\n",
+    "3. **Adaptivity**:\n",
+    "    - The algorithm adapts the size of the trust region based on the success or failure of the optimization steps: if optimization within a trust region succeeds, the region expands; otherwise, it shrinks.\n",
+    "\n",
+    "### Advantages:\n",
+    "- Scales better to high-dimensional problems than standard Bayesian Optimization.\n",
+    "- Efficiently balances exploration and exploitation within trust regions.\n",
+    "\n",
+    "### Disadvantages:\n",
+    "- Severely restricts exploration of the parameter space, which can lead to convergence to local minima and makes the algorithm sensitive to the initial sampling points.\n",
+    "- Introduces additional algorithm hyperparameters that may require tuning.\n",
+    "- May struggle with noisy objective functions or discontinuous landscapes."
+   ]
+  },
38+
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Defining a TuRBO Controller\n",
+    "Currently, Xopt supports three different TuRBO controller types, the most basic of which is the `OptimizeTurboController`. To create this controller, we first need to define our optimization problem and some data."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import numpy as np\n",
+    "from xopt import VOCS\n",
+    "import pandas as pd\n",
+    "\n",
+    "\n",
+    "# create evaluation function\n",
+    "def sphere_function(inputs):\n",
+    "    \"\"\"\n",
+    "    2D Sphere objective function.\n",
+    "    Compatible with Xopt.\n",
+    "    \"\"\"\n",
+    "    x, y = inputs[\"x\"], inputs[\"y\"]\n",
+    "    return {\"f\": np.sum(np.square(np.stack([x, y], axis=-1)), axis=-1)}\n",
+    "\n",
+    "\n",
+    "# create a VOCS object defining variable bounds and the objective\n",
+    "vocs = VOCS(\n",
+    "    variables={\"x\": [-5, 5], \"y\": [-5, 5]},\n",
+    "    objectives={\"f\": \"MINIMIZE\"},\n",
+    ")\n",
+    "\n",
+    "# randomly sample 10 points\n",
+    "x0 = vocs.random_inputs(10)\n",
+    "\n",
+    "# evaluate the function at the random points\n",
+    "f = []\n",
+    "for i in range(len(x0)):\n",
+    "    f += [sphere_function(x0[i]) | x0[i]]\n",
+    "\n",
+    "# collect the results in a DataFrame\n",
+    "data = pd.DataFrame(f)\n",
+    "data"
+   ]
+  },
86+
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Create the `ExpectedImprovementGenerator` and train the GP model\n",
+    "Here we create the `ExpectedImprovementGenerator`, add data to the generator, and train the GP model on that data."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from xopt.generators.bayesian import ExpectedImprovementGenerator\n",
+    "\n",
+    "generator = ExpectedImprovementGenerator(vocs=vocs)  # create the generator\n",
+    "generator.gp_constructor.use_low_noise_prior = True\n",
+    "generator.add_data(data)  # add the data to the generator\n",
+    "generator.train_model()  # train the model"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Create the `OptimizeTurboController`\n",
+    "Here we create the controller and view its parameters along with their descriptions."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from xopt.generators.bayesian.turbo import OptimizeTurboController\n",
+    "\n",
+    "turbo_controller = OptimizeTurboController(vocs=vocs)\n",
+    "\n",
+    "print(turbo_controller.__doc__)\n",
+    "print(\"-\" * 20)\n",
+    "\n",
+    "# examine the attributes of the controller\n",
+    "for field_name, field in turbo_controller.model_fields.items():\n",
+    "    print(f\"Field: {field_name}\")\n",
+    "    print(f\"  Description: {field.description}\")\n",
+    "    print(f\"  Type: {field.annotation}\")\n",
+    "    print(f\"  Default: {field.default}\")\n",
+    "    print(f\"  Value: {getattr(turbo_controller, field_name)}\")\n",
+    "    print(\"-\" * 20)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Getting the Trust Region\n",
+    "Here we get the current trust region from the controller.\n"
+   ]
+  },
147+
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# get the trust region of the model\n",
+    "trust_region = turbo_controller.get_trust_region(generator=generator)\n",
+    "print(f\"Trust Region: {trust_region}\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Update the trust region\n",
+    "Add another data point to the generator (as if we had performed one optimization step) and update the TuRBO controller. We add a point that improves on the best function value measured so far, so this measurement counts as a success."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# add a new point to the generator\n",
+    "new_point = pd.DataFrame({\"x\": [0.0], \"y\": [0.0], \"f\": [0.0]})\n",
+    "generator.add_data(new_point)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "generator.train_model()  # retrain the model\n",
+    "\n",
+    "# update the TuRBO controller\n",
+    "turbo_controller.update_state(generator)\n",
+    "\n",
+    "# get the new trust region\n",
+    "trust_region = turbo_controller.get_trust_region(generator=generator)\n",
+    "print(f\"New Trust Region: {trust_region}\")\n",
+    "\n",
+    "# get the number of successes and failures\n",
+    "print(f\"Number of successes: {turbo_controller.success_counter}\")\n",
+    "print(f\"Number of failures: {turbo_controller.failure_counter}\")\n",
+    "\n",
+    "# get the base length scale of the trust region\n",
+    "print(f\"Base length scale: {turbo_controller.length}\")"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.12.9"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
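The success/failure adaptation that the notebook above describes can be sketched as a small standalone routine. This is an illustrative model only, not Xopt's actual `TurboController` implementation; the class name, tolerances, and scaling factors here are assumptions chosen for demonstration:

```python
class TrustRegionSketch:
    """Toy model of TuRBO-style trust region length adaptation (illustrative only)."""

    def __init__(self, length=0.25, length_min=2**-7, length_max=2.0,
                 success_tolerance=3, failure_tolerance=3):
        self.length = length              # base side length in normalized units
        self.length_min = length_min      # below this, a restart would be triggered
        self.length_max = length_max      # expansion cap
        self.success_tolerance = success_tolerance
        self.failure_tolerance = failure_tolerance
        self.success_counter = 0
        self.failure_counter = 0

    def update(self, improved: bool) -> None:
        """Record one optimization step and adapt the region size."""
        if improved:
            self.success_counter += 1
            self.failure_counter = 0
        else:
            self.failure_counter += 1
            self.success_counter = 0
        if self.success_counter >= self.success_tolerance:
            # a run of successes: expand the region (capped at length_max)
            self.length = min(2.0 * self.length, self.length_max)
            self.success_counter = 0
        elif self.failure_counter >= self.failure_tolerance:
            # a run of failures: shrink the region
            self.length = self.length / 2.0
            self.failure_counter = 0


tr = TrustRegionSketch()
for _ in range(3):
    tr.update(True)
print(tr.length)  # expanded from 0.25 to 0.5 after three consecutive successes
```

The key design point this sketch captures is that a single success or failure never resizes the region; only a *run* of them does, which keeps the region size stable under mildly noisy feedback.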

docs/examples/single_objective_bayes_opt/turbo_tutorial.ipynb renamed to docs/examples/single_objective_bayes_opt/trust_region_bo/turbo_optimize.ipynb

Lines changed: 6 additions & 5 deletions
@@ -9,7 +9,7 @@
 }
 },
 "source": [
-"# TuRBO Bayesian Optimization\n",
+"# TuRBO Bayesian Optimization - Optimize\n",
 "In this tutorial we demonstrate the use of Xopt to perform Trust Region Bayesian\n",
 "Optimization (TuRBO) on a simple test problem. During optimization of high-dimensional\n",
 "input spaces, off-the-shelf BO tends to over-emphasize exploration, which\n",
@@ -54,7 +54,7 @@
 "outputs": [],
 "source": [
 "from xopt.evaluator import Evaluator\n",
-"from xopt.generators.bayesian import UpperConfidenceBoundGenerator\n",
+"from xopt.generators.bayesian import ExpectedImprovementGenerator\n",
 "from xopt import Xopt\n",
 "from xopt.vocs import VOCS\n",
 "import math\n",
@@ -104,7 +104,7 @@
 "source": [
 "## Create Xopt objects\n",
 "Create the evaluator to evaluate our test function and create a generator that uses\n",
-"the Upper Confidence Bound acquisition function to perform Bayesian Optimization. Note that because we are optimizing a problem with no noise we set `use_low_noise_prior=True` in the GP model constructor."
+"the Expected Improvement acquisition function to perform Bayesian Optimization."
 ]
 },
 {
@@ -125,8 +125,9 @@
 "outputs": [],
 "source": [
 "evaluator = Evaluator(function=sin_function)\n",
-"generator = UpperConfidenceBoundGenerator(vocs=vocs, turbo_controller=\"optimize\")\n",
+"generator = ExpectedImprovementGenerator(vocs=vocs, turbo_controller=\"optimize\")\n",
 "generator.gp_constructor.use_low_noise_prior = True\n",
+"\n",
 "X = Xopt(evaluator=evaluator, generator=generator, vocs=vocs)"
 ]
 },
@@ -295,7 +296,7 @@
 "    ax[0].plot(test_x, true_f, \"--\", label=\"Ground truth\")\n",
 "\n",
 "    # plot acquisition function\n",
-"    ax[1].plot(test_x, acq_val.flatten())\n",
+"    ax[1].plot(test_x, acq_val.flatten().exp())\n",
 "\n",
 "    ax[0].set_ylabel(\"f\")\n",
 "    ax[0].set_ylim(-12, 10)\n",