Bayesian Algorithm Execution for Multi-Objective Optimization with Expensive Simulations
BAX uses neural network surrogate models to efficiently find Pareto-optimal solutions when simulations are expensive. You provide three functions (oracles, objectives, and an acquisition algorithm), and BAX handles surrogate training, acquisition, and iterative optimization.
Install with uv:

```bash
# Install dependencies
pip install uv
cd DAMA-BAX
uv sync

# Activate virtual environment
source .venv/bin/activate   # Linux/Mac
# .venv\Scripts\activate    # Windows

# Verify installation
python verify.py
```

Or install with pip:

```bash
cd DAMA-BAX

# Create virtual environment
python -m venv .venv

# Activate it
source .venv/bin/activate   # Linux/Mac
# .venv\Scripts\activate    # Windows

# Install
pip install -e .

# Verify
python verify.py
```

```bash
# Run the simplest example (takes ~30 seconds)
python run.py --case examples/synthetic_simple --max-iter 5
```

You should see:
- Initial data generation
- Neural network training progress (net_0 and net_1)
- BAX iterations
- Models trained during the run saved to `./models_simple/`
Here's all you need to use BAX (from `examples/synthetic_simple/run.py`):
```python
from bax_core import run_bax_optimization
import numpy as np

# 1. Define oracle functions (your expensive simulations)
def oracle_obj1(X):
    """Sphere function, normalized to [0, 1] for sigmoid output."""
    Y = np.sum(X**2, axis=1)
    return (Y / 2.0).reshape(-1, 1)  # MUST return (n, 1) shape!

def oracle_obj2(X):
    """Rosenbrock function, normalized to [0, 1] for sigmoid output."""
    x1, x2 = X[:, 0], X[:, 1]
    Y = (1 - x1)**2 + 100 * (x2 - x1**2)**2
    return (Y / 100.0).reshape(-1, 1)  # MUST return (n, 1) shape!

# 2. Define objective functions (convert predictions → objectives)
def objective_obj1(x, fn_model):
    return fn_model(x)  # Already (n, 1), just return it!

def objective_obj2(x, fn_model):
    return fn_model(x)  # Already (n, 1), just return it!

# 3. Define algorithm (acquisition strategy)
def algo(fn_model_list):
    # Random sampling (simple but effective!)
    candidates = np.random.rand(50, 2)
    return candidates, candidates

# That's it! Run optimization:
opt, results = run_bax_optimization(
    oracles=[oracle_obj1, oracle_obj2],
    objectives=[objective_obj1, objective_obj2],
    algorithm=algo,    # No wrapper needed!
    n_init=50,         # Initial samples (automatic)
    max_iterations=100,
)
```

Important Notes
- Your case directory must contain a file named `run.py` with a `get_bax_config(args)` function if you'd like to use the unified launcher
- Oracle functions MUST return shape `(n, 1)`, NOT `(n,)`
- The prediction functions (`fn_model`) are guaranteed to return shape `(n, 1)` as well
- Neural networks use a sigmoid output activation by default, constraining predictions to `[0, 1]`. Make sure your oracle functions return values in this range (normalize if needed; see the sketch below), or modify `core/da_NN.py` to add an output layer without a sigmoid
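As a concrete illustration of the shape and range constraints, here is a minimal sketch of a compliant oracle; the quadratic stand-in for the simulation and the `Y_MAX` constant are placeholders, not part of the framework:

```python
import numpy as np

Y_MAX = 50.0  # assumed upper bound of the raw objective; choose one for your problem

def oracle_example(X):
    """X has shape (n, d); the return value must lie in [0, 1] and have shape (n, 1)."""
    Y = np.sum(X**2, axis=1)          # stand-in for an expensive simulation, shape (n,)
    Y = np.clip(Y / Y_MAX, 0.0, 1.0)  # normalize into [0, 1] to match the sigmoid output
    return Y.reshape(-1, 1)           # reshape (n,) -> (n, 1)
```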
| Example | Complexity | What it demonstrates | Run command |
|---|---|---|---|
| `synthetic_simple` | Starter | Basic 3-function API, random sampling | `python run.py --case examples/synthetic_simple` |
| `synthetic` | Intermediate | Grid expansion, custom initialization | `python run.py --case examples/synthetic --max-iter 5` |
| `dama` | Advanced | Particle accelerator optimization, NSGA2 + boundary sampling | `python run.py --case examples/dama --run-id 3 --max-iter 100` |
```bash
# Simple: Direct evaluation
python run.py --case examples/synthetic_simple

# Grid expansion pattern
python run.py --case examples/synthetic --max-iter 5

# Full application (requires pretrained models in examples/dama/resources/)
python run.py --case examples/dama --run-id 3 --max-iter 100

# Custom parameters
python run.py --case examples/synthetic \
    --max-iter 10 \
    --n-sampling 20 \
    --nn-neurons 400 \
    --seed 42
```

```bash
mkdir my_optimization
cd my_optimization
```

REQUIRED: The case file must be named `run.py`:
```python
import numpy as np
from bax_core import run_bax_optimization

def oracle_obj1(X):
    # Your expensive simulation here
    Y = your_simulation_1(X)
    # IMPORTANT: Normalize to [0, 1] for sigmoid output!
    Y = Y / max_expected_value_1
    # CRITICAL: Must return (n, 1) shape!
    return Y.reshape(-1, 1)

def oracle_obj2(X):
    # Another expensive simulation
    Y = your_simulation_2(X)
    Y = Y / max_expected_value_2  # Normalize!
    return Y.reshape(-1, 1)       # CRITICAL: (n, 1) shape!

def objective_obj1(x, fn_model):
    predictions = fn_model(x)
    return your_metric_calculation(predictions)

def objective_obj2(x, fn_model):
    predictions = fn_model(x)
    return your_other_metric(predictions)

def algo(fn_model_list):
    # Your acquisition strategy (GA, Bayesian opt, random, etc.)
    candidates = your_optimization_method(fn_model_list)
    return candidates, candidates

def get_bax_config(args):
    """Entry point for unified runner."""
    return {
        'oracles': [oracle_obj1, oracle_obj2],
        'objectives': [objective_obj1, objective_obj2],
        'algorithm': algo,
        'model_root': f'./models_run_{args.run_id}/' if args.run_id else './models/',
    }
```

```bash
python /path/to/DAMA-BAX/run.py --case ./my_optimization --max-iter 100
```

See `examples/synthetic_simple/run.py` for a complete minimal template.
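The `algo` above is a placeholder. As one hedged sketch of a model-based acquisition strategy (not the method used in the bundled examples), you could score random candidates with the surrogate predictors and keep the non-dominated set, assuming both objectives are minimized and a 2-dimensional input space:

```python
import numpy as np

def algo_nondominated(fn_model_list, n_candidates=500, dim=2):
    """Score random candidates with the surrogates and keep the non-dominated (Pareto) set."""
    X = np.random.rand(n_candidates, dim)
    # Each fn_model returns shape (n_candidates, 1); stack into (n_candidates, n_objectives)
    F = np.hstack([fn_model(X) for fn_model in fn_model_list])
    keep = np.ones(n_candidates, dtype=bool)
    for i in range(n_candidates):
        # Drop point i if some other point is <= in every objective and < in at least one
        better_or_equal = np.all(F <= F[i], axis=1)
        strictly_better = np.any(F < F[i], axis=1)
        if np.any(better_or_equal & strictly_better):
            keep[i] = False
    X_pareto = X[keep]
    # The bundled examples return the same array twice; this sketch follows that convention
    return X_pareto, X_pareto
```

Because only the surrogate predictions (`fn_model_list`) are evaluated here, the acquisition step stays cheap; the expensive oracles are only called on the points the framework selects.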
```bash
python run.py --case <directory> [options]
```

Core options:

```
--run-id N                  Run identifier for model/data directories
--max-iter N                Maximum BAX iterations (default: 100)
--n-sampling N              Points sampled per iteration (default: 50)
--n-init N                  Initial training samples (default: 100)
--device {auto,cuda,cpu}    Compute device (default: auto)
--seed N                    Random seed
```

Neural network:

```
--nn-neurons N              Network width (default: 800)
--nn-lr FLOAT               Learning rate (default: 1e-4)
--nn-epochs N               Initial training epochs (default: 150)
--nn-iter-epochs N          Per-iteration epochs (default: 10)
--nn-batch-size N           Batch size (default: 1000)
```

Training:

```
--test-ratio FLOAT          Test set ratio (default: 0.05)
--weight-new FLOAT          Weight for new data points (default: 10)
--snapshot / --no-snapshot  Save models each iteration
```

For full control over initialization, normalization, and configuration, use the `BAXOpt` class directly:
```python
from bax_core import BAXOpt
import da_NN as dann

# Manual initialization
X_init = generate_initial_samples(1000)
Y0_init = oracle_obj1(X_init)
Y1_init = oracle_obj2(X_init)

# Manual normalization
X_mu, X_std = dann.get_norm(X_init)
norm = lambda X: dann.normalize(X.copy(), X_mu, X_std)

# Create optimizer
opt = BAXOpt(
    algo=make_algo(),
    fn_oracle=[oracle_obj1, oracle_obj2],
    norm=[norm, norm],
    init=[lambda: (X_init, Y0_init), lambda: (X_init, Y1_init)],
    device='cuda',
)

# Configure
opt.n_sampling = 50
opt.n_neur = 800
opt.epochs = 150

# Run
opt.run_acquisition(max_iterations=100)
```

See `examples/synthetic/run.py` for a complete example with grid expansion.
- Framework Guide - Complete guide with patterns and best practices
- API Reference - Quick reference for both the simple (`run_bax_optimization`) and advanced (`BAXOpt`) APIs
- Examples - Detailed examples documentation
- DAMA Example - Advanced full-featured example
- Contributing - Development guidelines
For troubleshooting, SLURM usage, and advanced topics, see the Framework Guide.