This is the current "Get Started" example on botorch.org:
```python
# Fit a model
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.utils import standardize
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(10, 2)
Y = 1 - torch.linalg.norm(train_X - 0.5, dim=-1, keepdim=True)
Y = Y + 0.1 * torch.randn_like(Y)  # add some noise
train_Y = standardize(Y)

gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)

# Construct an acquisition function
from botorch.acquisition import UpperConfidenceBound

UCB = UpperConfidenceBound(gp, beta=0.1)

# Optimize the acquisition function
from botorch.optim import optimize_acqf

bounds = torch.stack([torch.zeros(2), torch.ones(2)])
candidate, acq_value = optimize_acqf(
    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
candidate  # tensor([0.4887, 0.5063])
```
This code doesn't follow best practices in a few ways:
- It uses `standardize` rather than a transform, so predictions will be in the transformed space, which can be hard to interpret. The `Standardize` outcome transform would be better.
- It doesn't illustrate use of the `Normalize` input transform. I appreciate the concision of this example, but I think it's oversimplified; a user who runs this code on different data will get a warning telling them to normalize and not know how to do it.
- The data and bounds are in single precision, which will also generate a warning.
- We're more likely to recommend an acquisition function in the LogEI family than UCB.
Ideally, all references to `standardize` would be removed and there would be an audit of tutorials and other documentation for adherence to current best practices, but fixing the landing page would be a great improvement.
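
For concreteness, here is a rough sketch of what a revised example could look like. This is illustrative rather than a proposed final version; it assumes the `Normalize`/`Standardize` transforms and the `qLogExpectedImprovement` acquisition function available in recent BoTorch releases:

```python
# Sketch of a revised "Get Started" example (illustrative only).
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize, Standardize
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import qLogExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Generate training data in double precision to avoid the dtype warning.
train_X = torch.rand(10, 2, dtype=torch.double)
Y = 1 - torch.linalg.norm(train_X - 0.5, dim=-1, keepdim=True)
train_Y = Y + 0.1 * torch.randn_like(Y)  # add some noise

# Let the model handle scaling via transforms instead of calling standardize().
gp = SingleTaskGP(
    train_X,
    train_Y,
    input_transform=Normalize(d=2),
    outcome_transform=Standardize(m=1),
)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)

# Use a LogEI-family acquisition function instead of UCB.
logEI = qLogExpectedImprovement(model=gp, best_f=train_Y.max())

# Optimize the acquisition function over the unit square.
bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
candidate, acq_value = optimize_acqf(
    logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
```

Since the outcome transform untransforms the model's predictions, `best_f` can be taken directly from the raw `train_Y` and everything downstream stays in the original units.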