Beta. Content is under active construction and has not been peer-reviewed. Report errors on GitHub.
Bayesian Optimization for Hyperparameters
3 questions · Difficulty 4-5 · Intermediate
Intermediate (4/10) · state theorem
Bayesian optimization (BO) tunes expensive functions. What is the core idea?
A. Fit a probabilistic surrogate (typically a Gaussian process) to past evaluations, then pick the next point by maximizing an acquisition function that balances exploitation and exploration
B. Randomly sample hyperparameter combinations uniformly and return the best after a fixed budget
C. Apply gradient descent directly on the validation loss with respect to the hyperparameters
D. Evaluate a large grid of hyperparameter combinations in parallel and pick the best
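Option A describes the standard BO loop: fit a surrogate to past evaluations, maximize an acquisition function, evaluate the chosen point, and repeat. A minimal sketch of that loop, assuming a toy 1-D objective as a cheap stand-in for an expensive validation loss, a NumPy Gaussian-process surrogate with an RBF kernel, and an expected-improvement acquisition (all names, kernel length-scale, and constants here are illustrative choices, not any particular library's API):

```python
import numpy as np
from math import erf

def rbf_kernel(a, b, length=0.3):
    # Squared-exponential kernel; `length` is an assumed length-scale
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # GP posterior mean/std at x_query given observations (x_obs, y_obs)
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_query)
    alpha = np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, Ks)
    mu = Ks.T @ alpha
    var = 1.0 - np.sum(Ks * v, axis=0)       # diag(Kss) = 1 for this kernel
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best, xi=0.01):
    # EI for minimization: expected gain below the best observed loss
    z = (best - mu - xi) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu - xi) * cdf + sigma * pdf

def objective(x):
    # Cheap stand-in for an expensive validation loss (illustrative only)
    return np.sin(3.0 * x) + x ** 2 - 0.7 * x

rng = np.random.default_rng(0)
x_obs = rng.uniform(-1.0, 2.0, size=3)       # a few random initial evaluations
y_obs = objective(x_obs)
grid = np.linspace(-1.0, 2.0, 200)           # candidate "hyperparameter" values

for _ in range(10):                          # the BO loop
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y_obs.min()))]
    x_obs = np.append(x_obs, x_next)         # evaluate where EI is largest
    y_obs = np.append(y_obs, objective(x_next))

print("best x:", x_obs[np.argmin(y_obs)], "best loss:", y_obs.min())
```

Note how the loop contrasts with options B and D: instead of spending the whole budget up front on random or grid points, each new evaluation is placed where the surrogate predicts either a low mean (exploitation) or high uncertainty (exploration).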