Investigator: Jonathan Scarlett.
Black-box function optimization via expensive, noisy samples has diverse applications in problems such as hyperparameter tuning, robotics, and molecular design. Various algorithms have previously been proposed with rigorous performance bounds guaranteeing a near-optimal solution. We provided the first algorithm-independent lower bounds on performance, using tools from information theory. These lower bounds certify how close existing algorithms are to optimal and identify the cases in which the greatest improvements remain possible.
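To make the setting concrete, the following is a minimal sketch of black-box optimization from expensive, noisy samples, using a standard GP-UCB-style Bayesian optimization loop with a scikit-learn Gaussian process surrogate. The objective function, noise level, candidate grid, and exploration schedule are illustrative assumptions; this is not the algorithm or the lower-bound analysis from the cited paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f(x):
    """Hidden one-dimensional objective (unknown to the optimizer)."""
    return np.sin(3.0 * x) + 0.5 * np.cos(5.0 * x)

def noisy_sample(x, noise_std=0.1):
    """Expensive, noisy black-box query of f."""
    return f(x) + noise_std * rng.normal()

# Candidate points on [0, 1]; in practice the acquisition function
# would be maximized by a continuous optimizer rather than a grid.
grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)

# Start from a few random queries.
X = rng.uniform(0.0, 1.0, size=(3, 1))
y = np.array([noisy_sample(x[0]) for x in X])

for t in range(20):
    # Fit a GP surrogate to all noisy observations so far.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=0.1**2)
    gp.fit(X, y)
    mean, std = gp.predict(grid, return_std=True)

    # Upper confidence bound acquisition: favor points that are either
    # predicted to be good (high mean) or still uncertain (high std).
    beta = 2.0 * np.log((t + 1) ** 2)  # illustrative exploration schedule
    ucb = mean + np.sqrt(beta) * std
    x_next = grid[np.argmax(ucb)]

    # Pay for one more expensive, noisy sample and update the data.
    X = np.vstack([X, x_next])
    y = np.append(y, noisy_sample(x_next[0]))

best = X[np.argmax(y)]
print(f"best query so far: x = {best[0]:.3f}, noisy value = {y.max():.3f}")
```

Algorithm-independent lower bounds of the kind described above apply to any strategy for choosing the query points, including this one, and so quantify how much room for improvement any such loop can possibly have.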
Reference: Jonathan Scarlett, “Tight Regret Bounds for Bayesian Optimization in One Dimension,” International Conference on Machine Learning (ICML), 2018.