Accelerating Look-ahead in Bayesian Optimization: Multilevel Monte Carlo is All you Need

Bibliographic Details
Title: Accelerating Look-ahead in Bayesian Optimization: Multilevel Monte Carlo is All you Need
Authors: Yang, Shangda; Zankin, Vitaly; Balandat, Maximilian; Scherer, Stefan; Carlberg, Kevin; Walton, Neil; Law, Kody J. H.
Publication Year: 2024
Collection: Computer Science; Mathematics; Statistics
Subject Terms: Statistics - Machine Learning, Computer Science - Machine Learning, Mathematics - Optimization and Control, Mathematics - Probability, Statistics - Computation, Statistics - Methodology
Description: We leverage multilevel Monte Carlo (MLMC) to improve the performance of multi-step look-ahead Bayesian optimization (BO) methods that involve nested expectations and maximizations. Often these expectations must be computed by Monte Carlo (MC). The complexity rate of naive MC degrades for nested operations, whereas MLMC is capable of achieving the canonical MC convergence rate for this type of problem, independently of dimension and without any smoothness assumptions. Our theoretical study focuses on the approximation improvements for two- and three-step look-ahead acquisition functions, but, as we discuss, the approach is generalizable in various ways, including beyond the context of BO. Our findings are verified numerically and the benefits of MLMC for BO are illustrated on several benchmark examples. Code is available at https://github.com/Shangda-Yang/MLMCBO.
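
As a rough illustration of the MLMC idea the description refers to, the sketch below estimates a toy nested expectation E_Y[g(E_Z[f(Y, Z)])] with the standard telescoping sum over levels. The integrand, the antithetic-style coupling of coarse and fine inner estimators, and the per-level sample counts are illustrative assumptions, not the MLMCBO implementation from the linked repository.

# Minimal MLMC sketch for a nested expectation E_Y[ g( E_Z[ f(Y, Z) ] ) ].
# Toy problem (an assumption for illustration): f(y, z) = y + z, g(u) = max(u, 0),
# Y and Z standard normal, so the exact value is E[max(Y, 0)] = 1/sqrt(2*pi) ~ 0.399.
import numpy as np

def inner_mean(y, z):
    # Plain MC estimate of the inner expectation E_Z[f(y, Z)].
    return np.mean(y + z)

def g(u):
    # Nonlinear outer function; the nonlinearity is what makes the expectation nested.
    return max(u, 0.0)

def level_correction(level, n_outer, rng, n0=2):
    # MC estimate of E[P_l - P_{l-1}], where P_l uses n0 * 2**level inner samples.
    # Coarse and fine estimators are coupled by reusing the fine inner samples,
    # averaging g over the two halves (an antithetic-style coupling).
    n_inner = n0 * 2 ** level
    diffs = np.empty(n_outer)
    for i in range(n_outer):
        y = rng.standard_normal()           # outer sample Y
        z = rng.standard_normal(n_inner)    # inner samples Z, shared across levels
        fine = g(inner_mean(y, z))
        if level == 0:
            diffs[i] = fine
        else:
            half = n_inner // 2
            coarse = 0.5 * (g(inner_mean(y, z[:half])) + g(inner_mean(y, z[half:])))
            diffs[i] = fine - coarse
    return diffs.mean()

def mlmc_estimate(n_outer_per_level, rng):
    # Telescoping sum: E[P_0] + sum_{l>=1} E[P_l - P_{l-1}].
    return sum(level_correction(l, n, rng) for l, n in enumerate(n_outer_per_level))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # More outer samples on the cheap coarse levels, fewer on the expensive fine levels.
    print(mlmc_estimate([4000, 2000, 1000, 500, 250], rng))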
Comment: Preprint, ICML 2024
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.02111
Accession Number: edsarx.2402.02111
Database: arXiv