Report
How Free is Parameter-Free Stochastic Optimization?
Title: | How Free is Parameter-Free Stochastic Optimization? |
---|---|
Authors: | Attia, Amit; Koren, Tomer |
Publication Year: | 2024 |
Collection: | Computer Science; Mathematics; Statistics |
Subject Terms: | Computer Science - Machine Learning, Mathematics - Optimization and Control, Statistics - Machine Learning |
Description: | We study the problem of parameter-free stochastic optimization, asking whether, and under what conditions, fully parameter-free methods exist: methods that achieve convergence rates competitive with optimally tuned methods without requiring significant knowledge of the true problem parameters. Existing parameter-free methods can only be considered "partially" parameter-free, as they require some non-trivial knowledge of the true problem parameters, such as a bound on the stochastic gradient norms or a bound on the distance to a minimizer. In the non-convex setting, we demonstrate that a simple hyperparameter search technique yields a fully parameter-free method that outperforms more sophisticated state-of-the-art algorithms. We also provide a similar result in the convex setting with access to noisy function values under mild noise assumptions. Finally, assuming only access to stochastic gradients, we establish a lower bound that renders fully parameter-free stochastic convex optimization infeasible, and provide a method which is (partially) parameter-free up to the limit indicated by our lower bound. Comment: 28 pages |
Document Type: | Working Paper |
Access URL: | http://arxiv.org/abs/2402.03126 |
Accession Number: | edsarx.2402.03126 |
Database: | arXiv |
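The abstract's central idea of a "simple hyperparameter search technique" can be illustrated by a generic sketch: run SGD once for each step size on a geometric grid and keep the candidate with the smallest observed loss, so no tuned step size is needed up front. This is only a minimal illustration of that generic pattern, not the paper's actual algorithm or analysis; the function names, the grid range, and the noisy quadratic objective in the usage note are all invented for demonstration.

```python
import numpy as np

def sgd(grad_fn, x0, eta, n_steps, rng):
    # Plain SGD with a fixed step size eta.
    # grad_fn(x, rng) is assumed to return a stochastic gradient at x.
    x = x0.copy()
    for _ in range(n_steps):
        x -= eta * grad_fn(x, rng)
    return x

def grid_search_sgd(grad_fn, loss_fn, x0, n_steps, rng, grid_size=8):
    # Try a geometric grid of step sizes and keep the iterate with the
    # smallest evaluated loss -- the "simple hyperparameter search" pattern.
    best_x, best_loss = x0, loss_fn(x0)
    for eta in np.geomspace(1e-4, 1.0, grid_size):
        x = sgd(grad_fn, x0, eta, n_steps, rng)
        loss = loss_fn(x)
        if loss < best_loss:
            best_x, best_loss = x, loss
    return best_x, best_loss
```

As a usage example, on a noisy quadratic (`loss_fn(x) = 0.5 * ||x||^2`, stochastic gradient `x + noise`) the search selects a step size that drives the loss down to the noise floor without any step size being supplied by the caller. The paper's lower bound shows the limits of this kind of tuning-free approach in the stochastic-gradient-only convex setting; this sketch makes no claim about those regimes.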