Navigating Scaling Laws: Compute Optimality in Adaptive Model Training

Bibliographic Details
Title: Navigating Scaling Laws: Compute Optimality in Adaptive Model Training
Authors: Anagnostidis, Sotiris; Bachmann, Gregor; Schlag, Imanol; Hofmann, Thomas
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Computer Vision and Pattern Recognition
Description: In recent years, the state of the art in deep learning has been dominated by very large models pre-trained on vast amounts of data. The paradigm is simple: investing more computational resources (optimally) leads to better performance, and even predictably so; neural scaling laws have been derived that accurately forecast the performance of a network for a desired level of compute. This leads to the notion of a 'compute-optimal' model, i.e. a model that allocates a given level of compute during training optimally to maximize performance. In this work, we extend the concept of optimality by allowing for an 'adaptive' model, i.e. a model that can change its shape during training. By doing so, we can design adaptive models that optimally traverse between the underlying scaling laws and outpace their 'static' counterparts, leading to a significant reduction in the compute required to reach a given target performance. We show that our approach generalizes across modalities and different shape parameters.
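The notion of a compute-optimal model from the abstract can be made concrete with a small numerical sketch. The snippet below is illustrative only and is not the paper's method: it assumes a Chinchilla-style scaling law L(N, D) = E + A/N^alpha + B/D^beta with the approximation C ≈ 6·N·D for training compute; the constants E, A, B, alpha, beta and the helper compute_optimal_N are hypothetical placeholders, chosen only to show how a fixed budget C is split between model size N and data D.

```python
# Hypothetical sketch of a 'compute-optimal' allocation under an assumed
# Chinchilla-style scaling law. Constants are illustrative, not from the paper.
import numpy as np

E, A, B = 1.69, 406.4, 410.7   # assumed irreducible loss and law coefficients
alpha, beta = 0.34, 0.28       # assumed scaling exponents

def loss(N, D):
    """Predicted loss for a model with N parameters trained on D tokens."""
    return E + A / N**alpha + B / D**beta

def compute_optimal_N(C, n_grid=10_000):
    """Sweep candidate model sizes; spend the rest of the budget on data.

    Uses the common approximation C ~ 6 * N * D, so D = C / (6 * N).
    Returns the N that minimizes the predicted loss at fixed compute C.
    """
    N = np.logspace(6, 12, n_grid)   # candidate parameter counts: 1e6 .. 1e12
    D = C / (6 * N)                  # tokens affordable at each model size
    return N[np.argmin(loss(N, D))]

for C in [1e20, 1e21, 1e22]:
    print(f"C = {C:.0e} FLOPs: compute-optimal N ~ {compute_optimal_N(C):.2e} params")
```

A 'static' model fixes N up front and rides a single such curve; the adaptive models described in the abstract instead change shape during training, moving between scaling curves so the same target loss is reached with less total compute.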
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2311.03233
Accession Number: edsarx.2311.03233
Database: arXiv