A Theory on Adam Instability in Large-Scale Machine Learning

Bibliographic Details
Title: A Theory on Adam Instability in Large-Scale Machine Learning
Authors: Molybog, Igor; Albert, Peter; Chen, Moya; DeVito, Zachary; Esiobu, David; Goyal, Naman; Koura, Punit Singh; Narang, Sharan; Poulton, Andrew; Silva, Ruan; Tang, Binh; Liskovich, Diana; Xu, Puxin; Zhang, Yuchen; Kambadur, Melanie; Roller, Stephen; Zhang, Susan
Publication Year: 2023
Collection: Computer Science; Mathematics
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Mathematics - Optimization and Control
Description: We present a theory for the previously unexplained divergent behavior observed in the training of large language models. We argue that the phenomenon is an artifact of the dominant optimization algorithm used for training, Adam. We observe that Adam can enter a state in which the parameter update vector has a relatively large norm and is essentially uncorrelated with the direction of descent on the training loss landscape, leading to divergence. This artifact is more likely to appear when training a deep model with a large batch size, the typical setting of large-scale language model training. To support the theory, we present observations from training runs of language models at several scales: 7 billion, 30 billion, 65 billion, and 546 billion parameters. (An illustrative sketch of the described update behavior follows this record.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2304.09871
Accession Number: edsarx.2304.09871
Database: arXiv
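
The state described in the abstract, a parameter update with a relatively large norm that is nearly uncorrelated with the descent direction, can be made concrete with a minimal sketch. This is not the authors' code; the dimension, learning rate, noise scale, and step count below are illustrative assumptions. A standard bias-corrected Adam update is fed a tiny true gradient buried in much larger batch noise; the resulting update keeps a per-coordinate magnitude on the order of the learning rate, while its cosine similarity with the true descent direction stays near zero.

import numpy as np

def adam_step(g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # One standard (bias-corrected) Adam step; returns the update and new state.
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return -lr * m_hat / (np.sqrt(v_hat) + eps), m, v

rng = np.random.default_rng(0)
d = 10_000                              # illustrative parameter count
true_grad = np.full(d, 1e-7)            # tiny true descent signal
m, v = np.zeros(d), np.zeros(d)

# Gradient noise dominates the true signal, a crude stand-in for the
# late-training regime the abstract describes.
for t in range(1, 201):
    g = true_grad + 1e-5 * rng.standard_normal(d)
    update, m, v = adam_step(g, m, v, t)

descent = -true_grad
cos = update @ descent / (np.linalg.norm(update) * np.linalg.norm(descent))
print(f"||update||        = {np.linalg.norm(update):.4f}")   # large relative to the gradient
print(f"||true gradient|| = {np.linalg.norm(true_grad):.2e}")
print(f"cos(update, descent direction) = {cos:.4f}")         # near zero

Under these assumptions the printed update norm is several orders of magnitude larger than the true gradient norm, and the cosine similarity is close to zero: the optimizer takes a sizable step that carries almost no information about the descent direction, which is the divergence mechanism the paper theorizes.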