Adaptive Gradient Methods at the Edge of Stability

Bibliographic Details
Title: Adaptive Gradient Methods at the Edge of Stability
Authors: Cohen, Jeremy M., Ghorbani, Behrooz, Krishnan, Shankar, Agarwal, Naman, Medapati, Sourabh, Badura, Michal, Suo, Daniel, Cardoze, David, Nado, Zachary, Dahl, George E., Gilmer, Justin
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: Very little is known about the training dynamics of adaptive gradient methods like Adam in deep learning. In this paper, we shed light on the behavior of these algorithms in the full-batch and sufficiently large batch settings. Specifically, we empirically demonstrate that during full-batch training, the maximum eigenvalue of the preconditioned Hessian typically equilibrates at a certain numerical value -- the stability threshold of a gradient descent algorithm. For Adam with step size $\eta$ and $\beta_1 = 0.9$, this stability threshold is $38/\eta$. Similar effects occur during minibatch training, especially as the batch size grows. Yet, even though adaptive methods train at the "Adaptive Edge of Stability" (AEoS), their behavior in this regime differs in a significant way from that of non-adaptive methods at the EoS. Whereas non-adaptive algorithms at the EoS are blocked from entering high-curvature regions of the loss landscape, adaptive gradient methods at the AEoS can keep advancing into high-curvature regions, while adapting the preconditioner to compensate. Our findings can serve as a foundation for the community's future understanding of adaptive gradient methods in deep learning. (A short sketch of where the $38/\eta$ threshold comes from follows this record.)
Comment: v2 corrects the formula for Adam's preconditioner in Eq 2
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2207.14484
Accession Number: edsarx.2207.14484
Database: arXiv
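
Note on the $38/\eta$ threshold quoted in the description: the following is a minimal stability sketch, assuming Adam's preconditioner is frozen, ignoring bias correction (which vanishes at large $t$), and modeling the first-moment update as an exponential moving average with coefficient $\beta_1$ along a quadratic direction whose curvature under the preconditioned Hessian is $\lambda$; the symbols $x_t$, $m_t$, and $\lambda$ are introduced here for illustration and are not necessarily the paper's notation.

$$ m_t = \beta_1 m_{t-1} + (1-\beta_1)\,\lambda\, x_t, \qquad x_{t+1} = x_t - \eta\, m_t. $$

This linear recurrence has characteristic polynomial $z^2 - \big(1 + \beta_1 - \eta(1-\beta_1)\lambda\big)z + \beta_1$, and a root reaches $z = -1$ exactly when $\eta(1-\beta_1)\lambda = 2 + 2\beta_1$. The iteration is therefore stable only while

$$ \lambda \;\le\; \frac{2 + 2\beta_1}{(1-\beta_1)\,\eta}, $$

which for $\beta_1 = 0.9$ evaluates to $3.8/(0.1\,\eta) = 38/\eta$, consistent with the stability threshold stated in the abstract.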