Revisiting LARS for Large Batch Training Generalization of Neural Networks

Bibliographic Details
Title: Revisiting LARS for Large Batch Training Generalization of Neural Networks
Authors: Do, Khoi; Nguyen, Duong; Nguyen, Hoa; Tran-Thanh, Long; Tran, Nguyen-Hoang; Pham, Quoc-Viet
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence
Description: This paper explores Large Batch Training techniques using layer-wise adaptive rate scaling (LARS) across diverse settings, uncovering new insights. LARS algorithms with warm-up tend to be trapped in sharp minimizers early on due to redundant ratio scaling. Additionally, a fixed steep decline in the latter phase restricts deep neural networks from effectively navigating early-phase sharp minimizers. Building on these findings, we propose Time Varying LARS (TVLARS), a novel algorithm that replaces warm-up with a configurable sigmoid-like function for robust training in the initial phase. TVLARS promotes gradient exploration early on, surpassing sharp minimizers and gradually transitioning to LARS for robustness in later phases. Extensive experiments demonstrate that TVLARS consistently outperforms LARS and LAMB in most cases, with up to 2% improvement in classification scenarios. Notably, in all self-supervised learning cases, TVLARS dominates LARS and LAMB with performance improvements of up to 10%.
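To make the layer-wise scaling and the sigmoid-like replacement for warm-up concrete, here is a minimal PyTorch-style sketch of the idea described in the abstract. It is not the authors' implementation: the blending rule and the parameter names `delay` and `sharpness` are assumptions used only for illustration.

```python
import math
import torch

def lars_local_lr(param: torch.Tensor, grad: torch.Tensor, base_lr: float,
                  weight_decay: float = 1e-4, eps: float = 1e-9) -> float:
    """Standard LARS layer-wise scaling: the per-layer learning rate is
    base_lr * ||w|| / (||g|| + weight_decay * ||w||)."""
    w_norm = param.norm().item()
    g_norm = grad.norm().item() + weight_decay * w_norm
    return base_lr * w_norm / (g_norm + eps)

def tvlars_local_lr(param: torch.Tensor, grad: torch.Tensor, base_lr: float,
                    step: int, delay: float = 500.0, sharpness: float = 0.01,
                    weight_decay: float = 1e-4, eps: float = 1e-9) -> float:
    """Illustrative time-varying variant: a sigmoid-like gate keeps the
    effective step close to base_lr early on (gradient exploration past
    sharp minimizers) and decays toward the plain LARS-scaled rate later.
    `delay` and `sharpness` are hypothetical knobs, not the paper's names."""
    gate = 1.0 / (1.0 + math.exp(sharpness * (step - delay)))  # ~1 early, -> 0 later
    lars_lr = lars_local_lr(param, grad, base_lr, weight_decay, eps)
    return gate * base_lr + (1.0 - gate) * lars_lr
```

In a full optimizer, this per-layer rate would multiply the (momentum-corrected) gradient of each layer before the weight update, replacing the warm-up schedule that plain LARS relies on.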
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2309.14053
Accession Number: edsarx.2309.14053
Database: arXiv