Compressible Dynamics in Deep Overparameterized Low-Rank Learning & Adaptation

Bibliographic Details
Title: Compressible Dynamics in Deep Overparameterized Low-Rank Learning & Adaptation
Authors: Yaras, Can; Wang, Peng; Balzano, Laura; Qu, Qing
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Electrical Engineering and Systems Science - Signal Processing, Statistics - Machine Learning
Description: While overparameterization in machine learning models offers great benefits in terms of optimization and generalization, it also leads to increased computational requirements as model sizes grow. In this work, we show that by leveraging the inherent low-dimensional structures of data and compressible dynamics within the model parameters, we can reap the benefits of overparameterization without the computational burdens. In practice, we demonstrate the effectiveness of this approach for deep low-rank matrix completion as well as fine-tuning language models. Our approach is grounded in theoretical findings for deep overparameterized low-rank matrix recovery, where we show that the learning dynamics of each weight matrix are confined to an invariant low-dimensional subspace. Consequently, we can construct and train compact, highly compressed factorizations possessing the same benefits as their overparameterized counterparts. In the context of deep matrix completion, our technique substantially improves training efficiency while retaining the advantages of overparameterization. For language model fine-tuning, we propose a method called "Deep LoRA", which improves the existing low-rank adaptation (LoRA) technique, leading to reduced overfitting and a simplified hyperparameter setup, while maintaining comparable efficiency. We validate the effectiveness of Deep LoRA on natural language tasks, particularly when fine-tuning with limited data. Our code is available at https://github.com/cjyaras/deep-lora-transformers.
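
To make the "Deep LoRA" idea from the description concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released JAX implementation at the repository above). It assumes the adapter update is a depth-3 low-rank factorization W3 W2 W1 added to a frozen pretrained weight, in contrast to LoRA's two-factor update BA; the class name `DeepLoRALinear`, the initialization scale, and the dimensions are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

class DeepLoRALinear(nn.Module):
    """Sketch of a Deep LoRA-style adapter: the frozen base weight is
    perturbed by a depth-3 factorization W3 @ W2 @ W1 (rank r), rather
    than LoRA's two-factor B @ A. Illustrative, not the paper's code."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights
        d_out, d_in = base.weight.shape
        # Three trainable factors; small random init keeps the update
        # near zero at the start of fine-tuning (assumed, not specified).
        self.W1 = nn.Parameter(torch.randn(rank, d_in) * 1e-3)
        self.W2 = nn.Parameter(torch.randn(rank, rank) * 1e-3)
        self.W3 = nn.Parameter(torch.randn(d_out, rank) * 1e-3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Depth-3 overparameterized low-rank update, shape (d_out, d_in).
        delta = self.W3 @ self.W2 @ self.W1
        return self.base(x) + x @ delta.T

# Usage: wrap a pretrained linear layer and fine-tune only the factors.
layer = DeepLoRALinear(nn.Linear(768, 768), rank=8)
out = layer(torch.randn(2, 768))  # (batch, d_out)
```

The deeper factorization keeps the number of trainable parameters comparable to LoRA at the same rank while, per the paper's analysis, the extra depth induces compressible, low-dimensional learning dynamics.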
Comment: Accepted at ICML'24 (Oral)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2406.04112
Accession Number: edsarx.2406.04112
Database: arXiv