AB-Training: A Communication-Efficient Approach for Distributed Low-Rank Learning

Bibliographic Details
Title: AB-Training: A Communication-Efficient Approach for Distributed Low-Rank Learning
Authors: Coquelin, Daniel; Flügel, Katherina; Weiel, Marie; Kiefer, Nicholas; Öz, Muhammed; Debus, Charlotte; Streit, Achim; Götz, Markus
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Computer Science - Distributed, Parallel, and Cluster Computing
Description: Communication bottlenecks severely hinder the scalability of distributed neural network training, particularly in high-performance computing (HPC) environments. We introduce AB-training, a novel data-parallel method that leverages low-rank representations and independent training groups to significantly reduce communication overhead. Our experiments demonstrate an average reduction in network traffic of approximately 70.31% across various scaling scenarios, increasing the training potential of communication-constrained systems and accelerating convergence at scale. AB-training also exhibits a pronounced regularization effect at smaller scales, leading to improved generalization while maintaining or even reducing training time. We achieve a remarkable 44.14:1 compression ratio on VGG16 trained on CIFAR-10 with minimal accuracy loss, and outperform traditional data parallel training by 1.55% on ResNet-50 trained on ImageNet-2012. While AB-training is promising, our findings also reveal that large batch effects persist even in low-rank regimes, underscoring the need for further research into optimized update mechanisms for massively distributed training.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.01067
Accession Number: edsarx.2405.01067
Database: arXiv
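
The abstract describes the method only at a high level: weights are held in low-rank form and independent training groups synchronize less data than full data parallelism. The sketch below is a minimal illustration of that idea under stated assumptions; it is not the paper's implementation. The class and function names (LowRankLinear, average_factors), the choice of rank, the plain-SGD update, and the synchronization interval are all hypothetical; only the W ≈ A·B factorization and the periodic averaging of the small factors across groups are taken from the abstract's description.

```python
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """Linear layer whose weight is stored as a rank-r product W ≈ A @ B.

    Illustrative only: the attribute names and initialization are assumptions,
    not the paper's actual parameterization.
    """

    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        # A: (out_features x rank), B: (rank x in_features)
        self.A = nn.Parameter(torch.randn(out_features, rank) * 0.02)
        self.B = nn.Parameter(torch.randn(rank, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Equivalent to x @ (A @ B).T + bias, but never materializes the full W.
        return (x @ self.B.t()) @ self.A.t() + self.bias


@torch.no_grad()
def average_factors(group_models: list[nn.Module]) -> None:
    """Average only the low-rank factors across independent training groups.

    Synchronizing A and B moves O(r * (m + n)) values per layer instead of the
    O(m * n) of a full-rank weight, which is where the communication savings
    described in the abstract come from.
    """
    reference = group_models[0]
    for name, _ in reference.named_parameters():
        stacked = torch.stack(
            [dict(m.named_parameters())[name] for m in group_models]
        )
        mean = stacked.mean(dim=0)
        for m in group_models:
            dict(m.named_parameters())[name].copy_(mean)


if __name__ == "__main__":
    # Two "groups" train independently on local mini-batches, then sync factors.
    groups = [LowRankLinear(512, 256, rank=16) for _ in range(2)]
    for step in range(10):
        for model in groups:
            x = torch.randn(32, 512)        # stand-in for a local data shard
            loss = model(x).pow(2).mean()   # dummy objective for illustration
            loss.backward()
            for p in model.parameters():    # plain SGD step (assumption)
                p.data -= 0.01 * p.grad
                p.grad = None
        if (step + 1) % 5 == 0:             # hypothetical sync interval
            average_factors(groups)
```

In a real multi-node setting the averaging step would go through a collective such as an all-reduce over the factor tensors rather than the in-process loop shown here; the point of the sketch is only that the synchronized payload is the small factors A and B, not the full weight matrices.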