Correlated Quantization for Faster Nonconvex Distributed Optimization

Bibliographic details
Title: Correlated Quantization for Faster Nonconvex Distributed Optimization
Authors: Panferov, Andrei, Demidovich, Yury, Rammal, Ahmad, Richtárik, Peter
Publication year: 2024
Collection: Computer Science, Mathematics
Subject terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Distributed, Parallel, and Cluster Computing, Mathematics - Optimization and Control
Description: Quantization (Alistarh et al., 2017) is an important (stochastic) compression technique that reduces the number of bits transmitted during each communication round of distributed model training. Suresh et al. (2022) introduce correlated quantizers and demonstrate their advantages over independent counterparts by analyzing the communication complexity of distributed SGD. We analyze MARINA (Gorbunov et al., 2022), a state-of-the-art distributed nonconvex optimization algorithm, equipped with the proposed correlated quantizers, and show that it outperforms both the original MARINA and the distributed SGD of Suresh et al. (2022) in terms of communication complexity. We significantly refine the original analysis of MARINA, without any additional assumptions, using the weighted Hessian variance (Tyurin et al., 2022), and we then extend the theoretical framework of MARINA to accommodate a substantially broader class of potentially correlated and biased compressors, widening the applicability of the method beyond the conventional setup of independent unbiased compressors. Extensive experimental results corroborate our theoretical findings.
Document type: Working Paper
Access URL: http://arxiv.org/abs/2401.05518
Accession number: edsarx.2401.05518
Database: arXiv
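The description above refers to stochastic quantization in the sense of Alistarh et al. (2017). As a rough illustration of the idea, not of this paper's correlated quantizers, the following is a minimal sketch of a QSGD-style unbiased quantizer with `s` uniform levels; the function name and the level count are illustrative choices, and randomized rounding is what makes the compressor unbiased (E[Q(x)] = x):

```python
import numpy as np

def stochastic_quantize(x, s=4, rng=None):
    """Sketch of an unbiased stochastic quantizer (QSGD-style).

    Each coordinate |x_i|/||x|| is scaled to [0, s] and rounded
    randomly up or down so that the rounding is unbiased, i.e.
    E[Q(x)] = x. Only the norm, signs, and integer levels in
    {0, ..., s} would need to be transmitted.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(x)
    if norm == 0:
        return np.zeros_like(x)
    ratio = np.abs(x) / norm * s       # each entry lies in [0, s]
    lower = np.floor(ratio)
    prob_up = ratio - lower            # probability of rounding up
    level = lower + (rng.random(x.shape) < prob_up)
    return np.sign(x) * norm * level / s
```

Averaging many independent quantizations of the same vector recovers the vector, which is the unbiasedness property the abstract's "independent unbiased compressor setup" refers to; the correlated quantizers of Suresh et al. (2022) instead share randomness across workers to reduce the variance of the averaged estimate.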