FedPara: Low-Rank Hadamard Product for Communication-Efficient Federated Learning

Bibliographic Details
Title: FedPara: Low-Rank Hadamard Product for Communication-Efficient Federated Learning
Authors: Nam Hyeon-Woo; Moon Ye-Bin; Tae-Hyun Oh
Publication Year: 2021
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Computer Vision and Pattern Recognition
Description: In this work, we propose a communication-efficient parameterization, FedPara, for federated learning (FL) to overcome the burdens of frequent model uploads and downloads. Our method re-parameterizes the weight parameters of layers using low-rank weights followed by the Hadamard product. Unlike conventional low-rank parameterization, FedPara is not restricted to low-rank constraints and thereby has a far larger capacity. This property enables it to achieve comparable performance while requiring 3 to 10 times lower communication costs than the model with the original layers, which is not achievable by traditional low-rank methods. The efficiency of our method can be further improved by combining it with other efficient FL optimizers. In addition, we extend our method to a personalized FL application, pFedPara, which separates parameters into global and local ones. We show that pFedPara outperforms competing personalized FL methods with more than three times fewer parameters.
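
To make the re-parameterization concrete, the following is a minimal PyTorch sketch of a FedPara-style linear layer. The class name FedParaLinear, the rank argument, and the initialization scheme are illustrative assumptions rather than the authors' reference implementation; only the construction of the weight as the Hadamard product of two rank-r factor pairs follows the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FedParaLinear(nn.Module):
    # Sketch: the full (out x in) weight is never stored directly.
    # It is rebuilt on the fly as W = (X1 @ Y1.T) * (X2 @ Y2.T), the
    # Hadamard product of two rank-r matrices, whose achievable rank
    # can reach r**2 even though only 2*r*(out + in) factor entries
    # (plus the bias) would need to be communicated in FL rounds.
    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        self.x1 = nn.Parameter(torch.empty(out_features, rank))
        self.y1 = nn.Parameter(torch.empty(in_features, rank))
        self.x2 = nn.Parameter(torch.empty(out_features, rank))
        self.y2 = nn.Parameter(torch.empty(in_features, rank))
        self.bias = nn.Parameter(torch.zeros(out_features))
        for p in (self.x1, self.y1, self.x2, self.y2):
            nn.init.kaiming_uniform_(p, a=5 ** 0.5)

    def weight(self) -> torch.Tensor:
        # Element-wise (Hadamard) product of two low-rank products.
        return (self.x1 @ self.y1.T) * (self.x2 @ self.y2.T)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.linear(x, self.weight(), self.bias)

As a rough parameter count under these assumptions: for a 1024 x 1024 layer with rank 16, the four factors hold 2 * 16 * (1024 + 1024) = 65,536 entries versus 1,048,576 for the dense weight, a 16x reduction per round, while the achievable rank of the reconstructed weight can be as high as 16**2 = 256.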
Comment: Accepted at ICLR 2022
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2108.06098
Accession Number: edsarx.2108.06098
Database: arXiv