Masked Random Noise for Communication Efficient Federated Learning

Bibliographic Details
Title: Masked Random Noise for Communication Efficient Federated Learning
Authors: Li, Shiwei, Cheng, Yingyi, Wang, Haozhao, Tang, Xing, Xu, Shijie, Luo, Weihong, Li, Yuhua, Liu, Dugang, He, Xiuqiang and Li, Ruixuan
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Distributed, Parallel, and Cluster Computing
Description: Federated learning is a promising distributed training paradigm that effectively safeguards data privacy. However, it may involve significant communication costs, which hinder training efficiency. In this paper, we aim to enhance communication efficiency from a new perspective. Specifically, we request the distributed clients to find optimal model updates relative to global model parameters within predefined random noise. For this purpose, we propose Federated Masked Random Noise (FedMRN), a novel framework that enables clients to learn a 1-bit mask for each model parameter and apply masked random noise (i.e., the Hadamard product of random noise and masks) to represent model updates. To make FedMRN feasible, we propose an advanced mask training strategy, called progressive stochastic masking (PSM). After local training, each client only needs to transmit local masks and a random seed to the server. Additionally, we provide theoretical guarantees for the convergence of FedMRN under both strongly convex and non-convex assumptions. Extensive experiments are conducted on four popular datasets. The results show that FedMRN exhibits superior convergence speed and test accuracy compared to relevant baselines, while attaining a similar level of accuracy as FedAvg.
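The communication trick described in the abstract, that is, representing a model update as the Hadamard product of seeded random noise and learned 1-bit masks, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `masked_noise_update` and the Gaussian noise choice are assumptions, and the PSM mask-training procedure is omitted entirely.

```python
import random

def masked_noise_update(seed, masks):
    # Regenerate the predefined random noise from the shared seed, then
    # form the update as the elementwise (Hadamard) product of the noise
    # with the client's 1-bit masks. Both client and server can run this,
    # so only (seed, masks) ever needs to be transmitted.
    rng = random.Random(seed)
    noise = [rng.gauss(0.0, 1.0) for _ in masks]
    return [n * m for n, m in zip(noise, masks)]

# Illustrative client side (mask learning via PSM is not shown):
seed = 42
masks = [1, 0, 1, 0]  # one learned bit per model parameter
update = masked_noise_update(seed, masks)

# The server reconstructs the identical update from just (seed, masks),
# so the per-round upload cost is roughly 1 bit per parameter plus one seed.
server_update = masked_noise_update(seed, masks)
assert update == server_update
```

Under this scheme the upload is the mask bitmap plus a single seed, which is where the communication savings over transmitting full-precision updates (as in FedAvg) come from.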
Comment: Accepted by MM 2024
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.03220
Accession Number: edsarx.2408.03220
Database: arXiv