Rethinking the Starting Point: Collaborative Pre-Training for Federated Downstream Tasks

Bibliographic Details
Title: Rethinking the Starting Point: Collaborative Pre-Training for Federated Downstream Tasks
Authors: Chu, Yun-Wei, Han, Dong-Jun, Hosseinalipour, Seyyedali, Brinton, Christopher G.
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: A few recent studies have demonstrated that leveraging centrally pre-trained models can offer advantageous initializations for federated learning (FL). However, existing pre-training methods do not generalize well when faced with an arbitrary set of downstream FL tasks. Specifically, they often (i) achieve limited average accuracy, particularly when there are unseen downstream labels, and (ii) result in significant accuracy variance, failing to provide a balanced performance across clients. To address these challenges, we propose CoPreFL, a collaborative/distributed pre-training approach which provides a robust initialization for downstream FL tasks. The key idea of CoPreFL is a model-agnostic meta-learning (MAML) procedure that tailors the global model to closely mimic heterogeneous and unseen FL scenarios, resulting in a pre-trained model that is rapidly adaptable to arbitrary FL tasks. Our MAML procedure incorporates performance variance into the meta-objective function, balancing performance across clients rather than solely optimizing for accuracy. Through extensive experiments, we demonstrate that CoPreFL obtains significant improvements in both average accuracy and variance across arbitrary downstream FL tasks with unseen/seen labels, compared with various pre-training baselines. We also show how CoPreFL is compatible with various well-known FL algorithms applied in the downstream tasks, enhancing performance in each case. (An illustrative sketch of the variance-aware meta-update appears after this record.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.02225
Accession Number: edsarx.2402.02225
Database: arXiv
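
The description above outlines a MAML-style pre-training procedure whose meta-objective adds a performance-variance term to the usual average-loss objective. The following is a minimal, first-order sketch of that idea in PyTorch, not the authors' implementation: the function name meta_pretrain_round, the hyperparameters inner_lr, meta_lr, and gamma (the variance weight), and the client data format are all assumptions for illustration, and the paper's exact procedure (second-order gradients, client sampling, aggregation details) may differ.

```python
import copy
import torch
import torch.nn.functional as F

def meta_pretrain_round(global_model, clients, inner_lr=0.01, meta_lr=0.001, gamma=1.0):
    """One illustrative collaborative pre-training round (assumed names/format).

    `clients` is an iterable of ((x_support, y_support), (x_query, y_query))
    tensor pairs. Each client adapts a copy of the global model on its support
    split; the query losses then drive a meta-update whose objective is
    mean(loss) + gamma * var(loss), trading off accuracy against variance
    across clients.
    """
    losses, client_grads = [], []

    for (x_s, y_s), (x_q, y_q) in clients:
        local = copy.deepcopy(global_model)

        # Inner loop: one SGD step on the client's support split.
        g_s = torch.autograd.grad(
            F.cross_entropy(local(x_s), y_s), local.parameters()
        )
        with torch.no_grad():
            for p, g in zip(local.parameters(), g_s):
                p -= inner_lr * g

        # Query loss of the adapted model; keep its first-order gradient.
        loss_q = F.cross_entropy(local(x_q), y_q)
        g_q = torch.autograd.grad(loss_q, local.parameters())
        losses.append(loss_q.detach())
        client_grads.append([g.detach() for g in g_q])

    # Meta-objective mean(L) + gamma * var(L): differentiating the biased
    # variance gives client k's gradient the weight
    # (1 + 2 * gamma * (L_k - mean(L))) / K.
    L = torch.stack(losses)
    weights = (1.0 + 2.0 * gamma * (L - L.mean())) / len(losses)

    with torch.no_grad():
        for i, p in enumerate(global_model.parameters()):
            meta_g = sum(w * g[i] for w, g in zip(weights, client_grads))
            p -= meta_lr * meta_g
```

The first-order approximation avoids second-order derivatives for readability. Note how the variance term up-weights the gradients of clients whose query loss is above average, which is one concrete way to realize the abstract's goal of a "balanced performance across clients" rather than optimizing average accuracy alone.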