Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes

Bibliographic Details
Title: Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes
Authors: Wang, Irene
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: The widespread adoption of handheld devices has fueled rapid growth in new applications. Several of these applications employ machine learning models that train on user data, which is typically private and sensitive. Federated Learning enables machine learning models to train locally on each handheld device while synchronizing only their neuron updates with a server. While this preserves user privacy, technology scaling and software advancements have resulted in handheld devices with varying performance capabilities. Consequently, the training time of federated learning tasks is dictated by a few low-performance straggler devices, which become a bottleneck for the entire training process. In this work, we aim to mitigate the performance bottleneck of federated learning by dynamically forming sub-models for stragglers based on their performance and accuracy feedback. To this end, we propose Invariant Dropout, a dynamic technique that forms a sub-model based on a neuron-update threshold. Invariant Dropout uses neuron updates from the non-straggler clients to develop a tailored sub-model for each straggler during each training iteration. All weights whose update magnitude falls below the threshold are dropped for that iteration. We evaluate Invariant Dropout using five real-world mobile clients. Our evaluations show that Invariant Dropout obtains an accuracy gain of up to 1.4 percentage points over state-of-the-art Ordered Dropout while mitigating the performance bottlenecks of stragglers.
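The core mechanism described above can be sketched in a few lines: rank weights by the magnitude of their most recent update (aggregated from non-straggler clients) and drop those below a threshold, yielding a smaller sub-model for the straggler. This is a minimal illustrative sketch, not the paper's actual implementation; the function name, the quantile-based choice of threshold, and the `keep_fraction` parameter are assumptions introduced here for illustration.

```python
import numpy as np

def invariant_dropout_mask(prev_weights, new_weights, keep_fraction):
    """Illustrative sketch of Invariant Dropout's sub-model selection.

    Weights whose update magnitude |w_new - w_prev| falls below a
    threshold are dropped for the straggler's next iteration. Here the
    threshold is assumed to be chosen so that roughly `keep_fraction`
    of the weights survive, sized to the straggler's performance.
    """
    update_magnitude = np.abs(new_weights - prev_weights)
    # Threshold at the (1 - keep_fraction) quantile of the update
    # magnitudes: everything below it is dropped for this iteration.
    threshold = np.quantile(update_magnitude, 1.0 - keep_fraction)
    return update_magnitude >= threshold

# Example: updates aggregated from non-straggler clients determine
# which half of the weights a straggler trains this iteration.
prev = np.zeros(8)
new = np.array([0.9, 0.01, 0.5, 0.02, 0.7, 0.03, 0.8, 0.04])
mask = invariant_dropout_mask(prev, new, keep_fraction=0.5)
sub_model_weights = new[mask]  # straggler trains only the kept weights
```

In this toy example, the four weights with the largest update magnitudes survive, and the straggler trains a sub-model half the original size.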
Comment: Undergraduate thesis completed under advisor Dr. Prashant Nair and committee member Dr. Divya Mahajan
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2208.14808
Accession Number: edsarx.2208.14808
Database: arXiv