ColD Fusion: Collaborative Descent for Distributed Multitask Finetuning

Bibliographic Details
Title: ColD Fusion: Collaborative Descent for Distributed Multitask Finetuning
Authors: Don-Yehiya, Shachar; Venezian, Elad; Raffel, Colin; Slonim, Noam; Katz, Yoav; Choshen, Leshem
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Computation and Language; Computer Science - Distributed, Parallel, and Cluster Computing
Description: We propose a new paradigm to continually evolve pretrained models, denoted ColD Fusion. It provides the benefits of multitask learning but leverages distributed computation with limited communication and eliminates the need for shared data. Consequently, ColD Fusion can give rise to a synergistic loop, where finetuned models can be recycled to continually improve the pretrained model they are based upon. We show that ColD Fusion yields comparable benefits to multitask training by producing a model that (a) attains strong performance on all of the datasets it was trained on; and (b) is a better starting point for finetuning on unseen datasets. We show that ColD Fusion outperforms RoBERTa and even previous multitask models. Specifically, when training and testing on 35 diverse datasets, a ColD Fusion-based model outperforms RoBERTa by 2.33 points on average without any changes to the architecture.
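The description outlines an iterative finetune-and-fuse loop: a shared pretrained model is distributed, contributors finetune it on their own data, and the finetuned models are recycled into an improved shared model. Below is a minimal sketch of such a loop, assuming simple parameter averaging as the fusion step; the routine names (`finetune_on`, `fuse`, `cold_fusion_loop`) and the placeholder local-training logic are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a collaborative finetune-and-fuse loop (assumption-based;
# fusion is modeled here as plain parameter averaging, and local finetuning
# is a placeholder rather than real gradient-based training).
import numpy as np

def finetune_on(params, local_data, lr=1e-2, steps=10):
    """Hypothetical local finetuning on a contributor's private data.
    Returns updated parameters; in practice this would be ordinary
    gradient-based finetuning on the contributor's dataset."""
    updated = {k: v.copy() for k, v in params.items()}
    for _ in range(steps):
        for k in updated:
            # placeholder update pulling parameters toward local statistics
            updated[k] -= lr * (updated[k] - local_data[k])
    return updated

def fuse(param_sets):
    """Fuse contributor models by averaging their parameters."""
    keys = param_sets[0].keys()
    return {k: np.mean([p[k] for p in param_sets], axis=0) for k in keys}

def cold_fusion_loop(init_params, contributor_data, rounds=5):
    """Each round: broadcast the shared model, let every contributor
    finetune it locally, then fuse the results into the next shared model.
    Only parameters are communicated; the data never leaves contributors."""
    shared = init_params
    for _ in range(rounds):
        finetuned = [finetune_on(shared, d) for d in contributor_data]
        shared = fuse(finetuned)
    return shared

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    init = {"w": rng.normal(size=(4, 4)), "b": np.zeros(4)}
    # three contributors, each with different (private) data summaries
    data = [{"w": rng.normal(size=(4, 4)), "b": rng.normal(size=4)}
            for _ in range(3)]
    final = cold_fusion_loop(init, data)
    print({k: v.shape for k, v in final.items()})
```

The fused model produced by each round serves as the starting point for the next, which is the "synergistic loop" the description refers to.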
Comment: ACL 23
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2212.01378
Accession Number: edsarx.2212.01378
Database: arXiv