Fusing finetuned models for better pretraining

Bibliographic Details
Title: Fusing finetuned models for better pretraining
Authors: Choshen, Leshem; Venezian, Elad; Slonim, Noam; Katz, Yoav
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language; Computer Science - Computer Vision and Pattern Recognition; Computer Science - Machine Learning
Description: Pretrained models are the standard starting point for training; this approach consistently outperforms random initialization. However, pretraining is a costly endeavour that few can undertake. In this paper, we create better base models at hardly any cost by fusing multiple existing fine-tuned models into one. Specifically, we fuse by averaging the weights of these models. We show that the fused model's results surpass those of the pretrained model. We also show that fusing is often better than intertraining. We find that fusing is less dependent on the target task. Furthermore, weight decay nullifies the effects of intertraining but not those of fusing. (A sketch of the weight-averaging step follows this record.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2204.03044
Accession Number: edsarx.2204.03044
Database: arXiv
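
The fusing procedure described in the abstract is parameter-wise weight averaging of several fine-tuned checkpoints that share one architecture. The sketch below is not the authors' released code; it is a minimal illustration assuming PyTorch and Hugging Face transformers, and the checkpoint names are illustrative placeholders, not the models used in the paper.

```python
# Minimal sketch of weight-space fusion: average the parameters of several
# fine-tuned checkpoints that share the same architecture, then use the
# result as a new base model for further fine-tuning.
import torch
from transformers import AutoModel


def fuse_by_averaging(model_names):
    """Load each fine-tuned model and average its weights parameter-wise."""
    models = [AutoModel.from_pretrained(name) for name in model_names]
    fused = models[0]
    fused_state = fused.state_dict()
    for key in fused_state:
        # Stack the corresponding tensor from every model and take the
        # element-wise mean, preserving the original dtype of the entry.
        stacked = torch.stack([m.state_dict()[key].float() for m in models])
        fused_state[key] = stacked.mean(dim=0).to(fused_state[key].dtype)
    fused.load_state_dict(fused_state)
    return fused


# Hypothetical usage: fuse models fine-tuned from the same pretrained
# checkpoint on different tasks (names below are illustrative only).
fused_base = fuse_by_averaging([
    "textattack/bert-base-uncased-SST-2",
    "textattack/bert-base-uncased-MNLI",
    "textattack/bert-base-uncased-QQP",
])
fused_base.save_pretrained("fused-bert-base")
```

Averaging of this kind presumes that all checkpoints were fine-tuned from the same pretrained model, so their weights remain directly comparable; the fused result is then used as a drop-in replacement for the original pretrained base.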