Knowledge Fusion of Chat LLMs: A Preliminary Technical Report

Bibliographic Details
Title: Knowledge Fusion of Chat LLMs: A Preliminary Technical Report
Authors: Wan, Fanqi; Yang, Ziyi; Zhong, Longguang; Quan, Xiaojun; Huang, Xinting; Bi, Wei
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: Recently, FuseLLM introduced the concept of knowledge fusion to transfer the collective knowledge of multiple structurally varied LLMs into a target LLM through lightweight continual training. In this report, we extend the scalability and flexibility of the FuseLLM framework to realize the fusion of chat LLMs, resulting in FusionChat. FusionChat comprises two main stages. First, we undertake knowledge fusion for source LLMs of varying structures and scales, deriving multiple target LLMs of identical structure and size via lightweight fine-tuning. Then, these target LLMs are merged within the parameter space, wherein we propose a novel method for determining the merging weights based on the variation ratio of parameter matrices before and after fine-tuning. We validate our approach using three prominent chat LLMs with diverse architectures and scales, namely NH2-Mixtral-8x7B, NH2-Solar-10.7B, and OpenChat-3.5-7B. Experimental results spanning various chat domains demonstrate that FusionChat-7B outperforms a broad spectrum of chat LLMs at the 7B and 34B scales, even surpassing GPT-3.5 (March) and approaching Mixtral-8x7B-Instruct.
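The parameter-space merging step described above can be illustrated with a minimal sketch. This is a hypothetical reading of the idea, not the report's exact formula: each fine-tuned target model receives a merging weight proportional to how much its parameters moved away from the shared base during fine-tuning (here measured as a squared difference over flattened parameters), and the merged model is the weighted average. All function names and the specific variation measure are illustrative assumptions.

```python
def variation_ratio_weights(base, finetuned):
    # Hypothetical sketch: score each fine-tuned model by the squared
    # change of its (flattened) parameters relative to the shared base,
    # then normalize the scores so the merging weights sum to 1.
    variations = [
        sum((m - b) ** 2 for m, b in zip(model, base))
        for model in finetuned
    ]
    total = sum(variations)
    return [v / total for v in variations]


def merge(base, finetuned):
    # Weighted average of the fine-tuned models in parameter space.
    weights = variation_ratio_weights(base, finetuned)
    return [
        sum(w * model[i] for w, model in zip(weights, finetuned))
        for i in range(len(base))
    ]
```

In practice the weights would be computed per parameter matrix of a real checkpoint rather than over a flat list, but the normalization-and-average structure stays the same.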
Comment: Technical Report, work in progress
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.16107
Accession Number: edsarx.2402.16107
Database: arXiv