Aquila2 Technical Report

Bibliographic Details
Title: Aquila2 Technical Report
Authors: Zhang, Bo-Wen; Wang, Liangdong; Li, Jijie; Gu, Shuhao; Wu, Xinya; Zhang, Zhengduo; Gao, Boyan; Ao, Yulong; Liu, Guang
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: This paper introduces the Aquila2 series, which comprises a wide range of bilingual models with parameter sizes of 7, 34, and 70 billion. These models are trained based on an innovative framework named HeuriMentor (HM), which offers real-time insights into model convergence and enhances the training process and data management. The HM System, comprising the Adaptive Training Engine (ATE), Training State Monitor (TSM), and Data Management Unit (DMU), allows for precise monitoring of the model's training progress and enables efficient optimization of data distribution, thereby enhancing training effectiveness. Extensive evaluations show that the Aquila2 model series performs comparably well on both English and Chinese benchmarks. Specifically, Aquila2-34B demonstrates only a slight decrease in performance when quantized to Int4. Furthermore, we have made our training code (https://github.com/FlagOpen/FlagScale) and model weights (https://github.com/FlagAI-Open/Aquila2) publicly available to support ongoing research and the development of applications.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.07410
Accession Number: edsarx.2408.07410
Database: arXiv