Structure-aware Fine-tuning for Code Pre-trained Models

Bibliographic Details
Title: Structure-aware Fine-tuning for Code Pre-trained Models
Authors: Wu, Jiayi, Zhu, Renyu, Chen, Nuo, Sun, Qiushi, Li, Xiang, Gao, Ming
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Software Engineering, Computer Science - Artificial Intelligence, Computer Science - Computation and Language
Description: Over the past few years, we have witnessed remarkable advancements in Code Pre-trained Models (CodePTMs). These models have achieved excellent representation capabilities by designing structure-based pre-training tasks for code. However, how to enhance the absorption of structural knowledge when fine-tuning CodePTMs remains a significant challenge. To fill this gap, in this paper, we present Structure-aware Fine-tuning (SAT), a novel structure-enhanced and plug-and-play fine-tuning method for CodePTMs. We first propose a structure loss to quantify the difference between the information learned by CodePTMs and the knowledge extracted from code structure. Specifically, we use the attention scores extracted from Transformer layers as the learned structural information, and the shortest path length between leaves in abstract syntax trees as the structural knowledge. Subsequently, multi-task learning is introduced to improve the performance of fine-tuning. Experiments conducted on four pre-trained models and two generation tasks demonstrate the effectiveness of our proposed method as a plug-and-play solution. Furthermore, we observe that SAT benefits CodePTMs more when training data is limited. (An illustrative sketch of the structure loss appears after this record.)
Comment: Accepted by COLING 2024
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2404.07471
Accession Number: edsarx.2404.07471
Database: arXiv
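
Note: The abstract describes SAT's core mechanism at a high level: a structure loss that compares attention scores from Transformer layers against shortest-path distances between AST leaves, combined with the downstream task loss via multi-task learning. The following is a minimal, hypothetical PyTorch sketch of how such an objective could be assembled. The function names (structure_loss, sat_objective), the KL-divergence formulation, and the weighting factor alpha are illustrative assumptions based only on the abstract, not the paper's exact implementation.

    import torch
    import torch.nn.functional as F

    def structure_loss(attn_scores: torch.Tensor, ast_distances: torch.Tensor) -> torch.Tensor:
        # attn_scores:   (seq_len, seq_len) raw attention scores taken from a
        #                Transformer layer of the CodePTM.
        # ast_distances: (seq_len, seq_len) precomputed shortest-path lengths
        #                between the AST leaves aligned with the input tokens.
        # Turn path lengths into a target distribution: structurally closer
        # leaves should receive more attention, so negate before the softmax.
        target = F.softmax(-ast_distances.float(), dim=-1)
        log_pred = F.log_softmax(attn_scores, dim=-1)
        # Penalize divergence between the learned attention distribution and
        # the structure-derived target (the paper's exact loss may differ).
        return F.kl_div(log_pred, target, reduction="batchmean")

    def sat_objective(task_loss: torch.Tensor,
                      attn_scores: torch.Tensor,
                      ast_distances: torch.Tensor,
                      alpha: float = 0.1) -> torch.Tensor:
        # Multi-task objective per the abstract: downstream task loss plus a
        # weighted structure loss; alpha is an assumed trade-off hyperparameter.
        return task_loss + alpha * structure_loss(attn_scores, ast_distances)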