On the Effectiveness of Large Language Models in Domain-Specific Code Generation

Bibliographic Details
Title: On the Effectiveness of Large Language Models in Domain-Specific Code Generation
Authors: Lin, Yalan; Chen, Meng; Hu, Yuhan; Zhang, Hongyu; Wan, Chengcheng; Wei, Zhao; Xu, Yong; Wang, Juhong; Gu, Xiaodong
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Software Engineering
Description: Large language models (LLMs) such as ChatGPT have shown remarkable capabilities in code generation. Despite these achievements, they rely on enormous training data to acquire a broad spectrum of open-domain knowledge. Moreover, their evaluation revolves around open-domain benchmarks such as HumanEval, which consist primarily of programming-contest problems, so it is hard to fully characterize the intricacies and challenges of particular domains (e.g., web, game, and math). In this paper, we conduct an in-depth study of LLMs in domain-specific code generation. Our results demonstrate that LLMs exhibit sub-optimal performance when generating domain-specific code, owing to their limited proficiency in using domain-specific libraries. We further observe that incorporating API knowledge into prompts enables LLMs to generate more professional code. Based on these findings, we investigate how to effectively incorporate API knowledge into the code generation process. We experiment with three strategies for incorporating domain knowledge: external knowledge inquirer, chain-of-thought prompting, and chain-of-thought fine-tuning. We refer to these strategies collectively as a new code generation approach called DomCoder. Experimental results show that all strategies of DomCoder improve the effectiveness of domain-specific code generation under certain settings. (A minimal illustrative sketch of the API-knowledge prompting idea appears after this record.)
Comment: Accepted by the ACM Transactions on Software Engineering and Methodology (TOSEM 2024)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2312.01639
Accession Number: edsarx.2312.01639
Database: arXiv
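
The description above reports that supplying API knowledge in the prompt helps LLMs produce more professional domain-specific code. The sketch below is a minimal, self-contained illustration of that general idea, not the authors' DomCoder implementation; the toy knowledge base and the retrieve_api_docs and build_prompt helpers are illustrative assumptions.

    # Sketch of API-knowledge-augmented prompting: retrieved documentation
    # for candidate domain-specific APIs is prepended to the task before it
    # is sent to a code-generation LLM. All names here are hypothetical and
    # not taken from the paper.

    from typing import List

    # Toy in-memory "knowledge base" of domain-specific API documentation.
    API_DOCS = {
        "pygame.sprite.Sprite": "Base class for visible game objects; override update().",
        "pygame.display.set_mode": "Initialize a window or screen for display.",
    }

    def retrieve_api_docs(task: str, top_k: int = 2) -> List[str]:
        """Naive keyword retrieval of API entries relevant to the task."""
        words = set(task.lower().split())
        scored = [
            (sum(w in name.lower() or w in doc.lower() for w in words), name, doc)
            for name, doc in API_DOCS.items()
        ]
        scored.sort(reverse=True)
        return [f"{name}: {doc}" for _, name, doc in scored[:top_k]]

    def build_prompt(task: str) -> str:
        """Compose a code-generation prompt augmented with API knowledge."""
        knowledge = "\n".join(retrieve_api_docs(task))
        return (
            "You may use the following domain-specific APIs:\n"
            f"{knowledge}\n\n"
            f"Task: {task}\n"
            "Write Python code that solves the task."
        )

    if __name__ == "__main__":
        # The augmented prompt would then be passed to the LLM of choice.
        print(build_prompt("create a game window and a player sprite"))

In practice the keyword lookup would be replaced by a proper retriever over library documentation, but the structure of the prompt (retrieved API knowledge followed by the task) reflects the strategy summarized in the description.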