Performance-Aligned LLMs for Generating Fast Code

Bibliographic Details
Title: Performance-Aligned LLMs for Generating Fast Code
Authors: Nichols, Daniel; Polasam, Pranav; Menon, Harshitha; Marathe, Aniruddha; Gamblin, Todd; Bhatele, Abhinav
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Distributed, Parallel, and Cluster Computing; Computer Science - Artificial Intelligence; Computer Science - Software Engineering
Description: Optimizing scientific software is a difficult task because codebases are often large and complex, and performance can depend on several factors, including the algorithm, its implementation, and the hardware. Causes of poor performance can originate from disparate sources and be difficult to diagnose. Recent years have seen a multitude of works that use large language models (LLMs) to assist in software development tasks. However, these tools are trained to model the distribution of code as text and are not specifically designed to understand the performance aspects of code. In this work, we introduce a reinforcement learning based methodology to align the outputs of code LLMs with performance. This allows us to build upon the current code modeling capabilities of LLMs and extend them to generate better performing code. We demonstrate that our fine-tuned model improves the expected speedup of generated code over base models for a set of benchmark tasks, from 0.9 to 1.6 for serial code and from 1.9 to 4.5 for OpenMP code.
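The speedup-based signal described in the abstract can be illustrated with a minimal sketch: a reward that scores a candidate implementation by its measured speedup over a reference implementation, so values above 1 indicate faster code. All function names and the timing harness below are illustrative assumptions, not the authors' actual pipeline.

```python
import time


def measure_runtime(fn, *args, repeats=5):
    """Best wall-clock time over several runs (a simple, illustrative harness)."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best


def speedup_reward(reference_fn, candidate_fn, *args):
    """Hypothetical reward: reference runtime divided by candidate runtime,
    so a reward > 1 means the candidate is faster than the reference."""
    ref_time = measure_runtime(reference_fn, *args)
    cand_time = measure_runtime(candidate_fn, *args)
    return ref_time / cand_time


# Toy example: a loop-based sum versus a closed-form sum of 0..n-1.
def slow_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total


def fast_sum(n):
    return n * (n - 1) // 2


reward = speedup_reward(slow_sum, fast_sum, 200_000)
print(reward > 1.0)
```

In an actual RL fine-tuning loop, a reward of this shape would be computed for each generated program and fed back to the policy update; the paper's serial and OpenMP results are reported in exactly these speedup units.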
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2404.18864
Accession Number: edsarx.2404.18864
Database: arXiv