Report
HPC-GPT: Integrating Large Language Model for High-Performance Computing
Title: | HPC-GPT: Integrating Large Language Model for High-Performance Computing |
Authors: | Ding, Xianzhong, Chen, Le, Emani, Murali, Liao, Chunhua, Lin, Pei-Hung, Vanderbruggen, Tristan, Xie, Zhen, Cerpa, Alberto E., Du, Wan |
Publication Year: | 2023 |
Collection: | Computer Science |
Subject Terms: | Computer Science - Distributed, Parallel, and Cluster Computing; Computer Science - Artificial Intelligence; Computer Science - Computation and Language |
Description: | Large Language Models (LLMs), including the LLaMA model, have exhibited their efficacy across various general-domain natural language processing (NLP) tasks. However, their performance in high-performance computing (HPC) domain tasks has been less than optimal due to the specialized expertise required to interpret the model responses. In response to this challenge, we propose HPC-GPT, a novel LLaMA-based model that has been fine-tuned with supervision on generated QA (Question-Answer) instances for the HPC domain. To evaluate its effectiveness, we concentrate on two HPC tasks: managing AI models and datasets for HPC, and data race detection. By employing HPC-GPT, we demonstrate performance comparable to existing methods on both tasks, exemplifying its strength in HPC-related scenarios. Our experiments on open-source benchmarks yield extensive results, underscoring HPC-GPT's potential to bridge the performance gap between LLMs and HPC-specific tasks. With HPC-GPT, we aim to pave the way for LLMs to excel in HPC domains, simplifying the utilization of language models in complex computing applications. Comment: 9 pages |
Document Type: | Working Paper |
DOI: | 10.1145/3624062.3624172 |
Access URL: | http://arxiv.org/abs/2311.12833 |
Accession Number: | edsarx.2311.12833 |
Database: | arXiv |