Log Parsing with Self-Generated In-Context Learning and Self-Correction

Bibliographic Details
Title: Log Parsing with Self-Generated In-Context Learning and Self-Correction
Authors: Wu, Yifan; Yu, Siyu; Li, Ying
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Software Engineering
Description: Log parsing transforms log messages into structured formats, serving as a crucial step for log analysis. Despite a variety of log parsing methods that have been proposed, their performance on evolving log data remains unsatisfactory due to reliance on human-crafted rules or learning-based models with limited training data. The recent emergence of large language models (LLMs) has demonstrated strong abilities in understanding natural language and code, making it promising to apply LLMs to log parsing. Consequently, several studies have proposed LLM-based log parsers. However, LLMs may produce inaccurate templates, and existing LLM-based log parsers directly use the template generated by the LLM as the parsing result, hindering the accuracy of log parsing. Furthermore, these log parsers depend heavily on historical log data as demonstrations, which poses challenges in maintaining accuracy when dealing with scarce historical log data or evolving log data. To address these challenges, we propose AdaParser, an effective and adaptive log parsing framework using LLMs with self-generated in-context learning (SG-ICL) and self-correction. To facilitate accurate log parsing, AdaParser incorporates a novel component, a template corrector, which uses the LLM to correct potential parsing errors in the templates it generates. In addition, AdaParser maintains a dynamic candidate set composed of previously generated templates as demonstrations to adapt to evolving log data. Extensive experiments on public large-scale datasets show that AdaParser outperforms state-of-the-art methods across all metrics, even in zero-shot scenarios. Moreover, when integrated with different LLMs, AdaParser consistently enhances the performance of the utilized LLMs by a large margin.
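The abstract only sketches AdaParser's workflow (self-generated demonstrations, LLM template generation, self-correction, and a growing candidate set). The Python sketch below is a rough illustration of that workflow as described in the abstract, not the authors' implementation: the prompt wording, the `llm` callable, the `token_overlap` similarity heuristic, and the demonstration count `k` are all hypothetical placeholders.

```python
from typing import Callable, List, Tuple

def token_overlap(a: str, b: str) -> int:
    """Crude similarity heuristic (assumption): count shared whitespace-separated tokens."""
    return len(set(a.split()) & set(b.split()))

def build_prompt(log: str, demos: List[Tuple[str, str]]) -> str:
    """Assemble a few-shot prompt from self-generated (log, template) demonstrations."""
    parts = ["Abstract the dynamic fields in the log as {variable} and output the template."]
    for demo_log, demo_template in demos:
        parts.append(f"Log: {demo_log}\nTemplate: {demo_template}")
    parts.append(f"Log: {log}\nTemplate:")
    return "\n\n".join(parts)

def parse_log(log: str,
              candidate_set: List[Tuple[str, str]],
              llm: Callable[[str], str],
              k: int = 3) -> str:
    """Parse one log message: generate a template with self-generated ICL,
    ask the LLM to self-correct it, then update the candidate set."""
    # 1. Self-generated in-context learning: reuse the k most similar
    #    previously generated (log, template) pairs as demonstrations.
    demos = sorted(candidate_set,
                   key=lambda d: token_overlap(d[0], log),
                   reverse=True)[:k]
    template = llm(build_prompt(log, demos)).strip()

    # 2. Self-correction: have the LLM check the template against the raw
    #    log and repair mismatched tokens or missed variables.
    check_prompt = (
        f"Log: {log}\nTemplate: {template}\n"
        "If the template does not match the log, or marks static text as a "
        "variable, output a corrected template; otherwise repeat it unchanged."
    )
    template = llm(check_prompt).strip()

    # 3. Grow the dynamic candidate set so later logs can use this template
    #    as a demonstration, adapting to evolving log data.
    candidate_set.append((log, template))
    return template
```

A caller would supply any completion function for `llm` (e.g. a wrapper around an API client) and an initially empty `candidate_set`, which makes the zero-shot case simply the first iterations before any demonstrations have accumulated.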
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2406.03376
Accession Number: edsarx.2406.03376
Database: arXiv