SynCPKL: Harnessing LLMs to Generate Synthetic Data for Commonsense Persona Knowledge Linking

Bibliographic Details
Title: SynCPKL: Harnessing LLMs to Generate Synthetic Data for Commonsense Persona Knowledge Linking
Authors: Lin, Kuan-Yen
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: Understanding rich dialogues often requires NLP systems to access relevant commonsense persona knowledge, but retrieving this knowledge is challenging due to complex contexts and the implicit nature of commonsense. This paper presents our approach to the Commonsense Persona Knowledge Linking (CPKL) challenge, addressing the critical need for integrating persona and commonsense knowledge in open-domain dialogue systems. We introduce the SynCPKL Pipeline, which leverages Large Language Models to generate high-quality synthetic datasets for training commonsense persona knowledge linkers. To demonstrate the efficacy of our approach, we present SynCPKL, a new dataset specifically designed for this task. Our experiments validate the effectiveness of SynCPKL for training commonsense persona knowledge linkers. Additionally, our top-performing model, Derberta-SynCPKL, secured first place in the CPKL challenge with a 16% improvement in F1 score. We release both SynCPKL and Derberta-SynCPKL at https://github.com/irislin1006/CPKL.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2407.15281
Accession Number: edsarx.2407.15281
Database: arXiv
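
Note: The description above frames the task as deciding whether a commonsense persona fact is relevant to a given dialogue. The following is a minimal, hypothetical sketch of such a linker as a DeBERTa-style cross-encoder that scores (dialogue, persona fact) pairs. The checkpoint name, label convention, and helper function are illustrative assumptions, not the authors' released Derberta-SynCPKL code; the actual model and dataset are available at the GitHub link in the record above.

# Minimal sketch (assumptions, not the released implementation): a binary
# cross-encoder that estimates how likely a commonsense persona fact is
# relevant to a dialogue context.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder base model; the released checkpoint would be substituted here.
MODEL_NAME = "microsoft/deberta-v3-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def link_score(dialogue: str, persona_fact: str) -> float:
    """Return the estimated probability that persona_fact is relevant to dialogue."""
    # Encode the pair as a single cross-encoder input (dialogue, fact).
    inputs = tokenizer(dialogue, persona_fact, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumed label convention: index 1 = "relevant".
    return torch.softmax(logits, dim=-1)[0, 1].item()

if __name__ == "__main__":
    dialogue = "A: I spent the whole weekend training for my first marathon."
    fact = "The speaker enjoys long-distance running."
    print(f"relevance score: {link_score(dialogue, fact):.3f}")

A cross-encoder of this kind can be trained directly on a synthetic dataset of (dialogue, persona fact, label) triples such as SynCPKL, which is the role the paper's pipeline is designed to fill.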