AHAM: Adapt, Help, Ask, Model -- Harvesting LLMs for literature mining

Bibliographic details
Title: AHAM: Adapt, Help, Ask, Model -- Harvesting LLMs for literature mining
Authors: Koloski, Boshko, Lavrač, Nada, Cestnik, Bojan, Pollak, Senja, Škrlj, Blaž, Kastrin, Andrej
Publication year: 2023
Collection: Computer Science
Subject terms: Computer Science - Computation and Language, Computer Science - Artificial Intelligence
Description: In an era marked by a rapid increase in scientific publications, researchers grapple with the challenge of keeping pace with field-specific advances. We present the 'AHAM' methodology and a metric that guides the domain-specific adaptation of the BERTopic topic modeling framework to improve scientific text analysis. Using the LLaMa2 generative language model, we generate topic definitions via one-shot learning: prompts crafted with the help of domain experts guide the LLM for literature mining by asking it to model the topic names. For inter-topic similarity evaluation, we leverage metrics from language generation and translation to assess the lexical and semantic similarity of the generated topics. Our system aims to reduce both the ratio of outlier topics to the total number of topics and the similarity between topic definitions. The methodology has been assessed on a newly gathered corpus of scientific papers on literature-based discovery. Through rigorous evaluation by domain experts, AHAM has been validated as effective in uncovering intriguing and novel insights within broad research areas. We explore the impact of domain adaptation of sentence-transformers on topic modeling using two datasets, each specialized to a specific scientific domain within arXiv and medRxiv. We evaluate the impact of data size, the niche of adaptation, and the importance of domain adaptation. Our results suggest a strong interaction between domain adaptation and topic modeling precision in terms of outliers and topic definitions.
Comment: Submitted to IDA 2024
Document type: Working Paper
Access URL: http://arxiv.org/abs/2312.15784
Accession number: edsarx.2312.15784
Database: arXiv
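
The abstract describes a concrete pipeline: BERTopic over sentence-transformer embeddings, LLM-generated topic names via a one-shot prompt, an outlier metric, and inter-topic similarity of the generated names. The sketch below illustrates such a pipeline, assuming the BERTopic, sentence-transformers, and scikit-learn libraries. The encoder name, the stand-in corpus, the prompt wording, the keyword stand-in for LLM output, and the `llm_generate` helper are illustrative assumptions, not the paper's exact AHAM configuration.

```python
# Minimal AHAM-style sketch; model names, prompt wording, and metric
# details are illustrative assumptions, not the paper's exact setup.
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer
from sklearn.datasets import fetch_20newsgroups
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in corpus; the paper uses a newly gathered corpus of
# scientific papers on literature-based discovery.
raw = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes")).data
docs = [d for d in raw if d.strip()][:500]

# 1) Embed with a (potentially domain-adapted) sentence-transformer.
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed base encoder
embeddings = encoder.encode(docs, show_progress_bar=False)

# 2) Fit BERTopic on the precomputed embeddings.
topic_model = BERTopic(embedding_model=encoder)
topics, _ = topic_model.fit_transform(docs, embeddings)

# 3) Outlier metric: BERTopic assigns unclustered documents to topic -1.
outlier_ratio = sum(t == -1 for t in topics) / len(topics)

# 4) One-shot prompt asking a generative LLM (e.g., LLaMa2) to name a
#    topic; the example pair and wording here are hypothetical.
def naming_prompt(keywords):
    return (
        "You are an expert in scientific literature mining.\n"
        "Keywords: gene, protein, expression, regulation\n"
        "Topic name: Gene expression regulation\n"
        f"Keywords: {', '.join(keywords)}\n"
        "Topic name:"
    )

topic_ids = sorted(t for t in topic_model.get_topics() if t != -1)
prompts = {t: naming_prompt([w for w, _ in topic_model.get_topic(t)[:10]])
           for t in topic_ids}
# names = [llm_generate(prompts[t]) for t in topic_ids]  # hypothetical LLM call

# Keyword strings as a stand-in for the LLM-generated topic names.
names = [", ".join(w for w, _ in topic_model.get_topic(t)[:5]) for t in topic_ids]

# 5) Semantic inter-topic similarity of the names (lexical overlap such
#    as BLEU/ROUGE could be computed analogously); the abstract's goal
#    is to drive both scores down.
sim = cosine_similarity(encoder.encode(names))
n = len(names)
mean_pairwise_sim = (sim.sum() - n) / (n * (n - 1))
print(f"outlier ratio: {outlier_ratio:.3f}, "
      f"mean topic-name similarity: {mean_pairwise_sim:.3f}")
```

In an AHAM-style loop, these two scores (outlier ratio and mean inter-topic name similarity) would guide the choice of domain-adaptation settings for the encoder; here they are simply computed and printed.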