Plausibility Processing in Transformer Language Models: Focusing on the Role of Attention Heads in GPT

Bibliographic Details
Title: Plausibility Processing in Transformer Language Models: Focusing on the Role of Attention Heads in GPT
Authors: Ryu, Soo Hyun
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: The goal of this paper is to explore how Transformer language models process semantic knowledge, especially regarding the plausibility of noun-verb relations. First, I demonstrate that GPT2 exhibits a higher degree of similarity with humans in plausibility processing compared to other Transformer language models. Next, I delve into how knowledge of plausibility is contained within attention heads of GPT2 and how these heads causally contribute to GPT2's plausibility processing ability. Through several experiments, it was found that: i) GPT2 has a number of attention heads that detect plausible noun-verb relationships; ii) these heads collectively contribute to the Transformer's ability to process plausibility, albeit to varying degrees; and iii) attention heads' individual performance in detecting plausibility does not necessarily correlate with how much they contribute to GPT2's plausibility processing ability.
Comment: EMNLP-findings 2023
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2310.13824
Accession Number: edsarx.2310.13824
Database: arXiv
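
The abstract describes inspecting GPT2 attention heads for sensitivity to noun-verb relations. As a minimal sketch of what such an inspection could look like, assuming the HuggingFace transformers GPT-2 implementation (not the paper's actual experimental setup), the example below reads out each head's attention weight from a verb token back to its subject noun; the sentence, token strings, and head-ranking logic are illustrative assumptions only.

```python
# Illustrative sketch only, not the paper's procedure: per-head attention
# from a verb token back to its subject noun in GPT-2 (HuggingFace transformers).
import torch
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2", output_attentions=True)
model.eval()

sentence = "The dog ate the bone"  # example of a plausible noun-verb relation
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one (batch, heads, seq, seq) tensor per layer.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
verb_idx = tokens.index("\u0120ate")  # assumes the verb is a single BPE token
noun_idx = tokens.index("\u0120dog")  # assumes the noun is a single BPE token

for layer, attn in enumerate(outputs.attentions):
    # Attention weight each head assigns from the verb position to the noun position
    # (valid under causal masking because the noun precedes the verb).
    weights = attn[0, :, verb_idx, noun_idx]
    top_head = int(torch.argmax(weights))
    print(f"layer {layer}: strongest head {top_head}, weight {weights[top_head]:.3f}")
```

Comparing these per-head weights across plausible and implausible noun-verb pairs would be one simple way to flag heads that appear sensitive to plausibility; establishing their causal contribution, as the paper does, would additionally require interventions on the heads themselves.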