Improving Pre-trained Language Model Sensitivity via Mask Specific losses: A case study on Biomedical NER

Bibliographic Details
Title: Improving Pre-trained Language Model Sensitivity via Mask Specific losses: A case study on Biomedical NER
Authors: Abaho, Micheal, Bollegala, Danushka, Leeming, Gary, Joyce, Dan, Buchan, Iain E
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Artificial Intelligence, Computer Science - Information Retrieval, Computer Science - Machine Learning
Description: Adapting language models (LMs) to novel domains is often achieved through fine-tuning a pre-trained LM (PLM) on domain-specific data. Fine-tuning introduces new knowledge into an LM, enabling it to comprehend and efficiently perform a target domain task. Fine-tuning can, however, be inadvertently insensitive if it ignores the wide array of disparities (e.g., in word meaning) between source and target domains. For instance, words such as chronic and pressure may be treated lightly in social conversations; clinically, however, these words are usually an expression of concern. To address insensitive fine-tuning, we propose Mask Specific Language Modeling (MSLM), an approach that efficiently acquires target domain knowledge by appropriately weighting the importance of domain-specific terms (DS-terms) during fine-tuning. MSLM jointly masks DS-terms and generic words, then learns mask-specific losses by ensuring LMs incur larger penalties for inaccurately predicting DS-terms compared to generic words. Results of our analysis show that MSLM improves LMs' sensitivity to and detection of DS-terms. We empirically show that an optimal masking rate depends not only on the LM, but also on the dataset and the length of sequences. Our proposed masking strategy outperforms advanced masking strategies such as span- and PMI-based masking. (An illustrative sketch of the mask-specific loss weighting follows this record.)
Comment: Paper already accepted for publication at the NAACL 2024 conference (main conference paper)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2403.18025
Accession Number: edsarx.2403.18025
Database: arXiv
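
The abstract above describes mask-specific losses in which masked domain-specific (DS) terms incur a larger penalty than masked generic words. Below is a minimal sketch of that loss-weighting idea in PyTorch; the function name mask_specific_loss, the default weight values, and the boolean ds_mask input are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def mask_specific_loss(logits, labels, ds_mask, ds_weight=2.0, generic_weight=1.0):
    # Weighted masked-language-modeling loss: masked domain-specific (DS) tokens
    # are penalised more heavily than masked generic tokens.
    #   logits:  (batch, seq_len, vocab) model predictions at every position
    #   labels:  (batch, seq_len) original token ids; -100 marks unmasked positions
    #   ds_mask: (batch, seq_len) bool, True where a masked token is a DS-term
    per_token = F.cross_entropy(
        logits.view(-1, logits.size(-1)),
        labels.view(-1),
        reduction="none",
        ignore_index=-100,
    ).view(labels.shape)

    # Assign the larger penalty weight to DS-terms and the smaller one to generic words.
    weights = torch.where(
        ds_mask,
        torch.full_like(per_token, ds_weight),
        torch.full_like(per_token, generic_weight),
    )

    masked = labels != -100
    return (per_token * weights)[masked].sum() / weights[masked].sum()

In a fine-tuning loop, ds_mask would be derived from a domain lexicon or entity annotator that flags which masked positions are DS-terms; the ratio between ds_weight and generic_weight is the knob that controls how much harder the model is penalised for mispredicting DS-terms.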