BioMegatron: Larger Biomedical Domain Language Model

Bibliographic Details
Title: BioMegatron: Larger Biomedical Domain Language Model
Authors: Hoo-Chang Shin, Yang Zhang, Evelina Bakhturina, Raul Puri, Mostofa Patwary, Mohammad Shoeybi, Raghav Mani
Source: EMNLP (1)
Publication Information: arXiv, 2020
Publication Year: 2020
Subject Terms: Text corpus, Vocabulary, Computer science, Domain (software engineering), Set (abstract data type), Named-entity recognition, Question answering, Relationship extraction, Artificial intelligence, Language model, Natural language processing, Computer Science - Computation and Language (cs.CL)
Description: There has been an influx of biomedical domain-specific language models, showing that language models pre-trained on biomedical text perform better on biomedical domain benchmarks than those trained on general domain text corpora such as Wikipedia and Books. Yet most works do not study in depth the factors affecting each domain language application. Additionally, the effect of model size on domain-specific models has been mostly unexamined. We empirically study and evaluate several factors that can affect performance on domain language applications, such as the sub-word vocabulary set, model size, pre-training corpus, and domain transfer. We show consistent improvements on benchmarks with our larger BioMegatron model trained on a larger domain corpus, contributing to our understanding of domain language model applications. We demonstrate noticeable improvements over the previous state-of-the-art (SOTA) on the standard biomedical NLP benchmarks of named entity recognition, relation extraction, and question answering. Model checkpoints and code are available at [https://ngc.nvidia.com] and [https://github.com/NVIDIA/NeMo].
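As a rough illustration of the kind of downstream task the description mentions (biomedical named entity recognition with a BERT-style checkpoint), the minimal sketch below runs token-classification inference with the Hugging Face transformers API. The model identifier is a placeholder, not the paper's distribution path: the BioMegatron checkpoints themselves are published via NGC and NVIDIA NeMo, and a fine-tuned NER head would be required for meaningful predictions.

```python
# Minimal sketch (assumed setup): token-level NER inference with a
# BERT-style biomedical checkpoint. MODEL_NAME is a hypothetical
# placeholder; substitute a real fine-tuned checkpoint before running.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "path-or-hub-id-of-a-biomedical-ner-checkpoint"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME)
model.eval()

text = "Tamoxifen is used in the treatment of breast cancer."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
labels = [model.config.id2label.get(i, str(i)) for i in pred_ids]
for tok, lab in zip(tokens, labels):
    print(f"{tok}\t{lab}")
```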
Comment: Accepted for publication at EMNLP 2020
DOI: 10.48550/arxiv.2010.06060
Access URL: https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a513477f59bbb8a3f2e9946a120ddf8e
Rights: OPEN
Accession Number: edsair.doi.dedup.....a513477f59bbb8a3f2e9946a120ddf8e
Database: OpenAIRE