Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation

Bibliographic Details
Title: Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation
Authors: Edman, Lukas, Sarti, Gabriele, Toral, Antonio, van Noord, Gertjan, Bisazza, Arianna
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: Pretrained character-level and byte-level language models have been shown to be competitive with popular subword models across a range of Natural Language Processing (NLP) tasks. However, there has been little research on their effectiveness for neural machine translation (NMT), particularly within the popular pretrain-then-finetune paradigm. This work performs an extensive comparison across multiple languages and experimental conditions of character- and subword-level pretrained models (ByT5 and mT5, respectively) on NMT. We show the effectiveness of character-level modeling in translation, particularly in cases where fine-tuning data is limited. In our analysis, we show how character models' gains in translation quality are reflected in better translations of orthographically similar words and rare words. While evaluating the importance of source texts in driving model predictions, we highlight word-level patterns within ByT5, suggesting an ability to modulate word-level and character-level information during generation. We conclude by assessing the efficiency tradeoff of byte models, suggesting their usage in non-time-critical scenarios to boost translation quality.
Comment: This version of our work is a pre-MIT Press publication version
Document Type: Working Paper
DOI: 10.1162/tacl_a_00651
Access URL: http://arxiv.org/abs/2302.14220
Accession Number: edsarx.2302.14220
Database: arXiv
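
The description above refers to the pretrain-then-finetune paradigm for ByT5 and mT5. As a purely illustrative aside, the lines below sketch how such fine-tuning might look with the Hugging Face Transformers library; the checkpoints, language pair, and single training step are assumptions for illustration, not the paper's actual experimental setup.

# Minimal sketch (assumes Hugging Face Transformers >= 4.21 and PyTorch are installed).
# Checkpoints, language pair, and hyperparameters are illustrative, not the paper's setup.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/byt5-small"   # byte-level; swap in "google/mt5-small" for the subword baseline
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A single (source, target) pair; English-German is assumed here for illustration.
src = "The weather is nice today."
tgt = "Das Wetter ist heute schön."

inputs = tokenizer(src, return_tensors="pt")
labels = tokenizer(text_target=tgt, return_tensors="pt").input_ids

# One fine-tuning step; in practice this would run inside a Seq2SeqTrainer or a full training loop.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()

# Generation with the (partly) fine-tuned model. Byte-level inputs yield much longer
# sequences than subword inputs, which is the efficiency tradeoff the abstract refers to.
generated = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(generated[0], skip_special_tokens=True))

Because ByT5 encodes and decodes raw UTF-8 bytes rather than subword tokens, its sequences are several times longer than mT5's for the same sentence, which is why the abstract recommends byte-level models mainly for non-time-critical scenarios.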