OCHADAI-KYOTO at SemEval-2021 Task 1: Enhancing Model Generalization and Robustness for Lexical Complexity Prediction

Bibliographic Details
Title: OCHADAI-KYOTO at SemEval-2021 Task 1: Enhancing Model Generalization and Robustness for Lexical Complexity Prediction
Authors: Taya, Yuki; Pereira, Lis Kanashiro; Cheng, Fei; Kobayashi, Ichiro
Publication Year: 2021
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: We propose an ensemble model for predicting the lexical complexity of words and multiword expressions (MWEs). The model receives as input a sentence with a target word or MWE and outputs its complexity score. Given that a key challenge with this task is the limited size of annotated data, our model relies on pretrained contextual representations from different state-of-the-art transformer-based language models (i.e., BERT and RoBERTa), and on a variety of training methods for further enhancing model generalization and robustness: multi-step fine-tuning, multi-task learning, and adversarial training. Additionally, we propose to enrich contextual representations by adding hand-crafted features during training. Our model achieved competitive results and ranked among the top-10 systems in both sub-tasks.
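The description combines three concrete ingredients: contextual embeddings from multiple pretrained encoders, enrichment with hand-crafted features, and ensembling of the resulting predictions. A minimal sketch of that pipeline is shown below; all names, feature choices, and dimensions are illustrative, and a plain weight vector stands in for each trained regression head (the actual system fine-tunes BERT and RoBERTa end to end):

```python
import numpy as np

def handcrafted_features(word, corpus_freq):
    # Hypothetical hand-crafted features: word length and log corpus frequency.
    return np.array([len(word), np.log1p(corpus_freq.get(word, 0.0))])

def ensemble_complexity(contextual_embeddings, features, heads):
    # contextual_embeddings: one vector per base encoder (e.g., BERT, RoBERTa),
    # each paired in `heads` with a weight vector standing in for its regressor.
    scores = []
    for emb, w in zip(contextual_embeddings, heads):
        x = np.concatenate([emb, features])   # enrich with hand-crafted features
        scores.append(1.0 / (1.0 + np.exp(-x @ w)))  # sigmoid -> score in (0, 1)
    return float(np.mean(scores))             # simple ensemble: average the scores

# Illustrative usage with toy 4-dimensional "embeddings"
corpus_freq = {"serene": 120.0}
feats = handcrafted_features("serene", corpus_freq)
embs = [np.zeros(4), np.ones(4)]
heads = [np.full(6, 0.1), np.full(6, -0.1)]
score = ensemble_complexity(embs, feats, heads)
```

Averaging per-model scores rather than concatenating all embeddings into one regressor keeps each head small, which matters when annotated data is scarce, as the description notes.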
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2105.05535
Accession Number: edsarx.2105.05535
Database: arXiv