Mapping Transformer Leveraged Embeddings for Cross-Lingual Document Representation

Bibliographic Details
Title: Mapping Transformer Leveraged Embeddings for Cross-Lingual Document Representation
Authors: Tashu, Tsegaye Misikir; Kontos, Eduard-Raul; Sabatelli, Matthia; Valdenegro-Toro, Matias
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Artificial Intelligence, Computer Science - Information Retrieval, Computer Science - Machine Learning
Description: Recommendation systems for documents have become important tools for finding relevant content on the Web. However, these systems are limited when it comes to recommending documents in languages other than the query language, so they may overlook resources in non-native languages. This research focuses on representing documents across languages by using Transformer Leveraged Document Representations (TLDRs) that are mapped to a cross-lingual domain. Four multilingual pre-trained transformer models (mBERT, mT5, XLM-RoBERTa, ErnieM) were evaluated using three mapping methods across 20 language pairs, representing combinations of five selected languages of the European Union. Metrics such as Mate Retrieval Rate and Reciprocal Rank were used to measure the effectiveness of mapped TLDRs compared to non-mapped ones. The results highlight the effectiveness of cross-lingual representations achieved through pre-trained transformers and mapping approaches, suggesting a promising direction for expanding beyond connections between two specific languages. (A minimal sketch of such a mapping and these metrics follows this record.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2401.06583
Accession Number: edsarx.2401.06583
Database: arXiv
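
The record above describes the approach only at a high level and does not specify the mapping methods or metric definitions. The sketch below is a minimal, self-contained illustration of one plausible setup: an orthogonal (Procrustes) linear mapping fitted between source- and target-language document embeddings, evaluated with Mate Retrieval Rate (precision at rank 1) and Mean Reciprocal Rank over aligned document pairs. The orthogonal mapping, the train/test split, and the random placeholder embeddings (standing in for pooled transformer outputs such as mBERT or XLM-RoBERTa) are illustrative assumptions, not the paper's exact method.

import numpy as np
from scipy.linalg import orthogonal_procrustes


def fit_orthogonal_map(src_train: np.ndarray, tgt_train: np.ndarray) -> np.ndarray:
    # Least-squares orthogonal map W such that src_train @ W approximates tgt_train.
    W, _ = orthogonal_procrustes(src_train, tgt_train)
    return W


def mate_retrieval_metrics(mapped_src: np.ndarray, tgt: np.ndarray):
    # Mate Retrieval Rate (rank-1 accuracy) and Mean Reciprocal Rank for
    # aligned document pairs: document i in the source language has its
    # "mate" at index i in the target language.
    a = mapped_src / np.linalg.norm(mapped_src, axis=1, keepdims=True)
    b = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    sims = a @ b.T                       # cosine similarities, shape (n, n)
    order = np.argsort(-sims, axis=1)    # target indices sorted by similarity
    ranks = np.array([np.where(order[i] == i)[0][0] + 1 for i in range(len(sims))])
    mate_retrieval_rate = float(np.mean(ranks == 1))
    mrr = float(np.mean(1.0 / ranks))
    return mate_retrieval_rate, mrr


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_docs, dim = 2000, 768  # 768 matches the base hidden size of mBERT/XLM-RoBERTa

    # Placeholder TLDRs: random vectors stand in for pooled transformer document
    # embeddings so the sketch runs without any model downloads. The target-language
    # space is simulated as a rotation of the source space plus noise.
    src = rng.normal(size=(n_docs, dim))
    true_rotation, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
    tgt = src @ true_rotation + 0.05 * rng.normal(size=(n_docs, dim))

    # Fit the mapping on half of the aligned pairs, evaluate on the held-out half.
    split = n_docs // 2
    W = fit_orthogonal_map(src[:split], tgt[:split])
    rate, mrr = mate_retrieval_metrics(src[split:] @ W, tgt[split:])
    print(f"Mate Retrieval Rate: {rate:.3f}  MRR: {mrr:.3f}")

In this sketch the mapping is constrained to be orthogonal, a common choice for cross-lingual embedding alignment because it preserves distances in the original embedding space; other linear or non-linear mappings could be substituted in fit_orthogonal_map without changing the evaluation code.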