TexShape: Information Theoretic Sentence Embedding for Language Models

Bibliographic Details
Title: TexShape: Information Theoretic Sentence Embedding for Language Models
Authors: Kale, Kaan; Esfahanizadeh, Homa; Elias, Noel; Baser, Oguzhan; Medard, Muriel; Vishwanath, Sriram
Publication Year: 2024
Collection: Computer Science; Mathematics
Subject Terms: Computer Science - Computation and Language; Computer Science - Information Theory
Description: With the exponential growth in data volume and the emergence of data-intensive applications, particularly in the field of machine learning, concerns related to resource utilization, privacy, and fairness have become paramount. This paper focuses on the textual domain of data and addresses challenges regarding encoding sentences to their optimized representations through the lens of information theory. In particular, we use empirical estimates of mutual information, using the Donsker-Varadhan definition of Kullback-Leibler divergence. Our approach leverages this estimation to train an information-theoretic sentence embedding, called TexShape, for (task-based) data compression or for filtering out sensitive information, enhancing privacy and fairness. In this study, we employ a benchmark language model for initial text representation, complemented by neural networks for information-theoretic compression and mutual information estimation. Our experiments demonstrate significant advancements in preserving maximal targeted information and minimal sensitive information over adverse compression ratios, in terms of the predictive accuracy of downstream models that are trained using the compressed data.
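The abstract's core tool is the Donsker-Varadhan (DV) lower bound on mutual information, I(X;Z) >= E_P[T] - log E_{P_X x P_Z}[exp(T)], maximized over critic functions T. A minimal sketch of how this bound is evaluated from samples, under assumed toy conditions: the paper trains a neural critic, whereas here, purely for illustration, we plug in the analytically optimal critic for a correlated bivariate Gaussian pair, so the empirical bound lands near the true mutual information.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.6        # assumed toy correlation between X and Z
n = 200_000

# Samples from the joint distribution P(X, Z): jointly Gaussian, correlation rho
x = rng.standard_normal(n)
z = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# Samples from the product of marginals: shuffling z breaks the dependence
z_ind = rng.permutation(z)

def critic(x, z):
    """Optimal DV critic T*(x, z) = log p(x, z) / (p(x) p(z)) for this Gaussian pair.

    In TexShape this role is played by a trained neural network; the closed
    form is used here only so the sketch needs no optimization loop.
    """
    return (-0.5 * np.log(1.0 - rho**2)
            + (2.0 * rho * x * z - rho**2 * (x**2 + z**2)) / (2.0 * (1.0 - rho**2)))

# Donsker-Varadhan bound: I(X; Z) >= E_P[T] - log E_{P_X x P_Z}[exp(T)]
mi_hat = critic(x, z).mean() - np.log(np.exp(critic(x, z_ind)).mean())
mi_true = -0.5 * np.log(1.0 - rho**2)   # closed-form MI for bivariate Gaussians
print(f"DV estimate: {mi_hat:.3f} nats (analytic MI: {mi_true:.3f} nats)")
```

With a learned critic the bound is tight only at the optimum, which is why MINE-style estimators maximize it by gradient ascent; the shuffle trick above is the standard way to obtain product-of-marginals samples from a minibatch.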
Comment: Submitted to the 2024 IEEE International Symposium on Information Theory
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.05132
Accession Number: edsarx.2402.05132
Database: arXiv