IntTower: the Next Generation of Two-Tower Model for Pre-Ranking System

Bibliographic Details
Title: IntTower: the Next Generation of Two-Tower Model for Pre-Ranking System
Authors: Li, Xiangyang, Chen, Bo, Guo, HuiFeng, Li, Jingjie, Zhu, Chenxu, Long, Xiang, Li, Sujian, Wang, Yichao, Guo, Wei, Mao, Longxia, Liu, Jinxing, Dong, Zhenhua, Tang, Ruiming
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Information Retrieval
Description: Scoring a large number of candidates precisely in several milliseconds is vital for industrial pre-ranking systems. Existing pre-ranking systems primarily adopt the two-tower model, since its "user-item decoupling architecture" paradigm is able to balance efficiency and effectiveness. However, the cost of this high efficiency is the neglect of potential information interaction between the user and item towers, which critically hinders prediction accuracy. In this paper, we show it is possible to design a two-tower model that emphasizes both information interactions and inference efficiency. The proposed model, IntTower (short for Interaction enhanced Two-Tower), consists of Light-SE, FE-Block and CIR modules. Specifically, the lightweight Light-SE module identifies the importance of different features and obtains refined feature representations in each tower; the FE-Block module performs fine-grained and early feature interactions to capture the interactive signals between user and item towers explicitly; and the CIR module leverages a contrastive interaction regularization to further enhance the interactions implicitly. Experimental results on three public datasets show that IntTower outperforms the SOTA pre-ranking models significantly and even achieves performance comparable to ranking models. Moreover, we further verify the effectiveness of IntTower on a large-scale advertisement pre-ranking system. The code of IntTower is publicly available at https://github.com/archersama/IntTower
Comment: Accepted by CIKM 2022; DLP-KDD best paper
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2210.09890
Accession Number: edsarx.2210.09890
Database: arXiv
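The "user-item decoupling" paradigm the abstract describes can be sketched as follows: each tower encodes its side independently, so item embeddings can be precomputed offline and online scoring reduces to one user-tower forward pass plus a dot product over all candidates. The single-layer towers, dimensions, and random weights below are illustrative assumptions, not IntTower's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_OUT = 8, 4

# Illustrative tower weights (a real system would learn these).
W_user = rng.normal(size=(D_IN, D_OUT))
W_item = rng.normal(size=(D_IN, D_OUT))

def user_tower(x):
    # One linear layer + ReLU stands in for the user tower.
    return np.maximum(x @ W_user, 0.0)

def item_tower(x):
    # Same structure for the item tower; no cross-tower interaction here,
    # which is exactly the limitation the paper addresses.
    return np.maximum(x @ W_item, 0.0)

# Offline: precompute embeddings for the whole candidate pool.
items = rng.normal(size=(1000, D_IN))
item_emb = item_tower(items)            # shape (1000, D_OUT)

# Online: one user forward pass, then a matrix-vector product scores
# every candidate at once, which is what keeps pre-ranking fast.
user = rng.normal(size=(D_IN,))
scores = item_emb @ user_tower(user)    # shape (1000,)
top10 = np.argsort(-scores)[:10]
print(top10)
```

IntTower keeps this decoupled serving path but adds explicit (FE-Block) and implicit (CIR) interactions at training time to recover the accuracy lost by scoring with independent towers.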