Topology Distillation for Recommender System

Bibliographic Details
Title: Topology Distillation for Recommender System
Authors: Kang, SeongKu; Hwang, Junyoung; Kweon, Wonbin; Yu, Hwanjo
Publication Year: 2021
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Information Retrieval
Description: Recommender Systems (RS) have employed knowledge distillation, a model compression technique that trains a compact student model with knowledge transferred from a pre-trained large teacher model. Recent work has shown that transferring knowledge from the teacher's intermediate layer significantly improves the recommendation quality of the student. However, these methods transfer the knowledge of individual representations point-wise, overlooking the fact that the primary information of RS lies in the relations in the representation space. This paper proposes a new topology distillation approach that guides the student by transferring the topological structure built upon the relations in the teacher space. We first observe that simply making the student learn the whole topological structure is not always effective and can even degrade the student's performance. We demonstrate that because the capacity of the student is highly limited compared to that of the teacher, learning the whole topological structure is daunting for the student. To address this issue, we propose a novel method named Hierarchical Topology Distillation (HTD), which distills the topology hierarchically to cope with the large capacity gap. Our extensive experiments on real-world datasets show that the proposed method significantly outperforms the state-of-the-art competitors. We also provide in-depth analyses to ascertain the benefit of distilling the topology for RS.
Comment: KDD 2021. 9 pages + appendix (2 pages). 8 figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2106.08700
Accession Number: edsarx.2106.08700
Database: arXiv
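The core idea described in the abstract, matching the relational structure of the student's representation space to that of the teacher's rather than matching individual representations point-wise, can be illustrated with a minimal sketch. This is not the paper's HTD algorithm; it is a hypothetical "full topology" distillation loss, assuming the topology is captured by a pairwise cosine-similarity matrix over entity embeddings (function names and dimensions are illustrative only):

```python
import numpy as np

def topology_matrix(emb):
    # Pairwise cosine similarities: one way to encode the "topology"
    # (relational structure) of an embedding space.
    norm = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    return norm @ norm.T

def topology_distillation_loss(teacher_emb, student_emb):
    # Penalize discrepancies between the teacher's and student's
    # relational structures (mean squared difference of the two
    # similarity matrices), independent of embedding dimension.
    t = topology_matrix(teacher_emb)
    s = topology_matrix(student_emb)
    return np.mean((t - s) ** 2)

# Toy example: 4 users, teacher dim 8, compact student dim 2.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 8))
student = rng.normal(size=(4, 2))
loss = topology_distillation_loss(teacher, student)
print(loss)
```

Because the loss compares similarity matrices rather than raw embeddings, the teacher and student may use different embedding dimensions, which is exactly the model-compression setting the abstract describes. The paper's observation that learning the *whole* topology can overwhelm a small student is what motivates distilling it hierarchically instead.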