$\alpha$-Divergence Loss Function for Neural Density Ratio Estimation

Bibliographic Details
Title: $\alpha$-Divergence Loss Function for Neural Density Ratio Estimation
Authors: Kitazawa, Yoshiaki
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Statistics - Machine Learning, Computer Science - Machine Learning
Description: Recently, neural networks have produced state-of-the-art results for density-ratio estimation (DRE), a fundamental technique in machine learning. However, existing methods suffer from optimization issues arising from the loss functions of DRE: the large sample requirement of the Kullback--Leibler (KL) divergence, vanishing training-loss gradients, and biased gradients of the loss functions. Thus, an $\alpha$-divergence loss function ($\alpha$-Div) that offers concise implementation and stable optimization is proposed in this paper. Furthermore, technical justifications for the proposed loss function are presented. The stability of the proposed loss function is empirically demonstrated, and its estimation accuracy on DRE tasks is investigated. Additionally, this study presents a sample requirement for DRE using the proposed loss function in terms of the upper bound of the $L_1$ error, which connects to the curse of dimensionality, a common problem in high-dimensional DRE tasks.
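To make the described setup concrete, below is a minimal sketch of an $\alpha$-divergence-style DRE objective in PyTorch. It assumes a variational form whose pointwise minimizer over $T$ is the log density ratio $\log p/q$; the paper's exact $\alpha$-Div formulation may differ, and the network architecture, distributions, and hyperparameters here are illustrative assumptions only.

```python
import torch
import torch.nn as nn

def alpha_dre_loss(t_p: torch.Tensor, t_q: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Illustrative alpha-divergence-style DRE objective (not necessarily the paper's exact alpha-Div).

    t_p: network outputs T(x) on samples x ~ P (numerator distribution).
    t_q: network outputs T(x) on samples x ~ Q (denominator distribution).
    For 0 < alpha < 1, the population objective
        (1/alpha) E_P[exp(-alpha T)] + (1/(1-alpha)) E_Q[exp((1-alpha) T)]
    is minimized pointwise at T(x) = log p(x)/q(x), so exp(T) estimates the ratio p/q.
    """
    return (torch.exp(-alpha * t_p).mean() / alpha
            + torch.exp((1.0 - alpha) * t_q).mean() / (1.0 - alpha))

# Toy usage: estimate the log density ratio between two 1-D Gaussians,
# P = N(1, 1) and Q = N(0, 1), for which the true log ratio is x - 0.5.
if __name__ == "__main__":
    torch.manual_seed(0)
    net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))  # models T(x) ~ log p/q
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(2000):
        x_p = torch.randn(256, 1) + 1.0   # samples from P
        x_q = torch.randn(256, 1)         # samples from Q
        loss = alpha_dre_loss(net(x_p), net(x_q), alpha=0.5)
        opt.zero_grad()
        loss.backward()
        opt.step()
    x = torch.tensor([[1.5]])
    print("estimated log-ratio at x=1.5:", net(x).item(), "(true value: 1.0)")
```

The objective is convex in $T$ pointwise for $0 < \alpha < 1$, which is one way such losses avoid the heavy-tailed behavior of plain KL-based DRE objectives; this sketch is offered only as an assumed instance of that general idea.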
Comment: $\mathcal{T}_{\text{Lip}}$ in Theorem 7.1 (Theorem B.15) was changed to the set of all locally Lipschitz continuous functions. In the previous version, $\mathcal{T}_{\text{Lip}}$ was defined as the set of all Lipschitz continuous functions, which is unsuitable for the statement of case (ii) in the theorem.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.02041
Accession Number: edsarx.2402.02041
Database: arXiv