Report
Realizable $H$-Consistent and Bayes-Consistent Loss Functions for Learning to Defer
| Field | Value |
|---|---|
| Title | Realizable $H$-Consistent and Bayes-Consistent Loss Functions for Learning to Defer |
| Authors | Mao, Anqi; Mohri, Mehryar; Zhong, Yutao |
| Publication Year | 2024 |
| Collection | Computer Science; Statistics |
| Subject Terms | Computer Science - Machine Learning; Statistics - Machine Learning |
| Description | We present a comprehensive study of surrogate loss functions for learning to defer. We introduce a broad family of surrogate losses, parameterized by a non-increasing function $\Psi$, and establish their realizable $H$-consistency under mild conditions. For cost functions based on classification error, we further show that these losses admit $H$-consistency bounds when the hypothesis set is symmetric and complete, a property satisfied by common neural network and linear function hypothesis sets. Our results also resolve an open question raised in previous work (Mozannar et al., 2023) by proving the realizable $H$-consistency and Bayes-consistency of a specific surrogate loss. Furthermore, we identify choices of $\Psi$ that lead to $H$-consistent surrogate losses for any general cost function, thus achieving Bayes-consistency, realizable $H$-consistency, and $H$-consistency bounds simultaneously. We also investigate the relationship between $H$-consistency bounds and realizable $H$-consistency in learning to defer, highlighting key differences from standard classification. Finally, we empirically evaluate our proposed surrogate losses and compare them with existing baselines. |
| Document Type | Working Paper |
| Access URL | http://arxiv.org/abs/2407.13732 |
| Accession Number | edsarx.2407.13732 |
| Database | arXiv |
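In the learning-to-defer setting described in the abstract, a predictor outputs scores over the class labels plus one extra "defer" option, and a surrogate loss trades off classification against the cost of deferring to an expert. As a rough illustration only (this is a generic cross-entropy-style deferral surrogate from this line of work, not the paper's $\Psi$-parameterized family; the function names are assumptions), a minimal sketch might look like:

```python
import numpy as np

def log_softmax(z):
    # Numerically stable log-softmax over a 1-D score vector.
    z = z - np.max(z)
    return z - np.log(np.sum(np.exp(z)))

def defer_surrogate(logits, y, expert_correct):
    """Illustrative cross-entropy-style surrogate for learning to defer.

    logits: array of length n_classes + 1; the last entry scores the
            defer option.
    y: index of the true class label.
    expert_correct: whether the expert is correct on this example
            (a simple cost function based on classification error).
    """
    ls = log_softmax(np.asarray(logits, dtype=float))
    loss = -ls[y]              # penalize low score on the true label
    if expert_correct:
        loss -= ls[-1]         # also reward deferring when the expert is right
    return loss
```

For example, `defer_surrogate([2.0, 0.0, 0.0], y=0, expert_correct=False)` is small because the true class already has the highest score, while shifting mass onto a wrong class increases the loss; when the expert is correct, the extra term pushes score toward the defer option.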