Report
Refined Statistical Bounds for Classification Error Mismatches with Constrained Bayes Error
| Field | Value |
|---|---|
| Title | Refined Statistical Bounds for Classification Error Mismatches with Constrained Bayes Error |
| Authors | Yang, Zijian; Eminyan, Vahe; Schlüter, Ralf; Ney, Hermann |
| Publication Year | 2024 |
| Collection | Computer Science; Mathematics |
| Subject Terms | Computer Science - Information Theory |
| Description | In statistical classification/multiple hypothesis testing and machine learning, a model distribution estimated from the training data is usually applied in place of the unknown true distribution in the Bayes decision rule, which introduces a mismatch between the Bayes error and the model-based classification error. In this work, we derive classification error bounds to study the relationship between the Kullback-Leibler divergence and the classification error mismatch. We first revisit the statistical bounds on the classification error mismatch derived in previous works, employing a different method of derivation. Then, motivated by the observation that the Bayes error is typically low in machine learning tasks like speech recognition and pattern recognition, we derive a refined Kullback-Leibler-divergence-based bound on the error mismatch under the constraint that the Bayes error is lower than a threshold. Comment: accepted at 2024 IEEE Information Theory Workshop |
| Document Type | Working Paper |
| Access URL | http://arxiv.org/abs/2409.01309 |
| Accession Number | edsarx.2409.01309 |
| Database | arXiv |
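The setting described in the abstract can be illustrated with a minimal numerical sketch: a discrete classification problem in which the decision rule uses a model posterior q(c|x) instead of the true posterior p(c|x), producing a model-based error at least as large as the Bayes error. The toy distributions below are invented for illustration and the snippet only demonstrates the mismatch and the KL divergence D(p||q) that the paper's bounds relate to it; it does not implement the paper's derivation.

```python
import numpy as np

# Toy sketch (assumed distributions, not from the paper): compare the
# Bayes error with the error of a decision rule built from a mismatched
# model distribution, evaluated under the true distribution.

rng = np.random.default_rng(0)

n_x, n_c = 6, 2  # number of observations and classes

# True joint p(x, c) and a perturbed model joint q(x, c).
p_joint = rng.dirichlet(np.ones(n_x * n_c)).reshape(n_x, n_c)
q_joint = 0.8 * p_joint + 0.2 * rng.dirichlet(np.ones(n_x * n_c)).reshape(n_x, n_c)

p_x = p_joint.sum(axis=1)
p_post = p_joint / p_x[:, None]                          # true posterior p(c|x)
q_post = q_joint / q_joint.sum(axis=1, keepdims=True)    # model posterior q(c|x)

# Bayes error: decide argmax_c p(c|x), score under p.
bayes_error = np.sum(p_x * (1.0 - p_post.max(axis=1)))

# Model-based error: decide argmax_c q(c|x), still score under p.
decisions = q_post.argmax(axis=1)
model_error = np.sum(p_x * (1.0 - p_post[np.arange(n_x), decisions]))

# KL divergence between the joint distributions, in nats.
kl = np.sum(p_joint * np.log(p_joint / q_joint))

mismatch = model_error - bayes_error  # always >= 0 by optimality of Bayes rule
print(f"Bayes error   : {bayes_error:.4f}")
print(f"Model error   : {model_error:.4f}")
print(f"Mismatch      : {mismatch:.4f}")
print(f"KL divergence : {kl:.4f}")
```

The mismatch is non-negative because the Bayes rule is optimal under p; the paper's contribution is to bound this mismatch in terms of the KL divergence, with a refinement when the Bayes error is constrained to be small.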