Calibrate: Interactive Analysis of Probabilistic Model Output

Bibliographic Details
Title: Calibrate: Interactive Analysis of Probabilistic Model Output
Authors: Xenopoulos, Peter, Rulff, Joao, Nonato, Luis Gustavo, Barr, Brian, Silva, Claudio
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Human-Computer Interaction, Computer Science - Machine Learning
Description: Analyzing classification model performance is a crucial task for machine learning practitioners. While practitioners often use count-based metrics derived from confusion matrices, like accuracy, many applications, such as weather prediction, sports betting, or patient risk prediction, rely on a classifier's predicted probabilities rather than predicted labels. In these instances, practitioners are concerned with producing a calibrated model, that is, one which outputs probabilities that reflect those of the true distribution. Model calibration is often analyzed visually through static reliability diagrams; however, the traditional calibration visualization may suffer from a variety of drawbacks due to the strong aggregations it necessitates. Furthermore, count-based approaches are unable to sufficiently analyze model calibration. We present Calibrate, an interactive reliability diagram that addresses the aforementioned issues. Calibrate constructs a reliability diagram that is resistant to drawbacks in traditional approaches, and allows for interactive subgroup analysis and instance-level inspection. We demonstrate the utility of Calibrate through use cases on both real-world and synthetic data. We further validate Calibrate by presenting the results of a think-aloud experiment with data scientists who routinely analyze model calibration.
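For context, the static reliability diagram the abstract critiques is conventionally built by binning predicted probabilities and comparing each bin's mean confidence to the observed frequency of the positive class. The sketch below illustrates that traditional binned construction only; it is not the Calibrate tool's method, and the function name and bin count are illustrative assumptions.

```python
import numpy as np

def reliability_diagram(probs, labels, n_bins=10):
    # Illustrative sketch of the *traditional* binned reliability
    # diagram (not Calibrate's approach): bin predicted probabilities,
    # then compute per-bin mean confidence and observed positive rate.
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # Assign each prediction to a bin using the interior edges,
    # so probabilities of exactly 1.0 fall in the last bin.
    bin_ids = np.digitize(probs, edges[1:-1])
    confidences, frequencies = [], []
    for b in range(n_bins):
        mask = bin_ids == b
        if mask.any():
            confidences.append(probs[mask].mean())   # mean predicted prob.
            frequencies.append(labels[mask].mean())  # observed frequency
    return np.array(confidences), np.array(frequencies)
```

A perfectly calibrated model yields frequencies equal to confidences (points on the diagonal); the gap between the two curves is what the diagram visualizes, and the strong per-bin aggregation is the drawback the paper targets.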
Comment: Accepted to IEEE VIS 2022
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2207.13770
Accession Number: edsarx.2207.13770
Database: arXiv