Report
One-Bit Quantization and Sparsification for Multiclass Linear Classification via Regularized Regression
Title: One-Bit Quantization and Sparsification for Multiclass Linear Classification via Regularized Regression
Authors: Ghane, Reza; Akhtiamov, Danil; Hassibi, Babak
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning; Statistics - Machine Learning
Description: We study the use of linear regression for multiclass classification in the over-parametrized regime where some of the training data is mislabeled. In such scenarios it is necessary to add an explicit regularization term, $\lambda f(w)$, for some convex function $f(\cdot)$, to avoid overfitting the mislabeled data. In our analysis, we assume that the data is sampled from a Gaussian Mixture Model with equal class sizes, and that a proportion $c$ of the training labels is corrupted for each class. Under these assumptions, we prove that the best classification performance is achieved when $f(\cdot) = \|\cdot\|^2_2$ and $\lambda \to \infty$. We then proceed to analyze the classification errors for $f(\cdot) = \|\cdot\|_1$ and $f(\cdot) = \|\cdot\|_\infty$ in the large $\lambda$ regime and notice that it is often possible to find sparse and one-bit solutions, respectively, that perform almost as well as the one corresponding to $f(\cdot) = \|\cdot\|_2^2$.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.10474
Accession Number: edsarx.2402.10474
Database: arXiv
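The setting described in the abstract can be illustrated with a minimal numerical sketch. This is not the authors' code or analysis; it simply simulates the over-parametrized regime they study: data drawn from a Gaussian mixture with equal class sizes, a proportion $c$ of labels corrupted, one-hot ridge regression with a large $\lambda$ (approximating the $\lambda \to \infty$ regime for $f(\cdot) = \|\cdot\|_2^2$), and a one-bit classifier obtained by sign-quantizing the ridge weights. All dimensions, the mean scaling, and the corruption model are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch (NOT the paper's code): multiclass linear classification
# on Gaussian Mixture Model data with label noise, comparing the large-lambda
# ridge solution with its one-bit (sign-quantized) counterpart.
import numpy as np

rng = np.random.default_rng(0)
n, d, k, c = 600, 1200, 3, 0.1  # samples, dims (d > n: over-parametrized), classes, corruption

# Class means with O(1) separation (scaling chosen for illustration only)
mu = 3.0 * rng.standard_normal((k, d)) / np.sqrt(d)
y_true = rng.integers(0, k, size=n)
X = mu[y_true] + rng.standard_normal((n, d))

# Corrupt roughly a proportion c of the labels (re-drawn uniformly; a simplification)
y = y_true.copy()
flip = rng.random(n) < c
y[flip] = rng.integers(0, k, size=flip.sum())

# One-hot targets and ridge regression; lambda large, so W ~ X^T Y / lambda,
# mimicking the lambda -> infinity regime analyzed in the paper
Y = np.eye(k)[y]
lam = 1e4
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)  # shape (d, k)

def accuracy(weights):
    """Fraction of points whose argmax score matches the UNCORRUPTED label."""
    return float(np.mean(np.argmax(X @ weights, axis=1) == y_true))

acc_ridge = accuracy(W)
acc_onebit = accuracy(np.sign(W))  # one-bit weights: only the signs are kept
print(f"ridge accuracy: {acc_ridge:.3f}, one-bit accuracy: {acc_onebit:.3f}")
```

In runs of this sketch, the sign-quantized classifier stays well above the 1/3 chance level and not far below the full ridge solution, loosely echoing the paper's observation that one-bit solutions can perform almost as well as the $\ell_2^2$-regularized one (the paper's actual result concerns $f(\cdot) = \|\cdot\|_\infty$ regularization, which this sketch does not implement).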