CrAM: A Compression-Aware Minimizer

Bibliographic Details
Title: CrAM: A Compression-Aware Minimizer
Authors: Peste, Alexandra; Vladu, Adrian; Kurtic, Eldar; Lampert, Christoph H.; Alistarh, Dan
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: Deep neural networks (DNNs) often have to be compressed, via pruning and/or quantization, before they can be deployed in practical settings. In this work we propose a new compression-aware minimizer dubbed CrAM that modifies the optimization step in a principled way, in order to produce models whose local loss behavior is stable under compression operations such as pruning. Thus, dense models trained via CrAM should be compressible post-training, in a single step, without significant accuracy loss. Experimental results on standard benchmarks, such as residual networks for ImageNet classification and BERT models for language modelling, show that CrAM produces dense models that can be more accurate than the standard SGD/Adam-based baselines, but which are stable under weight pruning: specifically, we can prune models in one-shot to 70-80% sparsity with almost no accuracy loss, and to 90% with reasonable ($\sim 1\%$) accuracy loss, which is competitive with gradual compression methods. Additionally, CrAM can produce sparse models which perform well for transfer learning, and it also works for semi-structured 2:4 pruning patterns supported by GPU hardware. The code for reproducing the results is available at https://github.com/IST-DASLab/CrAM. (An illustrative sketch of the one-shot pruning step described here follows this record.)
Comment: Accepted to ICLR 2023
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2207.14200
Accession Number: edsarx.2207.14200
Database: arXiv
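
The abstract above describes compressing a CrAM-trained dense model post-training "in one-shot", and mentions the semi-structured 2:4 pattern supported by GPU hardware. The following is a minimal Python/PyTorch sketch of what such compression operators typically look like; it is not the CrAM optimizer itself, and the helper names, the toy model, and the 80% sparsity level are this sketch's own assumptions, not taken from the paper.

# Illustrative sketch only (not from the paper): one-shot global magnitude
# pruning and semi-structured 2:4 pruning, the two compression operators
# mentioned in the abstract.
import torch
import torch.nn as nn


def one_shot_magnitude_prune(model: nn.Module, sparsity: float) -> None:
    """Zero out the `sparsity` fraction of smallest-magnitude weights,
    chosen globally across all Linear/Conv2d layers, in a single step."""
    weights = [m.weight for m in model.modules()
               if isinstance(m, (nn.Linear, nn.Conv2d))]
    scores = torch.cat([w.detach().abs().flatten() for w in weights])
    k = int(sparsity * scores.numel())
    if k == 0:
        return
    threshold = torch.kthvalue(scores, k).values  # k-th smallest magnitude
    with torch.no_grad():
        for w in weights:
            w.mul_((w.abs() > threshold).float())


def prune_2_4(w: torch.Tensor) -> torch.Tensor:
    """Keep the 2 largest-magnitude entries in each contiguous group of 4
    (the semi-structured 2:4 pattern accelerated by recent GPUs).
    Assumes w.numel() is divisible by 4."""
    groups = w.detach().flatten().view(-1, 4)
    keep = groups.abs().topk(2, dim=1).indices
    mask = torch.zeros_like(groups).scatter_(1, keep, 1.0)
    return (groups * mask).view_as(w)


# Example: one-shot prune a toy dense model to 80% sparsity.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
one_shot_magnitude_prune(model, sparsity=0.8)
linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
density = (sum((m.weight != 0).sum().item() for m in linears)
           / sum(m.weight.numel() for m in linears))
print(f"density after one-shot pruning: {density:.2%}")

The point of CrAM, per the abstract, is that a dense model trained with it should survive exactly this kind of single-step compression with little accuracy loss, whereas standard SGD/Adam-trained models typically require gradual pruning schedules to reach comparable sparsity.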