Accelerating Sparse DNNs Based on Tiled GEMM

Bibliographic Details
Title: Accelerating Sparse DNNs Based on Tiled GEMM
Authors: Guo, Cong; Xue, Fengchen; Leng, Jingwen; Qiu, Yuxian; Guan, Yue; Cui, Weihao; Chen, Quan; Guo, Minyi
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Distributed, Parallel, and Cluster Computing
Description: Network pruning can reduce the computation cost of deep neural network (DNN) models. However, sparse models often produce randomly distributed weights to maintain accuracy, leading to irregular computations. Consequently, unstructured sparse models cannot achieve meaningful speedups on commodity hardware built for dense matrix computations. Accelerators are therefore usually modified or designed with architectures optimized for structured sparsity. For example, the Ampere architecture introduces a sparse tensor core, which adopts the 2:4 sparsity pattern. We propose a pruning method built on the insight that matrix multiplication generally breaks a large matrix into multiple smaller tiles for parallel execution. We present the tile-wise sparsity pattern, which maintains a structured sparsity pattern at the tile level for efficient execution while allowing irregular pruning at the global scale to maintain high accuracy. In addition, tile-wise sparsity is implemented at the global memory level, whereas 2:4 sparsity executes at the register level inside the sparse tensor core. We combine these two patterns into a tile-vector-wise (TVW) sparsity pattern to exploit more fine-grained sparsity and further accelerate sparse DNN models. We evaluate TVW on the GPU, achieving average speedups of $1.85\times$, $2.75\times$, and $22.18\times$ over the dense model, block sparsity, and unstructured sparsity, respectively.
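Illustrative note: the 2:4 pattern referenced in the description keeps at most two non-zero weights in every group of four consecutive weights. Below is a minimal sketch of how such a magnitude-based 2:4 mask could be derived; it assumes NumPy, and the function name prune_2_to_4 and the toy array shapes are hypothetical illustrations, not the authors' implementation.

    import numpy as np

    def prune_2_to_4(weights: np.ndarray) -> np.ndarray:
        # Hypothetical sketch: zero out the two smallest-magnitude entries
        # in each consecutive group of four weights (assumes the total
        # number of weights is a multiple of 4).
        w = weights.reshape(-1, 4).copy()
        drop = np.argsort(np.abs(w), axis=1)[:, :2]  # indices of the 2 smallest |w| per group
        np.put_along_axis(w, drop, 0.0, axis=1)
        return w.reshape(weights.shape)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w = rng.standard_normal((4, 8)).astype(np.float32)
        print(prune_2_to_4(w))  # each group of 4 now has at most 2 non-zeros

Tile-wise sparsity, as described above, applies structured pruning at the granularity of GEMM tiles rather than 4-element vectors; TVW combines the two levels of granularity.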
Comment: Accepted by IEEE Transactions on Computers. arXiv admin note: substantial text overlap with arXiv:2008.13006
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2402.10876
Accession Number: edsarx.2402.10876
Database: arXiv