SCV-GNN: Sparse Compressed Vector-based Graph Neural Network Aggregation

Bibliographic Details
Title: SCV-GNN: Sparse Compressed Vector-based Graph Neural Network Aggregation
Authors: Unnikrishnan, Nanda K., Gould, Joe, Parhi, Keshab K.
Source: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems (TCAD), 2023
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Hardware Architecture
Description: Graph neural networks (GNNs) have emerged as a powerful tool for processing graph-based data in fields such as communication networks, molecular interactions, chemistry, social networks, and neuroscience. GNNs are characterized by the ultra-sparse nature of their adjacency matrices, which necessitates dedicated hardware beyond general-purpose sparse matrix multipliers. While there has been extensive research on dedicated hardware accelerators for GNNs, few studies have explored the impact of the sparse storage format on the efficiency of GNN accelerators. This paper proposes SCV-GNN, built on the novel sparse compressed vectors (SCV) format optimized for the aggregation operation. We use Z-Morton ordering to derive a data-locality-based computation ordering and partitioning scheme. The paper also shows how the proposed SCV-GNN scales on a vector processing system. Experimental results over various datasets show that the proposed method achieves geometric mean speedups of $7.96\times$ and $7.04\times$ over compressed sparse column (CSC) and compressed sparse row (CSR) aggregation operations, respectively. The proposed method also reduces memory traffic by factors of $3.29\times$ and $4.37\times$ over CSC and CSR, respectively. Thus, the proposed aggregation format reduces both the latency and the memory accesses of GNN inference.
Document Type: Working Paper
DOI: 10.1109/TCAD.2023.3291672
Access URL: http://arxiv.org/abs/2304.13532
Accession Number: edsarx.2304.13532
Database: arXiv
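
Note: the abstract above compares the proposed SCV format against CSR and CSC aggregation and mentions Z-Morton ordering for data locality. As a point of reference only, the sketch below illustrates CSR-based GNN aggregation (one of the baselines cited in the abstract) and a Morton-key computation for ordering nonzeros. The SCV format itself is not described in this record, so this is not the paper's method; all function and variable names here are illustrative assumptions.

# Minimal sketch, assuming the baseline setup implied by the abstract:
# GNN aggregation as a sparse-dense product (A @ X) over a CSR adjacency
# matrix, plus a Z-Morton key that can be used to reorder nonzeros so that
# nearby (row, col) blocks are processed together.
import numpy as np
from scipy.sparse import csr_matrix

def csr_aggregate(adj: csr_matrix, features: np.ndarray) -> np.ndarray:
    """Row-wise aggregation: out[v] = sum of w(v,u) * features[u] over neighbors u."""
    out = np.zeros((adj.shape[0], features.shape[1]), dtype=features.dtype)
    for row in range(adj.shape[0]):
        start, end = adj.indptr[row], adj.indptr[row + 1]
        for k in range(start, end):
            out[row] += adj.data[k] * features[adj.indices[k]]
    return out

def morton_key(row: int, col: int, bits: int = 16) -> int:
    """Interleave the bits of (row, col) to obtain a Z-Morton curve index."""
    key = 0
    for b in range(bits):
        key |= ((row >> b) & 1) << (2 * b + 1)
        key |= ((col >> b) & 1) << (2 * b)
    return key

# Usage example on a small hypothetical graph: verify the loop matches the
# library sparse-dense product, then sort nonzeros along the Z-curve.
rows = np.array([0, 0, 1, 2, 3])
cols = np.array([1, 3, 2, 0, 3])
vals = np.ones(len(rows), dtype=np.float32)
adj = csr_matrix((vals, (rows, cols)), shape=(4, 4))
feats = np.random.rand(4, 8).astype(np.float32)

print(np.allclose(csr_aggregate(adj, feats), adj @ feats))  # True
z_order = sorted(range(len(rows)), key=lambda k: morton_key(int(rows[k]), int(cols[k])))
print(z_order)  # nonzero indices ordered along the Z-Morton curve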