High-Performance and Programmable Attentional Graph Neural Networks with Global Tensor Formulations

Flavio Vella
2023-01-01

Abstract

Graph attention models (A-GNNs), a type of Graph Neural Network (GNN), have been shown to be more powerful than simpler convolutional GNNs (C-GNNs). However, A-GNNs are harder to program and difficult to scale. To address this, we develop a novel mathematical formulation, based on tensors that group all the feature vectors, targeting both training and inference of A-GNNs. The formulation enables straightforward adoption of communication-minimizing routines, fosters optimizations such as vectorization, and allows seamless integration with established linear algebra DSLs and libraries such as GraphBLAS. Our implementation uses a data redistribution scheme developed explicitly for the sparse-dense tensor operations that dominate GNN workloads, together with fusion optimizations that further reduce memory usage and communication cost. We prove asymptotic reductions in communicated data compared to the established message-passing GNN paradigm. Finally, our implementation scales well and achieves speedups of up to 4-5x over modern libraries such as the Deep Graph Library.
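For illustration, below is a minimal single-node sketch (not the authors' distributed implementation) of one attentional GNN layer expressed as global tensor operations of the kind the abstract alludes to: one dense projection of all feature vectors at once, an SDDMM-like computation of attention scores restricted to the graph's sparsity pattern, a sparse row-wise softmax, and an SpMM aggregation. The function name, parameter names, and the NumPy/SciPy realization are assumptions made for exposition.

    # Sketch: a GAT-style attention layer as global tensor operations.
    # Assumed names (gat_layer, W, a_src, a_dst); not the paper's API.
    import numpy as np
    import scipy.sparse as sp

    def gat_layer(A: sp.csr_matrix, H: np.ndarray, W: np.ndarray,
                  a_src: np.ndarray, a_dst: np.ndarray,
                  slope: float = 0.2) -> np.ndarray:
        """One attention layer over all n feature vectors at once.

        A: n x n sparse adjacency (CSR); H: n x f input features;
        W: f x f' weights; a_src, a_dst: length-f' attention vectors.
        """
        Z = H @ W                          # project all features (n x f')
        s = Z @ a_src                      # per-source attention term (n,)
        t = Z @ a_dst                      # per-destination attention term (n,)

        # SDDMM-like step: materialize scores only on existing edges.
        rows, cols = A.nonzero()
        e = s[rows] + t[cols]
        e = np.where(e > 0, e, slope * e)  # LeakyReLU
        E = sp.csr_matrix((np.exp(e), (rows, cols)), shape=A.shape)

        # Row-wise softmax over each vertex's neighborhood.
        row_sums = np.asarray(E.sum(axis=1)).ravel()
        alpha = sp.diags(1.0 / np.maximum(row_sums, 1e-12)) @ E

        return alpha @ Z                   # SpMM: aggregate neighbor features

    # Toy usage: a 4-vertex directed cycle with random features.
    n, f, f2 = 4, 8, 16
    rng = np.random.default_rng(0)
    A = sp.csr_matrix(np.roll(np.eye(n), 1, axis=1))
    H_out = gat_layer(A, rng.standard_normal((n, f)),
                      rng.standard_normal((f, f2)),
                      rng.standard_normal(f2), rng.standard_normal(f2))

The SDDMM- and SpMM-style primitives in this sketch are the kind of sparse-dense tensor operations that, per the abstract, map naturally onto GraphBLAS and onto communication-minimizing distributed routines.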
Year: 2023
Published in: SC '23: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis
Place of publication: New York, USA
Publisher: Association for Computing Machinery, Inc.
ISBN: 9798400701092
High-Performance and Programmable Attentional Graph Neural Networks with Global Tensor Formulations / Besta, Maciej; Renc, Pawel; Gerstenberger, Robert; Sylos Labini, Paolo; Ziogas, Alexandros; Chen, Tiancheng; Gianinazzi, Lukas; Scheidl, Florian; Szenes, Kalman; Carigiet, Armon; Iff, Patrick; Kwasniewski, Grzegorz; Kanakagiri, Raghavendra; Ge, Chio; Jaeger, Sammy; Wąs, Jarosław; Vella, Flavio; Hoefler, Torsten. - 66 (2023), pp. 1-16. (2023 International Conference for High Performance Computing, Networking, Storage and Analysis, SC 2023, United States of America, 2023) [10.1145/3581784.3607067].
Files in this record:
File: 3581784.3607067.pdf
Access: repository managers only
Type: publisher's version (publisher's layout)
License: all rights reserved
Size: 2.75 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/401009
Citations:
  • PMC: ND
  • Scopus: 1
  • Web of Science: 0
  • OpenAlex: 3