Towards the Application of Backpropagation-Free Graph Convolutional Networks on Huge Datasets

Nicolò Navarin; Luca Pasa; Alessandro Sperduti
2024

Abstract

Backpropagation-Free Graph Convolutional Networks (BF-GCNs) are neural models for graph data, based on Gated Linear Networks, that are trained without backpropagation. Each neuron in a BF-GCN is defined as a set of graph convolution filters (weight vectors) and a gating mechanism that, given a node's context, selects the weight vector to use for processing the node's attributes based on the context's distance from a set of prototypes. Given the higher expressivity of BF-GCN neurons compared to those of standard graph convolutional neural networks, they have a larger memory footprint. In this paper, we explore how reducing the size of node contexts through randomization can reduce the memory occupancy of the method, enabling its application to huge datasets. We empirically show that working with very low-dimensional contexts does not impact the resulting predictive performance.
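For intuition only, below is a minimal sketch (not the authors' implementation) of how a single BF-GCN-style neuron can be read from the description above: the node context is reduced through a random projection, the nearest prototype is selected, and the corresponding weight vector (graph convolution filter) is applied to the node's attributes. All names, dimensions, the definition of the context, and the toy sizes are illustrative assumptions.

    # Hypothetical sketch of one BF-GCN-style neuron: gate on a randomly
    # reduced node context, then apply the selected graph-convolution filter.
    # Everything here (shapes, how contexts are obtained) is an assumption.
    import numpy as np

    rng = np.random.default_rng(0)

    n_nodes, in_dim = 6, 8           # toy graph size / node-attribute dimension
    ctx_dim, low_dim = 16, 4         # original vs. randomly reduced context size
    n_prototypes = 3                 # one weight vector per prototype

    X = rng.normal(size=(n_nodes, in_dim))    # node attributes
    C = rng.normal(size=(n_nodes, ctx_dim))   # node contexts (assumed precomputed)
    R = rng.normal(size=(ctx_dim, low_dim)) / np.sqrt(low_dim)  # random projection

    prototypes = rng.normal(size=(n_prototypes, low_dim))  # gating prototypes
    W = rng.normal(size=(n_prototypes, in_dim))            # one filter per prototype

    def neuron_output(x, c):
        """Select the filter whose prototype is closest to the reduced context."""
        c_low = c @ R                                               # randomized reduction
        k = np.argmin(np.linalg.norm(prototypes - c_low, axis=1))  # nearest prototype
        return W[k] @ x                                             # apply selected filter

    outputs = np.array([neuron_output(X[i], C[i]) for i in range(n_nodes)])
    print(outputs.shape)  # (6,) -- one scalar activation per node for this neuron

Under this reading, the prototypes are stored in the reduced low_dim space rather than the original ctx_dim space, which is what lowers the per-neuron memory footprint while leaving the filters themselves untouched.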
ESANN 2024 proceedings, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
ISBN: 978-2-87587-090-2

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3537438