Towards the Application of Backpropagation-Free Graph Convolutional Networks on Huge Datasets
Nicolò Navarin, Luca Pasa, Alessandro Sperduti
2024
Abstract
Backpropagation-Free Graph Convolutional Networks (BF-GCNs) are neural models for graph data, based on Gated Linear Networks, that are trained without backpropagation. Each neuron in a BF-GCN is defined as a set of graph convolution filters (weight vectors) and a gating mechanism that, given a node's context, selects the weight vector to use for processing the node's attributes based on the context's distance from a set of prototypes. Since BF-GCN neurons are more expressive than those of standard graph convolutional networks, they have a larger memory footprint. In this paper, we explore how reducing the size of node contexts through randomization can lower the memory occupancy of the method, enabling its application to huge datasets. We empirically show that working with very low-dimensional contexts does not impact the resulting predictive performance.
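As a rough illustration of the mechanism described in the abstract, the sketch below implements a single prototype-gated neuron with a randomized reduction of the node context. All dimensions, the projection matrix P, and the function neuron_forward are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not values from the paper).
d_in = 16      # node attribute dimensionality
d_ctx = 64     # original context dimensionality
d_red = 4      # reduced context dimensionality after random projection
n_proto = 8    # number of prototypes (one weight vector per prototype)

# Random projection that shrinks node contexts: prototypes only need to be
# stored in the reduced space, cutting the per-neuron memory footprint.
P = rng.standard_normal((d_red, d_ctx)) / np.sqrt(d_red)

# One BF-GCN-style neuron: a set of graph convolution filters (weight
# vectors) plus a set of prototypes in the reduced context space for gating.
prototypes = rng.standard_normal((n_proto, d_red))
weights = rng.standard_normal((n_proto, d_in))

def neuron_forward(x, context):
    """Gated forward pass of one neuron for a single node.

    x:       node attribute vector, shape (d_in,)
    context: node context vector, shape (d_ctx,)
    """
    z = P @ context                                  # randomized context reduction
    dists = np.linalg.norm(prototypes - z, axis=1)   # distance to each prototype
    k = int(np.argmin(dists))                        # gating: pick nearest prototype
    return weights[k] @ x                            # apply the selected filter

# Usage: one node with random attributes and context.
x = rng.standard_normal(d_in)
ctx = rng.standard_normal(d_ctx)
print(neuron_forward(x, ctx))
```

Because gating only compares the projected context against the prototypes, lowering d_red shrinks both the stored prototypes and the projection, which is the memory saving the paper studies.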