Reservoir-Based Graph Convolutional Networks

arXiv cs.LG / 2026/3/26


Key Points

  • The paper identifies key limitations of standard GCN message passing on complex or dynamic graph data, including the need for deeper layers to capture long-range dependencies and the resulting over-smoothing and higher compute costs.
  • It proposes RGC-Net, combining reservoir computing dynamics with a structured graph convolution mechanism that uses fixed random reservoir weights and a leaky integrator to better retain features during iterative propagation.
  • The method is presented as a robust approach for graph classification and is also extended into an RGC-Net-powered transformer for graph generation tasks.
  • Experiments indicate state-of-the-art performance across classification and generative settings, with faster convergence and reduced over-smoothing, including an application to predicting dynamic brain connectivity evolution.
  • The authors release their source code at https://github.com/basiralab/RGC-Net.
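The core mechanism described above — graph convolution with fixed random reservoir weights and a leaky integrator — can be sketched as follows. This is a hypothetical illustration based only on the abstract, not the authors' actual RGC-Net layer; the function name, dimensions, and hyperparameters (`alpha`, `hidden`, the spectral-radius scaling) are assumptions in the spirit of echo-state networks.

```python
import numpy as np

def reservoir_graph_conv(A, X, steps=5, hidden=16, alpha=0.5, seed=0):
    """Sketch of a reservoir-style graph convolution (illustrative only).

    A: (n, n) adjacency matrix, X: (n, d) node features.
    Reservoir weights are drawn once and never trained; a leaky integrator
    mixes the old state with the new aggregation at each propagation step:
        H <- (1 - alpha) * H + alpha * tanh(A_hat @ H @ W_res + X @ W_in)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Symmetric normalization A_hat = D^{-1/2} (A + I) D^{-1/2}, as in GCNs.
    A_tilde = A + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Fixed random reservoir weights, rescaled to spectral radius < 1
    # for stable iterative dynamics (standard echo-state practice).
    W_in = rng.normal(scale=0.5, size=(d, hidden))
    W_res = rng.normal(size=(hidden, hidden))
    W_res *= 0.9 / np.abs(np.linalg.eigvals(W_res)).max()

    H = np.zeros((n, hidden))
    for _ in range(steps):
        # Leaky integration retains part of the previous state, which
        # helps preserve node features across propagation steps.
        H = (1 - alpha) * H + alpha * np.tanh(A_hat @ H @ W_res + X @ W_in)
    return H
```

Because the state is a convex combination of its previous value and a `tanh` output, the embeddings stay bounded in (-1, 1) regardless of the number of propagation steps, which is one intuition for why such dynamics resist the degradation seen in very deep trained GCNs.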

Abstract

Message passing is a core mechanism in Graph Neural Networks (GNNs), enabling the iterative update of node embeddings by aggregating information from neighboring nodes. Graph Convolutional Networks (GCNs) exemplify this approach by adapting convolutional operations for graph structures, allowing features from adjacent nodes to be combined effectively. However, GCNs encounter challenges with complex or dynamic data. Capturing long-range dependencies often requires deeper layers, which not only increase computational costs but also lead to over-smoothing, where node embeddings become indistinguishable. To overcome these challenges, reservoir computing has been integrated into GNNs, leveraging iterative message-passing dynamics for stable information propagation without extensive parameter tuning. Despite its promise, existing reservoir-based models lack structured convolutional mechanisms, limiting their ability to accurately aggregate multi-hop neighborhood information. To address these limitations, we propose RGC-Net (Reservoir-based Graph Convolutional Network), which integrates reservoir dynamics with structured graph convolution. Key contributions include: (i) a reimagined convolutional framework with fixed random reservoir weights and a leaky integrator to enhance feature retention; (ii) a robust, adaptable model for graph classification; and (iii) an RGC-Net-powered transformer for graph generation with application to dynamic brain connectivity. Extensive experiments show that RGC-Net achieves state-of-the-art performance in classification and generative tasks, including brain graph evolution, with faster convergence and reduced over-smoothing. Source code is available at https://github.com/basiralab/RGC-Net.
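The over-smoothing problem the abstract refers to is easy to demonstrate numerically: repeatedly multiplying node features by a normalized adjacency matrix (the linear core of GCN message passing) drives the embeddings of connected nodes toward one another. The sketch below is an assumption-laden illustration of this general phenomenon, not an experiment from the paper; the function name and the choice of a ring graph are mine.

```python
import numpy as np

def embedding_spread_after_k_hops(A, X, k):
    """Propagate features k times with the GCN-style normalized adjacency
    D^{-1/2} (A + I) D^{-1/2} and return the mean pairwise distance between
    node embeddings. On a connected regular graph this spread shrinks
    toward zero as k grows: the over-smoothing effect."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    H = X.copy()
    for _ in range(k):
        H = A_hat @ H  # one hop of (linear) message passing

    # Mean Euclidean distance over all node pairs.
    diffs = H[:, None, :] - H[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1)).mean()
```

On a 6-node ring graph with random features, a single hop leaves the embeddings clearly distinct, while fifty hops collapses them to near-identical vectors; deeper stacks of such propagation are exactly the regime where the reservoir's leaky integrator is claimed to help retain features.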