Graph Rewiring in GNNs to Mitigate Over-Squashing and Over-Smoothing: A Survey

arXiv cs.LG / 5/5/2026


Key Points

  • Graph Neural Networks (GNNs) can suffer from over-squashing (distant information being overly compressed) and over-smoothing (node embeddings becoming indistinguishable) due to the interplay between message passing and graph topology.
  • The survey focuses on “graph rewiring” techniques that modify the graph structure to improve information propagation in GNNs.
  • It reviews state-of-the-art rewiring approaches, covering their theoretical motivations as well as practical implementation details.
  • The article emphasizes performance trade-offs: rewiring methods target different aspects of information flow, and each introduces its own computational costs and limitations.
  • Overall, the work positions graph rewiring as a practical strategy for mitigating core propagation bottlenecks in GNNs.
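To make the idea concrete, here is a minimal sketch of one family of rewiring heuristics such surveys typically cover: adding an edge between the non-adjacent node pair with the highest effective resistance, a quantity often used as a proxy for bottlenecked long-range communication. This is an illustrative example only, not the specific algorithm of any method reviewed in the survey; the function name and the toy graph are assumptions made for demonstration.

```python
import itertools
import numpy as np

def effective_resistance_rewire(adj):
    """Illustrative rewiring step (not a specific published method):
    add one edge between the non-adjacent node pair with the highest
    effective resistance, a common proxy for over-squashed paths."""
    n = adj.shape[0]
    lap = np.diag(adj.sum(axis=1)) - adj          # graph Laplacian L = D - A
    lap_pinv = np.linalg.pinv(lap)                # Moore-Penrose pseudoinverse
    best_pair, best_res = None, -1.0
    for u, v in itertools.combinations(range(n), 2):
        if adj[u, v]:                             # skip pairs already connected
            continue
        # Effective resistance: R_uv = L+_uu + L+_vv - 2 L+_uv
        res = lap_pinv[u, u] + lap_pinv[v, v] - 2 * lap_pinv[u, v]
        if res > best_res:
            best_pair, best_res = (u, v), res
    rewired = adj.copy()
    u, v = best_pair
    rewired[u, v] = rewired[v, u] = 1             # insert the shortcut edge
    return rewired, best_pair, best_res

# Toy bottleneck: two triangles joined by a single bridge edge (2-3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
A = np.zeros((6, 6))
for u, v in edges:
    A[u, v] = A[v, u] = 1

rewired, pair, res = effective_resistance_rewire(A)
print(pair)  # a pair spanning the bridge, e.g. node 0 or 1 to node 4 or 5
```

On this graph the heuristic adds a shortcut across the bridge, directly shortening the paths most affected by the bottleneck; practical methods differ in which structural signal they optimize (curvature, spectral gap, resistance) and in how many edges they add or remove.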

Abstract

Graph Neural Networks are powerful models for learning from graph-structured data, yet their effectiveness is often limited by two critical challenges: over-squashing, where information from distant nodes is excessively compressed, and over-smoothing, where repeated propagation makes node representations indistinguishable. Both phenomena stem from the interaction between message passing and the input topology, ultimately degrading information flow and limiting the performance of GNNs. In this survey, we examine graph rewiring techniques, a class of methods designed to modify the graph topology to enhance information propagation in GNNs. We provide a comprehensive review of state-of-the-art rewiring approaches, delving into their theoretical underpinnings, practical implementations, and performance trade-offs.