On the Complexity of Optimal Graph Rewiring for Oversmoothing and Oversquashing in Graph Neural Networks

arXiv cs.AI · March 30, 2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The paper analyzes how graph neural networks (GNNs) suffer from oversmoothing and oversquashing in deep settings, attributing both to properties of the underlying graph structure.
  • It formalizes mitigation as graph topology optimization problems: the oversmoothing objective is defined in terms of the spectral gap, while the oversquashing objective is defined in terms of conductance.
  • The authors prove that exact optimization for either mitigation objective is NP-hard, and the decision versions are NP-complete via reductions from Minimum Bisection.
  • These findings establish theoretical limits on graph rewiring for GNN optimization, justifying the use of approximation algorithms and heuristics over exact solutions in practice.
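Both quantities in the second bullet can be computed directly from an adjacency matrix. A minimal sketch with NumPy (the bridged-triangles graph and the chosen cut are illustrative examples, not taken from the paper):

```python
import numpy as np

def spectral_gap(adj):
    """Second-smallest eigenvalue of the Laplacian L = D - A
    (algebraic connectivity); larger values mean faster mixing."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))[1]

def conductance(adj, subset):
    """phi(S) = cut(S, V \\ S) / min(vol(S), vol(V \\ S))."""
    mask = np.zeros(adj.shape[0], dtype=bool)
    mask[list(subset)] = True
    cut = adj[mask][:, ~mask].sum()
    return cut / min(adj[mask].sum(), adj[~mask].sum())

# Two triangles joined by a single bridge edge -- a textbook bottleneck.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0

print(spectral_gap(A))            # small gap: slow mixing across the bridge
print(conductance(A, {0, 1, 2}))  # → 1/7: one cut edge, min volume 7
```

A low spectral gap and a low-conductance cut co-occur here because both are symptoms of the same bridge bottleneck, which is exactly why the paper studies them as paired optimization targets.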

Abstract

Graph Neural Networks (GNNs) face two fundamental challenges when scaled to deep architectures: oversmoothing, where node representations converge to indistinguishable vectors, and oversquashing, where information from distant nodes fails to propagate through bottlenecks. Both phenomena are intimately tied to the underlying graph structure, raising a natural question: can we optimize the graph topology to mitigate these issues? This paper provides a theoretical investigation of the computational complexity of such graph structure optimization. We formulate oversmoothing and oversquashing mitigation as graph optimization problems based on spectral gap and conductance, respectively. We prove that exact optimization for either problem is NP-hard through reductions from Minimum Bisection, establishing NP-completeness of the decision versions. Our results provide theoretical foundations for understanding the fundamental limits of graph rewiring for GNN optimization and justify the use of approximation algorithms and heuristic methods in practice.
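Since exact topology optimization is NP-hard, practice falls back on polynomial-time heuristics of the kind the abstract alludes to. A minimal sketch of one such heuristic, greedy edge addition to maximize the spectral gap (the function names and test graph are illustrative assumptions, not the paper's algorithm):

```python
import itertools
import numpy as np

def spectral_gap(adj):
    """Second-smallest eigenvalue of the Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))[1]

def greedy_rewire(adj, budget=1):
    """Add up to `budget` edges, each time picking the single non-edge
    whose addition most increases the spectral gap. Polynomial time
    (O(n^2) eigendecompositions per added edge) but with no optimality
    guarantee, consistent with the NP-hardness of the exact problem."""
    adj = adj.copy()
    n = adj.shape[0]
    for _ in range(budget):
        base, best = spectral_gap(adj), None
        for u, v in itertools.combinations(range(n), 2):
            if adj[u, v]:
                continue
            adj[u, v] = adj[v, u] = 1.0     # tentatively add the edge
            if spectral_gap(adj) > base + 1e-12:
                base, best = spectral_gap(adj), (u, v)
            adj[u, v] = adj[v, u] = 0.0     # undo the trial edge
        if best is None:
            return adj                       # no improving edge left
        adj[best[0], best[1]] = adj[best[1], best[0]] = 1.0
    return adj

# On a bottlenecked graph (two triangles plus a bridge), one greedy
# edge widens the gap.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0
B = greedy_rewire(A, budget=1)
print(spectral_gap(A), spectral_gap(B))
```

The brute-force inner loop over all non-edges is what an exact method would have to reason about jointly across the whole budget; the greedy one-edge-at-a-time relaxation is the trade the hardness result motivates.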