On the Complexity of Optimal Graph Rewiring for Oversmoothing and Oversquashing in Graph Neural Networks
arXiv cs.AI / 3/30/2026
Key Points
- The paper analyzes how graph neural networks (GNNs) suffer from oversmoothing and oversquashing in deep settings, attributing both to properties of the underlying graph structure.
- It formalizes mitigation as graph topology optimization problems: reducing oversmoothing is tied to the graph's spectral gap, while reducing oversquashing is tied to its conductance.
- The authors prove that exact optimization for either mitigation objective is NP-hard, and the decision versions are NP-complete via reductions from Minimum Bisection.
- These findings establish theoretical limits on using graph rewiring to optimize GNN performance, supporting reliance on approximation algorithms and heuristics rather than exact solutions.
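The two quantities the rewiring objectives target can be made concrete with a small sketch, assuming NetworkX and NumPy; the `spectral_gap` helper, the toy ring graph, and the candidate cut are illustrative choices for this note, not constructions from the paper:

```python
# Illustrative sketch (not the paper's method): evaluating the spectral gap
# and the conductance of a cut, before and after one rewiring step.
import networkx as nx
import numpy as np

def spectral_gap(G: nx.Graph) -> float:
    """Algebraic connectivity lambda_2 of the combinatorial Laplacian.

    For a connected graph lambda_1 = 0, so lambda_2 is the spectral gap;
    a larger gap means faster mixing of diffused features.
    """
    A = nx.to_numpy_array(G)
    L = np.diag(A.sum(axis=1)) - A          # Laplacian L = D - A
    return float(np.sort(np.linalg.eigvalsh(L))[1])

G = nx.cycle_graph(8)        # a sparsely connected ring: small spectral gap
H = G.copy()
H.add_edge(0, 4)             # one rewiring step: add a "shortcut" chord
S = {0, 1, 2, 3}             # one side of a candidate cut (S, V \ S)

# Adding an edge cannot decrease Laplacian eigenvalues (eigenvalue
# interlacing), so the spectral gap is non-decreasing under this rewiring.
print(spectral_gap(G), spectral_gap(H))

# Conductance of the cut: crossing edges over the smaller side's volume.
print(nx.conductance(G, S), nx.conductance(H, S))   # 0.25 -> 1/3
```

Sketches like this only evaluate a single candidate rewiring; the paper's hardness results concern searching over all rewirings to optimize either quantity exactly, which is why approximation algorithms and heuristics are the practical route.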