Cross-attentive Cohesive Subgraph Embedding to Mitigate Oversquashing in GNNs

arXiv cs.LG / 3/31/2026


Key Points

  • The paper addresses oversquashing in graph neural networks, where long-range information gets overly compressed through limited message-passing pathways and global context is distorted.
  • It proposes a new learning framework that enriches node embeddings using cross-attentive cohesive subgraph representations to emphasize useful long-range structure while filtering noisy or irrelevant connections.
  • The method aims to preserve essential global context without overloading the network’s bottlenecked channels, thereby directly mitigating oversquashing.
  • Experiments on multiple benchmark datasets show consistent improvements in node classification accuracy compared with standard baseline approaches.

Abstract

Graph neural networks (GNNs) have achieved strong performance across various real-world domains. Nevertheless, they suffer from oversquashing, where long-range information is distorted as it is compressed through limited message-passing pathways. This bottleneck limits their ability to capture essential global context and degrades their performance, particularly in dense and heterophilic regions of graphs. To address this issue, we propose a novel graph learning framework that enriches node embeddings via cross-attentive cohesive subgraph representations, mitigating the impact of excessive long-range dependencies. The framework enhances node representations by emphasizing cohesive structure in long-range information while removing noisy or irrelevant connections. It preserves essential global context without overloading the narrow bottlenecked channels, further mitigating oversquashing. Extensive experiments on multiple benchmark datasets demonstrate that our model achieves consistent improvements in classification accuracy over standard baseline methods.
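To make the core idea concrete, the enrichment step described above can be sketched as a cross-attention layer in which node embeddings act as queries and cohesive-subgraph embeddings act as keys and values. This is an illustrative sketch only, not the paper's actual architecture: the function name `cross_attentive_enrichment`, the projection matrices `Wq`/`Wk`/`Wv`, and the residual fusion are assumptions for demonstration, and a real implementation would learn these projections and derive the subgraph embeddings from mined cohesive subgraphs.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attentive_enrichment(H, S, Wq, Wk, Wv):
    """Enrich node embeddings with subgraph-level context (sketch).

    H : (n, d) node embeddings (queries)
    S : (k, d) cohesive-subgraph embeddings (keys/values)
    Wq, Wk, Wv : (d, d) hypothetical learned projections
    """
    Q = H @ Wq                                   # (n, d) node queries
    K = S @ Wk                                   # (k, d) subgraph keys
    V = S @ Wv                                   # (k, d) subgraph values
    # Scaled dot-product attention: each node attends over the k subgraphs,
    # so global context reaches it in one hop instead of many message-passing steps.
    A = softmax(Q @ K.T / np.sqrt(Q.shape[1]), axis=-1)  # (n, k)
    # Residual fusion: original local embedding plus attended global context.
    return H + A @ V

rng = np.random.default_rng(0)
n, k, d = 5, 3, 8
H = rng.normal(size=(n, d))
S = rng.normal(size=(k, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Z = cross_attentive_enrichment(H, S, Wq, Wk, Wv)
print(Z.shape)
```

Because every node attends directly to a small set of subgraph summaries, long-range information bypasses the narrow multi-hop channels that cause oversquashing, while the softmax weighting lets the model downweight irrelevant subgraphs.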