Layer Embedding Deep Fusion Graph Neural Network

arXiv cs.LG / April 28, 2026

📰 News · Models & Research

Key Points

  • The paper highlights key limitations of standard Graph Neural Networks (GNNs), including reduced applicability in low-homophily (heterophilic) graphs and difficulty capturing long-range dependencies due to hierarchical diffusion-style message passing.
  • It proposes LEDF-GNN (Layer Embedding Deep Fusion GNN), introducing a Layer Embedding Deep Fusion (LEDF) operator that nonlinearly fuses multi-layer embeddings to better model inter-layer dependencies and reduce deep propagation degradation.
  • To address structural heterophily, the method uses a Dual-Topology Parallel Strategy (DTPS), which jointly leverages the original and reconstructed graph topologies for adaptive structure-semantics co-optimization.
  • Semi-supervised classification experiments on citation and image benchmarks show LEDF-GNN outperforming state-of-the-art baselines across both homophilic and heterophilic scenarios, indicating strong generalization.
  • Overall, the work combines representation fusion with topology-aware co-optimization to target the over-smoothing and misaggregation problems that worsen with increasing GNN depth on heterophilic graphs.
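To make the two components concrete, the following is a minimal, illustrative sketch of the LEDF idea as the key points describe it: run several diffusion-style propagation steps, retain every layer's embedding, and fuse them nonlinearly instead of keeping only the last layer. The fusion form here (a tanh-gated weighted sum with uniform weights) and all function names are assumptions for illustration; the paper's exact operator and its learnable parameters are not reproduced in this summary.

```python
import math

def mean_aggregate(adj, h):
    """One mean message-passing step (plain diffusion-style propagation)."""
    out = []
    for i, nbrs in enumerate(adj):
        agg = [0.0] * len(h[0])
        for j in nbrs:
            for d, x in enumerate(h[j]):
                agg[d] += x
        out.append([x / max(len(nbrs), 1) for x in agg])
    return out

def ledf_fuse(layer_embs, weights):
    """Nonlinear fusion of per-layer embeddings: a tanh-gated weighted sum.
    Illustrative stand-in for the paper's LEDF operator, whose exact form
    is not given in this summary."""
    n, d = len(layer_embs[0]), len(layer_embs[0][0])
    fused = [[0.0] * d for _ in range(n)]
    for w, h in zip(weights, layer_embs):
        for i in range(n):
            for c in range(d):
                fused[i][c] += w * math.tanh(h[i][c])
    return fused

def ledf_gnn_forward(adj, feats, depth=3):
    """Run `depth` propagation steps, keep every layer's embedding, fuse."""
    layers = [feats]
    h = feats
    for _ in range(depth):
        h = mean_aggregate(adj, h)
        layers.append(h)
    weights = [1.0 / len(layers)] * len(layers)  # uniform here; learnable in practice
    return ledf_fuse(layers, weights)
```

Because every intermediate layer contributes to the fused output, shallow (local) and deep (long-range) signals both survive, which is the mechanism the key points credit with reducing deep propagation degradation.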

Abstract

Graph Neural Networks (GNNs) have demonstrated impressive performance in learning representations from graph-structured data. However, their message-passing mechanism inherently relies on the assumption of label consistency among connected nodes, limiting their applicability in low-homophily settings. Moreover, since message passing operates as a hierarchical diffusion process, GNNs face challenges in capturing long-range dependencies. As network depth increases, structural noise along heterophilic edges tends to be amplified, resulting in over-smoothing. This issue becomes especially prominent in highly heterophilic graphs, where the propagation of inconsistent semantics across the topology continually exacerbates misaggregation. To address these issues, we propose a novel framework named Layer Embedding Deep Fusion Graph Neural Network (LEDF-GNN). Specifically, we design a Layer Embedding Deep Fusion (LEDF) operator that nonlinearly fuses multi-layer embeddings to capture inter-layer dependencies and effectively alleviate deep propagation degradation. Meanwhile, to mitigate structural heterophily, LEDF-GNN employs a Dual-Topology Parallel Strategy (DTPS) that simultaneously leverages the original and reconstructed topologies, allowing for adaptive structure-semantics co-optimization under diverse homophily conditions. Extensive semi-supervised classification experiments on citation and image benchmarks demonstrate that, under both homophilic and heterophilic settings, LEDF-GNN consistently outperforms state-of-the-art baselines, validating its effectiveness and generalization capability across diverse graph types.
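The Dual-Topology Parallel Strategy described in the abstract can be sketched as follows: aggregate over the original edge structure and, in parallel, over a topology reconstructed from node features, then mix the two branches. The reconstruction used here (cosine-similarity k-NN) and the fixed mixing weight `alpha` are assumptions chosen for illustration; the paper's reconstruction scheme and its adaptive co-optimization may differ.

```python
import math

def mean_aggregate(adj, h):
    """One mean message-passing step over a given adjacency list."""
    out = []
    for i, nbrs in enumerate(adj):
        agg = [0.0] * len(h[0])
        for j in nbrs:
            for d, x in enumerate(h[j]):
                agg[d] += x
        out.append([x / max(len(nbrs), 1) for x in agg])
    return out

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / (den or 1.0)

def knn_topology(feats, k=2):
    """Reconstruct a topology by linking each node to its k most
    feature-similar nodes -- one plausible reconstruction; the paper's
    own scheme may differ."""
    adj = []
    for i, fi in enumerate(feats):
        ranked = sorted(
            (j for j in range(len(feats)) if j != i),
            key=lambda j: -cosine(fi, feats[j]),
        )
        adj.append(ranked[:k])
    return adj

def dtps_step(orig_adj, feats, alpha=0.5, k=2):
    """Aggregate over both topologies in parallel, mix with weight alpha.
    `alpha` is fixed here; in the paper the structure-semantics balance
    is co-optimized adaptively."""
    recon_adj = knn_topology(feats, k)
    h_struct = mean_aggregate(orig_adj, feats)   # original topology branch
    h_sem = mean_aggregate(recon_adj, feats)     # reconstructed branch
    return [
        [alpha * a + (1 - alpha) * b for a, b in zip(r1, r2)]
        for r1, r2 in zip(h_struct, h_sem)
    ]
```

On a heterophilic graph, the reconstructed branch lets feature-similar but unconnected nodes exchange messages, which is how a dual-topology design can compensate for misleading original edges.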