DIB-OD: Preserving the Invariant Core for Robust Heterogeneous Graph Adaptation via Decoupled Information Bottleneck and Online Distillation

arXiv cs.LG / 2026-04-14


Key Points

  • The paper introduces DIB-OD, a new framework for robust adaptation of Graph Neural Networks (GNNs) across heterogeneous domains that suffer from severe distribution shifts.
  • DIB-OD explicitly decomposes learned representations into orthogonal invariant (task-relevant, domain-transcending) and redundant (domain-specific noise) subspaces to reduce negative transfer and catastrophic forgetting.
  • It uses an Information Bottleneck teacher–student distillation setup and the Hilbert–Schmidt Independence Criterion (HSIC) to isolate and preserve a stable invariant core during transfer (a minimal sketch follows this list).
  • A self-adaptive semantic regularizer is proposed to protect this core from corruption in the target domain by dynamically gating the influence of labels based on predictive confidence (see the gating sketch after the abstract).
  • Experiments on chemical, biological, and social network datasets show DIB-OD outperforming prior methods, especially on difficult inter-type domain transfers, with improved generalization and anti-forgetting behavior.
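
To make the first two points concrete, below is a minimal PyTorch sketch of the decoupling idea: a shared embedding split by two projection heads into invariant and redundant subspaces, with a biased Gaussian-kernel HSIC estimator as the independence penalty between them. The head design, kernel choice, and penalty weighting are illustrative assumptions, not the paper's exact implementation.

```python
import torch

def rbf_kernel(x, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
    d2 = torch.cdist(x, x).pow(2)
    return torch.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2, where H
    # centers the kernel matrices. The value approaches zero when the two
    # batches of features are statistically independent.
    n = x.size(0)
    K, L = rbf_kernel(x, sigma), rbf_kernel(y, sigma)
    H = torch.eye(n, device=x.device) - torch.full((n, n), 1.0 / n, device=x.device)
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

class DecoupledHeads(torch.nn.Module):
    # Two linear projections split a shared GNN embedding into an
    # invariant and a redundant subspace (hypothetical head design).
    def __init__(self, dim, sub_dim):
        super().__init__()
        self.invariant = torch.nn.Linear(dim, sub_dim)
        self.redundant = torch.nn.Linear(dim, sub_dim)

    def forward(self, h):
        return self.invariant(h), self.redundant(h)

# Illustrative fragment of a training step: only z_inv would feed the task
# head, while the HSIC term pushes the two subspaces toward independence.
heads = DecoupledHeads(dim=64, sub_dim=32)
h = torch.randn(128, 64)                   # stand-in for GNN node/graph embeddings
z_inv, z_red = heads(h)
independence_penalty = hsic(z_inv, z_red)  # add beta * penalty to the task loss
```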

Abstract

Graph Neural Network pretraining is pivotal for leveraging unlabeled graph data. However, generalizing across heterogeneous domains remains a major challenge due to severe distribution shifts. Existing methods focus primarily on intra-domain patterns and fail to disentangle task-relevant invariant knowledge from domain-specific redundant noise, which leads to negative transfer and catastrophic forgetting. To this end, we propose DIB-OD, a novel framework that preserves the invariant core for robust heterogeneous graph adaptation through a Decoupled Information Bottleneck and Online Distillation. Our core innovation is the explicit decomposition of representations into orthogonal invariant and redundant subspaces. By combining an Information Bottleneck teacher–student distillation mechanism with the Hilbert–Schmidt Independence Criterion, we isolate a stable invariant core that transcends domain boundaries. Furthermore, a self-adaptive semantic regularizer protects this core from corruption during target-domain adaptation by dynamically gating label influence based on predictive confidence. Extensive experiments across chemical, biological, and social network domains demonstrate that DIB-OD significantly outperforms state-of-the-art methods, particularly on challenging inter-type domain transfers, showing superior generalization and anti-forgetting performance.
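
To ground the two adaptation-time mechanisms the abstract describes, here is a minimal PyTorch sketch assuming a max-softmax confidence gate with threshold tau and a standard temperature-scaled KL distillation term; both are illustrative stand-ins rather than the paper's exact losses.

```python
import torch
import torch.nn.functional as F

def confidence_gated_loss(logits, labels, tau=0.9):
    # Per-sample cross-entropy on target-domain labels, weighted by the
    # model's own predictive confidence (max softmax probability).
    # Hypothetical gate: samples below the threshold tau are masked out,
    # so noisy or uncertain labels exert less pull on the invariant core.
    confidence = F.softmax(logits, dim=-1).max(dim=-1).values.detach()
    gate = (confidence >= tau).float() * confidence
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    return (gate * per_sample).sum() / gate.sum().clamp(min=1.0)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Standard temperature-scaled KL distillation: keeps the student's
    # predictions close to the teacher's, one common way to realize
    # teacher-student distillation; the paper's exact objective may differ.
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

# Usage with toy tensors: combine both terms during target-domain adaptation.
logits = torch.randn(16, 5, requires_grad=True)
teacher_logits = torch.randn(16, 5)
labels = torch.randint(0, 5, (16,))
loss = confidence_gated_loss(logits, labels) + distillation_loss(logits, teacher_logits)
```

The detached gate keeps the confidence weighting from feeding gradients back into the classifier, so low-confidence target samples simply contribute less signal instead of being pushed toward their (possibly noisy) labels.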