DIB-OD: Preserving the Invariant Core for Robust Heterogeneous Graph Adaptation via Decoupled Information Bottleneck and Online Distillation
arXiv cs.LG / 2026/4/14
📰 News · Ideas & Deep Analysis · Models & Research
Key Points
- The paper introduces DIB-OD, a new framework for robust adaptation of Graph Neural Networks (GNNs) across heterogeneous domains that suffer from severe distribution shifts.
- DIB-OD explicitly decomposes learned representations into orthogonal invariant (task-relevant, domain-transcending) and redundant (domain-specific noise) subspaces to reduce negative transfer and catastrophic forgetting.
- It uses an Information Bottleneck teacher–student distillation setup and the Hilbert–Schmidt Independence Criterion (HSIC) to isolate and preserve a stable invariant core during transfer (a minimal HSIC sketch follows this list).
- A self-adaptive semantic regularizer is proposed to prevent corruption of the invariant core in the target domain by dynamically gating the influence of labels according to predictive confidence (see the gating sketch after the list).
- Experiments on chemical, biological, and social network datasets show DIB-OD outperforming prior methods, especially on difficult inter-type domain transfers, with improved generalization and anti-forgetting behavior.
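
The independence constraint mentioned above can be illustrated with a small HSIC penalty between the two subspaces. This is a minimal PyTorch sketch, not the authors' code: the RBF kernel, its bandwidth, and the tensor shapes are assumptions made for the example.

```python
import torch

def rbf_kernel(x: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Pairwise squared Euclidean distances -> RBF Gram matrix.
    sq_dists = torch.cdist(x, x, p=2) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def hsic(z_inv: torch.Tensor, z_red: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Biased HSIC estimator: trace(K H L H) / (n - 1)^2, where H centers the Gram matrices.
    n = z_inv.shape[0]
    k = rbf_kernel(z_inv, sigma)
    l = rbf_kernel(z_red, sigma)
    h = torch.eye(n, device=z_inv.device) - torch.full((n, n), 1.0 / n, device=z_inv.device)
    return torch.trace(k @ h @ l @ h) / (n - 1) ** 2

# Usage: add a weighted hsic(z_invariant, z_redundant) term to the training loss
# so the invariant and redundant subspaces carry non-overlapping information.
z_inv, z_red = torch.randn(32, 64), torch.randn(32, 64)
loss_independence = hsic(z_inv, z_red)
```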

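The confidence-gated semantic regularizer can likewise be sketched as a per-sample weight on the target-domain label loss. All function and variable names here are illustrative assumptions, and the max-softmax confidence gate is one plausible reading of "predictive confidence", not a detail confirmed by the paper.

```python
import torch
import torch.nn.functional as F

def confidence_gated_loss(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # Per-sample confidence = max softmax probability; detached so the model
    # cannot reduce its loss simply by inflating its own confidence.
    confidence = F.softmax(logits, dim=-1).max(dim=-1).values.detach()
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    return (confidence * per_sample).mean()

# Usage: target-domain (pseudo-)labels contribute in proportion to confidence,
# so low-confidence samples have little influence on the preserved invariant core.
logits = torch.randn(32, 5)
labels = torch.randint(0, 5, (32,))
loss_target = confidence_gated_loss(logits, labels)
```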