Analytic Drift Resister for Non-Exemplar Continual Graph Learning
arXiv cs.AI / 4/6/2026
Key Points
- Non-Exemplar Continual Graph Learning (NECGL) avoids privacy risks by storing only class-level prototypes instead of raw samples, but those stored prototypes suffer feature drift as the model is updated, which degrades continual learning performance.
- The paper proposes Analytic Drift Resister (ADR), a theoretically grounded NECGL framework that updates the backbone via iterative backpropagation rather than relying on a frozen pre-trained model, thereby improving model plasticity.
- To counter the drift caused by these parameter updates, it introduces Hierarchical Analytic Merging (HAM), which merges the layer-wise linear transformations of the GNN via ridge regression.
- The framework further adds Analytic Classifier Reconstruction (ACR) to enable theoretically zero-forgetting class-incremental learning.
- Experiments on four node classification benchmarks show ADR remains competitive with existing state-of-the-art approaches.
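The "analytic" ingredient behind components like ACR is closed-form ridge regression, whose sufficient statistics can be accumulated task by task so that the incremental solution exactly matches joint training. The sketch below illustrates that zero-forgetting property in isolation; it is not the paper's actual algorithm, and the dimensions, regularization strength, and random stand-in features are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, lam = 16, 1e-1  # feature dimension and ridge strength (illustrative)

def ridge_weights(X, Y, lam):
    """Closed-form ridge regression: W = (X^T X + lam*I)^{-1} X^T Y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# Two "tasks" of data (random stand-ins for GNN node embeddings and targets).
X1, Y1 = rng.normal(size=(40, d)), rng.normal(size=(40, 3))
X2, Y2 = rng.normal(size=(30, d)), rng.normal(size=(30, 3))

# Joint solution computed on all data at once.
W_joint = ridge_weights(np.vstack([X1, X2]), np.vstack([Y1, Y2]), lam)

# Incremental solution: keep only the regularized Gram matrix and the
# cross-covariance, never the raw task-1 samples (the non-exemplar setting).
A = X1.T @ X1 + lam * np.eye(d)  # accumulated X^T X + lam*I
B = X1.T @ Y1                    # accumulated X^T Y
A += X2.T @ X2                   # fold in task 2 without revisiting task 1
B += X2.T @ Y2
W_inc = np.linalg.solve(A, B)

print(np.allclose(W_joint, W_inc))  # True: incremental equals joint training
```

Because `X.T @ X` and `X.T @ Y` are sums over samples, updating them per task and re-solving reproduces the joint solution exactly, which is the sense in which analytic classifiers can claim zero forgetting.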