
Prototypical Exemplar Condensation for Memory-efficient Online Continual Learning

arXiv cs.LG / 3/17/2026


Key Points

  • The paper proposes memory-efficient rehearsal-based continual learning by synthesizing prototypical exemplars that represent past data when passed through the feature extractor, reducing per-class storage demands.
  • It introduces a perturbation-based augmentation mechanism to generate synthetic variants of previous data during training, improving continual learning performance.
  • Unlike traditional coreset methods, the approach achieves strong performance with far fewer samples per class, aiding privacy by avoiding raw data retention.
  • Experiments on widely used benchmarks show the method scales well to large datasets and long task sequences.
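The condensation idea in the first bullet can be sketched as a small optimization problem. The paper's exact objective is not given in this summary, so the sketch below assumes a simple feature-matching loss: a synthetic exemplar is optimized so that, when passed through a (frozen) feature extractor, it lands on the class prototype, i.e. the mean feature of the real class data. The linear extractor, dimensions, and loss are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_FEAT = 32, 8
W = rng.normal(size=(D_FEAT, D_IN))  # frozen linear "feature extractor" (assumption)

def features(x):
    return W @ x

real_data = rng.normal(size=(100, D_IN))          # stand-in for one class's data
prototype = features(real_data.T).mean(axis=1)    # class mean in feature space

# Condense: gradient descent on ||f(x_syn) - prototype||^2 w.r.t. the exemplar.
x_syn = rng.normal(size=D_IN)
lr = 0.01
for _ in range(500):
    grad = 2 * W.T @ (features(x_syn) - prototype)  # gradient of the squared error
    x_syn -= lr * grad

residual = float(np.linalg.norm(features(x_syn) - prototype))
print(f"feature-space residual: {residual:.4f}")
```

Only the condensed exemplar `x_syn` would be stored in memory, not the raw class data, which is where the per-class storage saving and the privacy benefit come from.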

Abstract

Rehearsal-based continual learning (CL) mitigates catastrophic forgetting by maintaining a subset of samples from previous tasks for replay. Existing studies primarily focus on optimizing memory storage through coreset selection strategies. While these methods are effective, they typically require storing a substantial number of samples per class (SPC), often exceeding 20, to maintain satisfactory performance. In this work, we propose to further compress the memory footprint by synthesizing and storing prototypical exemplars, which can form representative prototypes when passed through the feature extractor. Owing to their representative nature, these exemplars enable the model to retain previous knowledge using only a small number of samples while preserving privacy. Moreover, we introduce a perturbation-based augmentation mechanism that generates synthetic variants of previous data during training, thereby enhancing CL performance. Extensive evaluations on widely used benchmark datasets and settings demonstrate that the proposed algorithm achieves superior performance compared to existing baselines, particularly in scenarios involving large-scale datasets and a high number of tasks.
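The perturbation-based augmentation mentioned in the abstract is not specified in detail here; a minimal sketch, assuming the simplest variant (adding small bounded random noise to each stored exemplar to generate synthetic replay variants), might look like the following. The function name, variant count, and noise bound are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def perturb_exemplars(exemplars, n_variants=4, eps=0.05, rng=rng):
    """Return `n_variants` noisy copies of each stored exemplar (hypothetical helper)."""
    reps = np.repeat(exemplars, n_variants, axis=0)        # tile each exemplar
    noise = rng.uniform(-eps, eps, size=reps.shape)        # bounded perturbation
    return reps + noise

stored = rng.normal(size=(5, 32))   # e.g. 5 condensed exemplars for one class
augmented = perturb_exemplars(stored)
print(augmented.shape)              # (20, 32)
```

During replay, the model would train on these perturbed variants alongside current-task data, effectively enlarging the tiny rehearsal buffer.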