Pruned Adaptation Modules: A Simple yet Strong Baseline for Continual Foundation Models

arXiv cs.LG / 3/24/2026


Key Points

  • The paper argues that foundation-model-based class-incremental learning (FM-based CIL) has advanced without clear comparisons to strong, lightweight baselines, making it hard to judge whether reported improvements are genuine.
  • It proposes Pruned Adaptation Modules (PAM), which freezes most of a pre-trained ResNet and adds sparse, task-specific adapter layers for continual adaptation.
  • PAM reduces the number of trainable parameters by up to ~5× and the total parameter footprint by up to ~6×, lowering the compute and update cost for continual learning.
  • Experiments across multiple benchmarks show PAM mitigates catastrophic forgetting and outperforms existing state-of-the-art FM-based CIL methods.
  • The authors present PAM as a simple, transparent baseline to bridge the evaluation gap between traditional CIL and foundation-model-based approaches, with code released on GitHub.
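
The core mechanism in the bullets above can be sketched as: keep the pre-trained backbone weights frozen, train only a small task-specific adapter, and magnitude-prune the adapter to a target sparsity so each task adds very few effective parameters. This is a minimal illustrative sketch, not the paper's exact procedure; the function names, the residual combination, and the magnitude-pruning criterion are assumptions for illustration.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries, keeping a (1 - sparsity) fraction.
    Illustrative stand-in for a sparse, task-specific adaptation module."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of entries to zero out
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold     # keep only strictly larger magnitudes
    return weights * mask, mask

rng = np.random.default_rng(0)
frozen_backbone = rng.normal(size=(64, 64))  # pre-trained weights: never updated
adapter = 0.01 * rng.normal(size=(64, 64))   # small trainable task-specific module

# Prune the adapter to 80% sparsity, so only ~20% of its weights remain active.
pruned_adapter, mask = magnitude_prune(adapter, sparsity=0.8)
density = mask.mean()

# Forward pass: frozen backbone plus a sparse residual adapter
# (hypothetical combination; the paper defines the actual architecture).
x = rng.normal(size=(1, 64))
out = x @ (frozen_backbone + pruned_adapter)
```

At inference for a given task, only that task's sparse mask and adapter values need to be stored alongside the shared frozen backbone, which is the source of the parameter-footprint reduction the paper reports.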

Abstract

The continual learning literature has rapidly shifted from traditional class incremental learning (CIL) techniques to foundation model (FM)-based CIL methods without a clear understanding of how these newer approaches compare to strong, lightweight convolutional baselines. This abrupt transition has created a substantial methodological gap, making it difficult to assess whether recent FM-based CIL progress reflects genuine advances or merely the absence of rigorous baselines. To address this gap, we introduce Pruned Adaptation Modules (PAM), a simple yet effective method that freezes the vast majority of the pre-trained ResNet while enabling scalable continual adaptation through sparse task-specific layers. PAM yields up to a ~5x reduction in trainable parameters and a ~6x reduction in total parameters, significantly reducing the cost of continual updates. Across diverse benchmarks, PAM consistently mitigates catastrophic forgetting and outperforms state-of-the-art FM-based CIL approaches. Our findings position PAM as a strong and transparent baseline that helps bridge the gap between traditional and FM-based CIL, guiding future research for a more accurate assessment of true progress in continual adaptation. The code can be found at: https://github.com/ElifCerenGokYildirim/PAM.