
CLeAN: Continual Learning Adaptive Normalization in Dynamic Environments

arXiv cs.LG / 3/19/2026


Key Points

  • The paper introduces Continual Learning Adaptive Normalization (CLeAN), a normalization technique designed for continual learning in tabular data to cope with changing data distributions.
  • CLeAN uses learnable global feature scales updated via an Exponential Moving Average to adapt normalization without access to the entire dataset.
  • The approach is evaluated on two datasets across several continual learning strategies, including Reservoir Experience Replay, A-GEM, and EWC, showing improved performance on new data and reduced catastrophic forgetting.
  • Findings highlight adaptive normalization as a key factor for stability and knowledge retention in evolving learning environments.

Abstract

Artificial intelligence systems predominantly rely on static data distributions, making them ineffective in dynamic real-world environments, such as cybersecurity, autonomous transportation, or finance, where data shifts frequently. Continual learning offers a potential solution by enabling models to learn from sequential data while retaining prior knowledge. However, a critical and underexplored issue in this domain is data normalization. Conventional normalization methods, such as min-max scaling, presuppose access to the entire dataset, which is incongruent with the sequential nature of continual learning. In this paper, we introduce Continual Learning Adaptive Normalization (CLeAN), a novel adaptive normalization technique designed for continual learning in tabular data. CLeAN estimates global feature scales using learnable parameters that are updated via an Exponential Moving Average (EMA) module, enabling the model to adapt to evolving data distributions. Through comprehensive evaluations on two datasets and various continual learning strategies, including Reservoir Experience Replay, A-GEM, and EWC, we demonstrate that CLeAN not only improves model performance on new data but also mitigates catastrophic forgetting. The findings underscore the importance of adaptive normalization in enhancing the stability and effectiveness of continual learning on tabular data, offering a novel perspective on the use of normalization to preserve knowledge in dynamic learning environments.
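To make the core idea concrete, the EMA-based normalization described above can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the class name, the momentum value, and the use of min/max-style running scales are assumptions based on the abstract's description of min-max scaling adapted via an EMA.

```python
import numpy as np

class EMANormalizer:
    """Sketch of EMA-based adaptive min-max normalization for streaming
    tabular data. Hypothetical: parameter names and momentum are assumed,
    not taken from the CLeAN paper."""

    def __init__(self, momentum=0.01):
        self.momentum = momentum
        self.ema_min = None  # running per-feature minimum estimate
        self.ema_max = None  # running per-feature maximum estimate

    def update(self, batch):
        """Update running scale estimates from the current batch only,
        without access to the full dataset."""
        batch_min = batch.min(axis=0)
        batch_max = batch.max(axis=0)
        if self.ema_min is None:
            # First batch initializes the estimates directly.
            self.ema_min, self.ema_max = batch_min, batch_max
        else:
            m = self.momentum
            self.ema_min = (1 - m) * self.ema_min + m * batch_min
            self.ema_max = (1 - m) * self.ema_max + m * batch_max

    def normalize(self, batch):
        """Min-max scale features using the current EMA estimates."""
        scale = np.maximum(self.ema_max - self.ema_min, 1e-8)
        return (batch - self.ema_min) / scale

# Usage: as the data distribution drifts across tasks, the running
# estimates track the new feature ranges instead of staying fixed.
rng = np.random.default_rng(0)
norm = EMANormalizer(momentum=0.1)
for task in range(5):
    x = rng.normal(loc=task, size=(64, 3))  # simulated distribution shift
    norm.update(x)
    x_norm = norm.normalize(x)
```

The momentum controls the stability/plasticity trade-off the paper targets: a small momentum keeps normalization stable for previously learned tasks, while still letting the scales drift toward new distributions.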