AI Navigate

Elastic Weight Consolidation Done Right for Continual Learning

arXiv cs.LG / 3/20/2026


Key Points

  • The paper analyzes Elastic Weight Consolidation (EWC) and identifies that its reliance on the Fisher Information Matrix (FIM) can cause gradient vanishing and inaccurate importance estimation in certain scenarios.
  • It also shows that Memory Aware Synapses (MAS), a variant of EWC, imposes redundant protection on parameters irrelevant to prior tasks, leading to suboptimal performance.
  • The authors introduce Logits Reversal (LR), a simple modification that reverses logit values during FIM calculation to correct importance estimation and reduce gradient vanishing (see the sketch after this list).
  • The proposed approach, named EWC-DR (EWC Done Right), significantly outperforms existing EWC variants across multiple continual learning tasks and datasets.
  • The work positions EWC-DR as an effective, lightweight improvement to established regularization-based continual learning methods.
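
As a concrete reference point, the sketch below shows how EWC's diagonal Fisher importance is commonly estimated in PyTorch, with an optional logits-reversal hook at the step the paper targets. This is a minimal illustration, not the authors' code: the exact form of the Logits Reversal operation is not specified in this summary, so the sketch assumes it negates the logits before the log-softmax, and `diagonal_fisher` and its arguments are hypothetical names.

```python
import torch
import torch.nn.functional as F

def diagonal_fisher(model, loader, device, reverse_logits=False):
    """Estimate the diagonal Fisher Information used by EWC.

    Accumulates squared gradients of the log-likelihood over a dataset.
    When reverse_logits=True, a hypothetical Logits Reversal step
    (assumed here to negate the logits) is applied before gradients
    are taken; this is the point the paper modifies.
    """
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    n_batches = 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        logits = model(x)
        if reverse_logits:
            logits = -logits  # assumed form of the Logits Reversal (LR) step
        # Empirical Fisher: squared gradients of the log-likelihood of the
        # observed labels (a common EWC implementation choice; the true
        # Fisher samples labels from the model's own predictions instead).
        loss = F.nll_loss(F.log_softmax(logits, dim=1), y)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        n_batches += 1
    # Average squared batch gradients; per-sample accumulation is also common.
    return {n: f / max(n_batches, 1) for n, f in fisher.items()}
```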

Abstract

Weight regularization methods in continual learning (CL) alleviate catastrophic forgetting by assessing and penalizing changes to important model weights. Elastic Weight Consolidation (EWC) is a foundational and widely used approach within this framework that estimates weight importance from gradients. However, it has consistently shown suboptimal performance. In this paper, we conduct a systematic analysis of importance estimation in EWC from a gradient-based perspective. For the first time, we find that EWC's reliance on the Fisher Information Matrix (FIM) results in gradient vanishing and inaccurate importance estimation in certain scenarios. Our analysis also reveals that Memory Aware Synapses (MAS), a variant of EWC, imposes unnecessary constraints on parameters irrelevant to prior tasks, a phenomenon we term redundant protection. Consequently, both EWC and its variants exhibit fundamental misalignments in estimating weight importance, leading to inferior performance. To tackle these issues, we propose the Logits Reversal (LR) operation, a simple yet effective modification that rectifies EWC's importance estimation: reversing the logit values during the calculation of the FIM effectively prevents both gradient vanishing and redundant protection. Extensive experiments across various CL tasks and datasets show that the proposed method significantly outperforms EWC and its existing variants. We therefore refer to it as EWC Done Right (EWC-DR).
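
For context on where the FIM enters: standard EWC penalizes movement of each parameter away from its value after the previous task, weighted by the Fisher diagonal, i.e. L = L_new + (lambda / 2) * sum_i F_i (theta_i - theta*_i)^2. Since LR only changes how F is computed, the penalty itself is unchanged. Below is a minimal sketch of that standard penalty, reusing the `fisher` dict from the sketch above; `ewc_penalty` and `old_params` are illustrative names, not the paper's API.

```python
import torch

def ewc_penalty(model, fisher, old_params, lam):
    """Standard EWC quadratic penalty: (lam / 2) * sum_i F_i (theta_i - theta*_i)^2.

    fisher: dict of per-parameter importance (e.g., from diagonal_fisher above).
    old_params: dict of parameter tensors saved (detached, on the same device)
    right after training on the previous task.
    """
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * penalty

# Illustrative use while training on a new task:
#   loss = task_loss + ewc_penalty(model, fisher, old_params, lam=100.0)
#   loss.backward()
```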