Parameter Importance is Not Static: Evolving Parameter Isolation for Supervised Fine-Tuning

arXiv cs.LG / April 16, 2026


Key Points

  • The paper argues that parameter importance during supervised fine-tuning (SFT) is not fixed and can drift over the course of training, undermining “static” parameter-isolation methods.
  • It introduces Evolving Parameter Isolation (EPI), which periodically updates which parameters are isolated by using online, gradient-based estimates rather than freezing a fixed subset.
  • By releasing outdated isolated parameters and protecting newly emerging task-critical ones, EPI aims to balance stability (reducing forgetting) and plasticity (recovering adaptability).
  • Experiments across multiple multi-task benchmarks show EPI reduces task interference and catastrophic forgetting compared with static isolation and standard SFT, while also improving generalization.
  • The work emphasizes that isolation mechanisms should be synchronized with the temporal dynamics of learning when fine-tuning models on diverse abilities.
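The mechanism described above — periodically re-estimating parameter importance from gradients and refreshing which parameters are frozen — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the importance score (gradient magnitude), top-k threshold rule, and update period here are illustrative assumptions.

```python
import numpy as np

def update_mask(grads, keep_frac=0.2):
    # Isolate (freeze) the top keep_frac of parameters by an
    # assumed importance score: gradient magnitude.
    scores = np.abs(grads)
    k = max(1, int(keep_frac * scores.size))
    thresh = np.partition(scores.ravel(), -k)[-k]
    return scores >= thresh  # True = isolated / protected

def sgd_step(params, grads, mask, lr=0.1):
    # Masked SGD update: isolated parameters receive no update.
    return params - lr * grads * (~mask)

# Toy loop: unlike static isolation, the mask is re-estimated
# every `period` steps, releasing stale parameters and
# protecting newly important ones.
rng = np.random.default_rng(0)
params = rng.normal(size=8)
mask = np.zeros(8, dtype=bool)
period = 5
for step in range(20):
    grads = params  # gradient of a toy loss 0.5 * ||params||^2
    if step % period == 0:
        mask = update_mask(grads)  # evolving isolation
    params = sgd_step(params, grads, mask)
```

A static-isolation baseline would compute `mask` once before the loop and never touch it again; the only change EPI-style methods make to this skeleton is the periodic `update_mask` call inside the loop.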

Abstract

Supervised Fine-Tuning (SFT) of large language models often suffers from task interference and catastrophic forgetting. Recent approaches alleviate this issue by isolating task-critical parameters during training. However, these methods represent a static solution to a dynamic problem, assuming that parameter importance remains fixed once identified. In this work, we empirically demonstrate that parameter importance exhibits temporal drift over the course of training. To address this, we propose Evolving Parameter Isolation (EPI), a fine-tuning framework that adapts isolation decisions based on online estimates of parameter importance. Instead of freezing a fixed subset of parameters, EPI periodically updates isolation masks using gradient-based signals, enabling the model to protect emerging task-critical parameters while releasing outdated ones to recover plasticity. Experiments on diverse multi-task benchmarks demonstrate that EPI consistently reduces interference and forgetting compared to static isolation and standard fine-tuning, while improving overall generalization. Our analysis highlights the necessity of synchronizing isolation mechanisms with the evolving dynamics of learning diverse abilities.