Towards Scalable Lifelong Knowledge Editing with Selective Knowledge Suppression

arXiv cs.AI / 4/22/2026


Key Points

  • The paper introduces LightEdit, a framework for scalable lifelong knowledge editing that updates specific facts in LLMs without full retraining.
  • It improves edit stability across sequential edits by selecting the relevant knowledge from retrieved evidence and then applying a decoding strategy that suppresses the probabilities of the model's original (now outdated) knowledge.
  • Existing parameter-editing methods are noted to suffer from instability and catastrophic forgetting during sequential edits, while retrieval-based methods can be limited by high training costs.
  • Experiments on ZSRE, Counterfact, and RIPE show that LightEdit outperforms prior lifelong knowledge editing approaches.
  • By reducing training costs, LightEdit is positioned as a cost-effective approach that can be adapted to different datasets more easily than prior retrieval-heavy methods.
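The first step above, selecting the relevant fact from retrieved evidence, can be sketched with a simple similarity-based selector. This is a hypothetical illustration, not the paper's implementation: a real system would use learned embeddings rather than the bag-of-words cosine similarity used here, and `select_fact`, the example query, and the fact strings are all assumptions.

```python
# Hypothetical sketch of the "select relevant knowledge" step: score each
# retrieved fact against the query with a bag-of-words cosine similarity
# and keep the best match. A real retriever would use learned embeddings.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_fact(query: str, retrieved: list[str]) -> str:
    q = Counter(query.lower().split())
    return max(retrieved, key=lambda f: cosine(q, Counter(f.lower().split())))

facts = [
    "The Eiffel Tower is located in Paris.",
    "Water boils at 100 degrees Celsius.",
]
chosen = select_fact("Where is the Eiffel Tower?", facts)
```

The selected fact would then be used to modify the query before generation, so the model conditions on the updated evidence rather than on its stale parametric knowledge.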

Abstract

Large language models (LLMs) require frequent knowledge updates to reflect changing facts and mitigate hallucinations. To meet this demand, lifelong knowledge editing has emerged as a continual approach to modifying specific pieces of knowledge without retraining the entire model. Existing parameter editing methods struggle with stability during sequential edits due to catastrophic forgetting. While retrieval-based approaches have been proposed to alleviate this issue, their applicability remains limited across various datasets because of high training costs. To address these limitations and enhance scalability in lifelong settings, we propose LightEdit. Our framework first selects relevant knowledge from retrieved information to modify the query effectively. It then incorporates a decoding strategy to suppress the model's original knowledge probabilities, thereby enabling efficient edits based on the selected information. Extensive experiments on ZSRE, Counterfact, and RIPE benchmarks demonstrate that LightEdit outperforms existing lifelong knowledge editing methods. Furthermore, by minimizing training costs, LightEdit achieves cost-effective scalability, enabling easy adaptation to various datasets.
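The decoding strategy described above, suppressing the model's original knowledge probabilities, can be sketched as a contrastive-style adjustment at each decoding step. This is a minimal illustration under assumptions: the paper's exact formulation is not given here, and `suppress_decode`, the `alpha` weight, and the toy logits are all hypothetical.

```python
# Minimal sketch of decoding-time knowledge suppression (assumed form):
# combine the edited-context logits with a scaled penalty from the base
# model's logits, so tokens favored only by the stale parametric
# knowledge are downweighted before sampling.
import math

def suppress_decode(edited_logits, base_logits, alpha=1.0):
    # score = edited logit minus alpha times the base (original-knowledge) logit
    scores = [e - alpha * b for e, b in zip(edited_logits, base_logits)]
    # numerically stable softmax over the adjusted scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [x / z for x in exps]

# Toy 3-token vocabulary: the base model strongly prefers token 0 (the
# stale fact), while the edited evidence slightly prefers token 1.
base_logits = [4.0, 1.0, 0.0]
edited_logits = [3.0, 3.5, 0.0]
probs = suppress_decode(edited_logits, base_logits, alpha=1.0)
best_token = probs.index(max(probs))
```

With suppression the stale token loses its advantage and the updated token wins, which is the intuition behind pushing the output toward the edited knowledge without touching model parameters.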