Towards Adaptive Continual Model Merging via Manifold-Aware Expert Evolution
arXiv cs.LG, April 27, 2026
Key Points
- The paper addresses limitations of Continual Model Merging (CMM), highlighting a saturation–redundancy dilemma in backbone-centric methods and redundancy/routing bottlenecks in MoE variants.
- It proposes MADE-IT, an adaptive CMM approach that uses manifold-aware expert evolution to manage expert representation diversity while keeping the architecture compact.
- MADE-IT introduces a projection-based subspace affinity metric and a distribution-aware adaptive threshold to decide when and how experts should evolve autonomously.
- It also avoids parameterized gating networks by using a data-free, training-free implicit routing method that activates experts through feature–subspace alignment.
- Experiments reportedly show MADE-IT improves accuracy and robustness over long-horizon and shuffled task sequences, while pruning redundant experts, especially in generic modules and early layers.
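The core mechanisms in the key points above can be sketched concretely. The snippet below is an illustrative toy, not the paper's implementation: it assumes each expert is summarized by an orthonormal basis of its dominant parameter/feature subspace (e.g., top singular vectors of its task vector), and the function names, the `alpha` parameter, and all dimensions are hypothetical.

```python
import numpy as np

def subspace_affinity(basis_a: np.ndarray, basis_b: np.ndarray) -> float:
    """Projection-based subspace affinity (illustrative).

    basis_a, basis_b: (d, k) matrices with orthonormal columns.
    Returns the average fraction of basis_a's energy retained when
    projected onto span(basis_b); lies in [0, 1].
    """
    proj = basis_b @ (basis_b.T @ basis_a)  # project A's basis onto span(B)
    return float(np.sum(proj ** 2) / basis_a.shape[1])

def should_evolve(affinities: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """Distribution-aware adaptive threshold (toy version): a new expert
    evolves independently only where its affinity falls below
    mean - alpha * std of the observed affinity distribution."""
    threshold = affinities.mean() - alpha * affinities.std()
    return affinities < threshold

def implicit_route(feature: np.ndarray, expert_bases: list) -> int:
    """Data-free, gate-free routing sketch: activate the expert whose
    subspace best aligns with the incoming feature vector."""
    scores = [np.linalg.norm(B.T @ feature) / (np.linalg.norm(feature) + 1e-12)
              for B in expert_bases]
    return int(np.argmax(scores))

# Toy usage: two experts spanning orthogonal planes in R^4.
B1 = np.eye(4)[:, :2]  # spans e1, e2
B2 = np.eye(4)[:, 2:]  # spans e3, e4
print(subspace_affinity(B1, B2))  # orthogonal subspaces -> 0.0
print(implicit_route(np.array([0.1, 0.0, 2.0, 1.0]), [B1, B2]))  # -> 1
```

The routing step uses no learned gating parameters, which matches the article's claim that experts are activated purely by feature–subspace alignment; the affinity threshold adapts to the empirical distribution rather than being fixed.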