Discovering Novel LLM Experts via Task-Capability Coevolution
arXiv cs.AI · April 17, 2026
Key Points
- The paper proposes AC/DC, a framework that uses open-ended coevolution of LLMs and tasks to discover increasingly novel skills in a single continuous run.
- AC/DC evolves both components: it updates LLM populations through model merging and expands task diversity by generating natural-language tasks with synthetic data.
- Experiments report that the resulting LLM archives can exceed larger models in capability coverage on downstream benchmarks while using less GPU memory, without any explicit benchmark optimization.
- The authors claim that AC/DC’s coverage improves over time and that its archives perform better under multi-agent best-of-N selection, supporting coevolution as a new paradigm for LLM development.
- The work frames coevolution as a way to accelerate continual diversity improvements by leveraging existing base models as stepping stones toward more capable models.
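The loop described in the points above can be sketched as a toy quality-diversity process. This is a minimal illustrative sketch, not the paper's implementation: the "models" are just capability vectors, `merge` stands in for weight merging, `generate_task` stands in for natural-language task generation, and all constants (`NUM_SKILLS`, the `0.05` nudge, archive size) are invented for the example.

```python
import random

random.seed(0)

NUM_SKILLS = 8  # toy capability space: one score per "skill"

def random_model():
    # A "model" here is just a capability vector in [0, 1]^NUM_SKILLS.
    return [random.random() for _ in range(NUM_SKILLS)]

def merge(a, b, alpha):
    # Stand-in for model merging: linear interpolation of capabilities.
    return [alpha * x + (1.0 - alpha) * y for x, y in zip(a, b)]

def generate_task(archive):
    # Stand-in for task generation: target the skill the archive
    # currently covers worst, so task diversity expands over time.
    coverage = [max(m[i] for m in archive) for i in range(NUM_SKILLS)]
    return coverage.index(min(coverage))

def coevolve(steps=200, archive_size=16):
    archive = [random_model() for _ in range(4)]
    for _ in range(steps):
        task = generate_task(archive)
        # Parents: the two archive members strongest on this task.
        p0, p1 = sorted(archive, key=lambda m: m[task], reverse=True)[:2]
        child = merge(p0, p1, random.random())
        # A small "training" nudge toward the targeted task.
        child[task] = min(1.0, child[task] + 0.05)
        archive.append(child)
        if len(archive) > archive_size:
            # Cull the member with the weakest best skill to stay bounded.
            archive.remove(min(archive, key=max))
    return archive

archive = coevolve()
coverage = [max(m[i] for m in archive) for i in range(NUM_SKILLS)]
```

The point of the sketch is the feedback loop: tasks are generated where coverage is weakest, and new models are bred from existing ones as stepping stones, which is the coevolutionary dynamic the paper's framing relies on.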