NORACL: Neurogenesis for Oracle-free Resource-Adaptive Continual Learning
arXiv cs.LG / 5/1/2026
📰 News · Ideas & Deep Analysis · Models & Research
Key Points
- The paper frames the stability–plasticity dilemma in continual learning as an architectural issue: fixed-capacity networks cannot anticipate the number of future tasks or how much their features will overlap.
- It argues that regularization-based continual learning methods implicitly depend on an “oracle” architecture sized for the unknown future, leading to over-provisioning or eventual loss of plasticity depending on task relationships.
- NORACL is proposed as a neurogenesis-inspired approach that starts from a compact model and grows neurons only when representational and plasticity saturation signals indicate it is needed.
- Experiments compare NORACL with oracle-sized static baselines across different numbers of tasks and task geometry, showing equal or better final accuracy while using fewer parameters.
- The authors' interpretability analysis shows that growth is structured: dissimilar tasks expand earlier feature-extraction layers, while overlapping tasks shift growth toward later feature-combination layers, in both cases creating fresh capacity for new tasks.
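The growth trigger described above can be sketched as a small controller. Note this is an illustrative toy, not the paper's implementation: the exact definitions of the representational and plasticity saturation signals, the `threshold`, and the `grow_fraction` growth rate are all assumptions made for this sketch.

```python
class NeurogenesisController:
    """Toy sketch of saturation-gated growth: add hidden units only when
    both a representational-saturation signal and a plasticity-saturation
    signal are high. Signal definitions and hyperparameters here are
    hypothetical stand-ins, not the paper's actual criteria."""

    def __init__(self, n_units, grow_fraction=0.25, threshold=0.8):
        self.n_units = n_units
        self.grow_fraction = grow_fraction  # assumed proportional growth rate
        self.threshold = threshold          # assumed saturation trigger level

    def maybe_grow(self, repr_saturation, plasticity_saturation):
        """Return the number of units added (0 if no growth is triggered).

        Growth fires only when BOTH signals exceed the threshold, i.e. the
        existing capacity can neither represent the new task well nor adapt
        to it without disturbing old tasks.
        """
        if min(repr_saturation, plasticity_saturation) >= self.threshold:
            added = max(1, int(self.n_units * self.grow_fraction))
            self.n_units += added
            return added
        return 0


ctrl = NeurogenesisController(n_units=32)
ctrl.maybe_grow(0.5, 0.9)   # one signal below threshold: no growth
ctrl.maybe_grow(0.85, 0.9)  # both saturated: layer grows
```

Requiring both signals captures the summary's point that the model starts compact and expands only when saturation indicates capacity is genuinely exhausted, rather than growing on every new task.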