SCORE: Replacing Layer Stacking with Contractive Recurrent Depth
arXiv cs.LG / 3/12/2026
Key Points
- SCORE introduces a discrete recurrent depth by repeatedly applying a single shared block with an ODE-inspired contractive update h_{t+1} = (1 − d_t) · h_t + d_t · F(h_t), offering depth-by-iteration refinement without multiple independent layers.
- Unlike Neural ODEs, SCORE uses a fixed number of discrete steps and standard backpropagation, avoiding numerical ODE solvers and adjoint-based gradient computation.
- The method reduces parameter count through shared weights and shows improved convergence speed across graph neural networks, multilayer perceptrons, and Transformer-based language models like nanoGPT.
- Empirically, simple Euler integration provides the best trade-off between compute and performance, while higher-order integrators yield marginal gains at extra cost.
- The results suggest contractive residual updates as a lightweight, effective alternative to classical stacking across diverse architectures.
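The contractive update in the first key point can be sketched in a few lines. This is a minimal illustration, not the authors' code: `F` here is a toy scalar map and `d` a fixed step size (the paper's d_t may be learned or scheduled), chosen so the convex mix visibly contracts toward a fixed point.

```python
def score_refine(h0, F, T=20, d=0.5):
    """Apply one shared block F for T discrete steps via the
    SCORE-style contractive update h <- (1 - d)*h + d*F(h).

    This convex mix of the old state and the block output is a
    plain Euler-style step; depth comes from iteration, not from
    stacking T independent layers.
    """
    h = h0
    for _ in range(T):
        h = (1 - d) * h + d * F(h)
    return h

# Toy block: F(h) = 0.5*h + 1 has fixed point h* = 2, and the
# mixed update contracts toward it from any starting state.
approx = score_refine(h0=0.0, F=lambda h: 0.5 * h + 1.0)
```

In a real network `F` would be a shared parameterized block (e.g. a Transformer layer) applied to a hidden-state tensor, with the same weights reused at every step, which is where the parameter savings over classical stacking come from.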