SCORE: Replacing Layer Stacking with Contractive Recurrent Depth
arXiv cs.LG / 3/12/2026
Key Points
- SCORE introduces a discrete recurrent depth by repeatedly applying a single shared block with an ODE-inspired contractive update h_{t+1} = (1 - dt) * h_t + dt * F(h_t), offering depth-by-iteration refinement without multiple independent layers.
- Unlike Neural ODEs, SCORE uses a fixed number of discrete steps and standard backpropagation, avoiding solvers and adjoint methods.
- The method reduces parameter count through shared weights and shows improved convergence speed across graph neural networks, multilayer perceptrons, and Transformer-based language models like nanoGPT.
- Empirically, simple Euler integration provides the best trade-off between compute and performance, while higher-order integrators yield marginal gains at extra cost.
- The results suggest contractive residual updates as a lightweight, effective alternative to classical stacking across diverse architectures.
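The contractive update described in the Key Points can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `score_refine`, the shared block `F`, and the toy contraction below are hypothetical names chosen for the example, and `dt` is treated as a fixed scalar step size.

```python
import numpy as np

def score_refine(h, F, dt=0.5, steps=10):
    """Depth-by-iteration refinement: repeatedly apply one shared
    block F via the contractive Euler-style residual update
    h_{t+1} = (1 - dt) * h_t + dt * F(h_t).
    A fixed number of discrete steps replaces a stack of layers."""
    for _ in range(steps):
        h = (1.0 - dt) * h + dt * F(h)
    return h

# Toy shared block: a linear map that contracts toward zero,
# so each iteration shrinks the state (a stand-in for a trained block).
F = lambda h: 0.5 * h

h0 = np.ones(4)
h_final = score_refine(h0, F, dt=0.5, steps=10)
print(h_final)
```

With this toy `F`, each step computes h = 0.5 * h + 0.25 * h = 0.75 * h, so the state decays geometrically toward the fixed point at zero, illustrating why the update is contractive whenever `F` does not expand the state faster than the (1 - dt) damping shrinks it.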
Related Articles

Hey dev.to community – sharing my journey with Prompt Builder, Insta Posts, and practical SEO
Dev.to

How to Build Passive Income with AI in 2026: A Developer's Practical Guide
Dev.to

The Research That Doesn't Exist
Dev.to

Easing the burden on veteran engineers of training juniors: using AI to generate "ladder diagrams" for PLC control
日経XTECH

Jeff Bezos reportedly wants $100 billion to buy and transform old manufacturing firms with AI
TechCrunch