On the Role of Depth in the Expressivity of RNNs
arXiv cs.LG / 4/3/2026
Key Points
- The paper analyzes how increasing depth affects the expressive power of recurrent neural networks (RNNs), showing that depth increases memory capacity efficiently relative to the number of parameters.
- It argues that deeper RNNs gain expressivity not only by computing more complex transformations of the current input but also by retaining more information about past inputs (see the stacked-RNN sketch after this list).
- The study extends the theory to second-order RNNs (2RNNs), in which multiplicative interactions between inputs and hidden states produce polynomial transformations whose maximum degree grows with depth (see the 2RNN sketch below).
- It further demonstrates that, in general, these multiplicative interactions cannot be emulated simply by adding layerwise nonlinearities.
- The authors support the theoretical claims with experiments on both synthetic setups and real-world tasks.
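To make the second point concrete, here is a minimal numpy sketch of a stacked (deep) tanh RNN. This is an illustration of the general architecture the paper studies, not the authors' specific construction; all shapes, the tanh nonlinearity, and the zero initialization are assumptions chosen for readability. It shows how a depth-L stack composes L nonlinear maps at every time step while each layer also carries its own recurrent memory of the past.

```python
import numpy as np

def deep_rnn_forward(x_seq, Ws_in, Ws_rec, bs):
    """Run a stacked tanh RNN over a sequence.

    x_seq  : (T, d_in) input sequence
    Ws_in  : per-layer input weights; layer l maps the output of
             layer l-1 (or the raw input, for l=0) into layer l
    Ws_rec : per-layer recurrent weights, one (d_h, d_h) matrix each
    bs     : per-layer biases, one (d_h,) vector each
    Returns the final hidden state of the top layer.
    """
    L = len(Ws_rec)
    d_h = Ws_rec[0].shape[0]
    h = [np.zeros(d_h) for _ in range(L)]  # one hidden state per layer
    for x_t in x_seq:
        inp = x_t
        for l in range(L):
            # Each layer applies a nonlinearity to (bottom-up input +
            # its own recurrent state), so depth adds compositional
            # processing per step while every layer keeps memory.
            h[l] = np.tanh(Ws_in[l] @ inp + Ws_rec[l] @ h[l] + bs[l])
            inp = h[l]
    return h[-1]

# Tiny usage example with random weights (shapes are illustrative).
rng = np.random.default_rng(0)
T, d_in, d_h, L = 10, 3, 4, 2
x_seq = rng.normal(size=(T, d_in))
Ws_in = [rng.normal(size=(d_h, d_in if l == 0 else d_h)) for l in range(L)]
Ws_rec = [rng.normal(size=(d_h, d_h)) for _ in range(L)]
bs = [np.zeros(d_h) for _ in range(L)]
print(deep_rnn_forward(x_seq, Ws_in, Ws_rec, bs))
```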
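The 2RNN claim can be illustrated the same way. The sketch below assumes a purely bilinear update (no elementwise nonlinearity), which is one common definition of a second-order RNN; the tensor shapes and the stacking scheme are again illustrative, not the paper's exact formulation. Because the input multiplies the previous hidden state at each step, the hidden state is a polynomial in the inputs, and feeding one layer's states into a second layer compounds the maximum degree:

```python
import numpy as np

def two_rnn_layer(x_seq, A, h0):
    """One second-order (2RNN) layer with a bilinear update:
        h_t[i] = sum_{j,k} A[i, j, k] * x_t[j] * h_{t-1}[k].
    Since x_t multiplies h_{t-1}, each step raises the polynomial
    degree of h_t as a function of the inputs; stacking layers
    compounds this growth in maximum degree.
    """
    h = h0
    outputs = []
    for x_t in x_seq:
        h = np.einsum('ijk,j,k->i', A, x_t, h)
        outputs.append(h)
    return np.stack(outputs)

# Stack two 2RNN layers: the second layer consumes the first layer's
# hidden states, so its state is a polynomial of higher maximum degree
# in the original inputs than a single layer could produce.
rng = np.random.default_rng(1)
T, d = 6, 3
x_seq = rng.normal(size=(T, d))
A1 = rng.normal(size=(d, d, d)) * 0.5
A2 = rng.normal(size=(d, d, d)) * 0.5
h0 = np.ones(d)
layer1_out = two_rnn_layer(x_seq, A1, h0)
layer2_out = two_rnn_layer(layer1_out, A2, h0)
print(layer2_out[-1])
```

In this toy setup a single layer's state at time t has degree at most t in the inputs, while the second layer's state is a polynomial in already-high-degree quantities, which is the depth-driven degree growth the key point refers to.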