On Dominant Manifolds in Reservoir Computing Networks
arXiv cs.LG / 4/8/2026
Key Points
- The paper studies how training recurrent Reservoir Computing (RC) networks shapes the geometry of their dynamics, focusing on the emergence of low-dimensional “dominant manifolds” for temporal forecasting.
- For a simplified linear, continuous-time reservoir model, it derives explicit links between the dimensionality/structure of dominant modes and the intrinsic dimensionality and information content of the training data.
- When training data come from an autonomous dynamical system, the dominant modes are shown to approximate Koopman eigenfunctions of the underlying system.
- The authors clarify a direct relationship between reservoir computing and Dynamic Mode Decomposition (DMD) by connecting dominant reservoir modes to Koopman-based representations.
- In simulations, the authors illustrate how the motion of reservoir eigenvalues during training gives rise to dominant manifolds, and they discuss extending the framework to nonlinear RC via tangent dynamics and differential p-dominance.
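The RC–DMD connection above rests on the fact that, for linear dynamics, DMD recovers the Koopman eigenvalues directly from snapshot data. The following sketch (not from the paper; matrix values and variable names are illustrative assumptions) runs exact DMD on a trajectory of a known linear system and checks that the recovered eigenvalues match the true ones:

```python
import numpy as np

# Illustrative sketch: exact DMD on snapshots of a linear system x_{t+1} = A x_t.
# For linear dynamics, the Koopman eigenvalues coincide with the eigenvalues of A,
# so DMD recovers them from data alone.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])  # assumed toy dynamics; eigenvalues 0.9 and 0.8

# Collect a trajectory of snapshots from a random initial state.
x = rng.standard_normal(2)
snapshots = [x]
for _ in range(50):
    x = A @ x
    snapshots.append(x)
X = np.column_stack(snapshots[:-1])  # states at time t
Y = np.column_stack(snapshots[1:])   # states at time t+1

# Exact DMD: least-squares fit Y ≈ A_dmd X, then eigendecompose the fit.
A_dmd = Y @ np.linalg.pinv(X)
dmd_eigs = np.sort(np.linalg.eigvals(A_dmd).real)
print(dmd_eigs)  # ≈ [0.8, 0.9], the eigenvalues of A
```

In the paper's setting, the dominant modes of a trained linear reservoir play the role of `A_dmd`'s eigenvectors: they approximate Koopman eigenfunctions of the system generating the training data.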




