Continuous Limits of Coupled Flows in Representation Learning

arXiv cs.LG · April 21, 2026


Key Points

  • The paper proposes a rigorous theoretical framework for decentralized representation learning by modeling it as a coupled slow-fast dynamical system on Riemannian manifolds.
  • It proves, using measure-theoretic limits, that discrete spatial transitions converge uniformly to an overdamped Langevin SDE on continuous data manifolds.
  • The authors further show that the representation weights unconditionally avoid divergence and align strictly with the principal eigenspace of the spatial measure; the proof uses the Itô-Poisson resolvent together with a stochastic extension of LaSalle's Invariance Principle.
  • By constructing a joint Lyapunov functional for the fully coupled spatial–parametric flow, the work establishes global dissipativity and argues that orthogonally disentangled, linearly separable features emerge at the stationary limit.
  • Overall, the contribution is a bridge between discrete decentralized algorithms and continuous stochastic analysis, offering a formal baseline for future theory in decentralized representation learning.
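The discrete-to-continuous limit in the second key point can be pictured with a standard Euler-Maruyama scheme: as the step size shrinks, the discrete chain approximates an overdamped Langevin SDE. The sketch below is illustrative only; the quadratic potential `U(x) = |x|^2 / 2` and every parameter are hypothetical stand-ins, not the paper's actual spatial dynamics.

```python
import numpy as np

def grad_U(x):
    # Gradient of the toy potential U(x) = |x|^2 / 2 (a hypothetical choice).
    return x

def euler_maruyama(x0, beta=4.0, dt=1e-3, n_steps=5_000, seed=0):
    """Discrete chain x_{k+1} = x_k - grad U(x_k) dt + sqrt(2 dt / beta) * xi_k,
    which approximates dX_t = -grad U(X_t) dt + sqrt(2/beta) dW_t as dt -> 0."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - grad_U(x) * dt + np.sqrt(2.0 * dt / beta) * noise
    return x

# The stationary density of this diffusion is proportional to exp(-beta * U),
# so long-run samples should have variance near 1/beta.
samples = np.array([euler_maruyama([2.0], seed=s)[0] for s in range(200)])
print(samples.var())
```

This only demonstrates the limit numerically; the paper's contribution is the measure-theoretic proof that the convergence is uniform on the data manifold.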

Abstract

While modern representation learning relies heavily on global error signals, decentralized algorithms driven by local interactions offer a fundamental distributed alternative. However, the macroscopic convergence properties of these discrete dynamics on continuous data manifolds remain theoretically unresolved, notoriously suffering from parameter explosion. We bridge this gap by formalizing decentralized learning as a coupled slow-fast dynamical system on Riemannian manifolds. First, using measure-theoretic limits, we prove that the discrete spatial transitions converge uniformly to an overdamped Langevin stochastic differential equation. Second, via the Itô-Poisson resolvent and a stochastic extension of LaSalle's Invariance Principle, we establish that the representation weights unconditionally avoid divergence and align strictly with the principal eigenspace of the spatial measure. Finally, we construct a joint Lyapunov functional for the fully coupled spatial-parametric flow. This proves global dissipativity and demonstrates that orthogonally disentangled, linearly separable features emerge spontaneously at the stationary limit. Our framework bridges discrete algorithms with continuous stochastic analysis, providing a formal theoretical baseline for decentralized representation learning.
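The eigenspace-alignment behaviour described in the abstract has a classical finite-dimensional analogue: Oja's rule, a local Hebbian update whose weight vector stays bounded and converges to the principal eigenvector of the data covariance. The sketch below uses Oja's rule as a stand-in; the anisotropic Gaussian data, step size, and sample count are all hypothetical, and this is not the paper's coupled flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical anisotropic data: covariance diag(4, 1), so the principal
# eigenvector of the covariance is e_1 = [1, 0].
X = rng.standard_normal((20_000, 2)) * np.array([2.0, 1.0])

w = rng.standard_normal(2)
eta = 1e-3
for x in X:
    y = w @ x
    # Oja's rule: Hebbian growth term minus a normalizing decay term.
    # The decay keeps |w| bounded, mirroring the "no divergence" property.
    w += eta * y * (x - y * w)

w_hat = w / np.linalg.norm(w)
print(np.abs(w_hat))  # should lie close to the principal axis [1, 0]
```

The bounded-weights plus principal-alignment outcome here is exactly the qualitative endpoint the paper derives for its slow-fast system, but via stochastic Lyapunov and invariance arguments rather than the deterministic ODE analysis classically used for Oja's rule.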