Generative Path-Law Jump-Diffusion: Sequential MMD-Gradient Flows and Generalisation Bounds in Marcus-Signature RKHS

arXiv cs.LG / 4/8/2026


Key Points

  • The paper proposes a new generative framework, the Anticipatory Neural Jump-Diffusion (ANJD) flow, for synthesising forward-looking càdlàg stochastic trajectories that remain sequentially consistent with time-evolving path-law proxies.
  • It formulates path synthesis as a sequential matching problem on restricted Skorokhod manifolds and introduces AVNSG (Anticipatory Variance-Normalised Signature Geometry) to dynamically whiten the signature manifold for contractivity under regime shifts and discrete shocks.
  • The authors provide theory showing the joint generative flow acts as an infinitesimal steepest-descent direction for an MMD (Maximum Mean Discrepancy) objective relative to a moving target proxy; a toy sketch of such an MMD flow appears after this list.
  • They derive statistical generalisation bounds in a restricted path space and analyse Rademacher complexity to characterise expressive power under heavy-tailed innovations.
  • A scalable implementation is presented using Nyström-compressed score matching and an anticipatory hybrid Euler–Maruyama–Marcus integration scheme, aimed at capturing non-commutative moments and high-order stochastic structure efficiently.
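
The key point above mentions Nyström-compressed score matching. The paper's specific compression of the Marcus-signature kernel is not reproduced here; the snippet below only sketches the generic Nyström step such a scheme would typically rest on: choose landmark points and build low-dimensional features whose inner products approximate the full kernel Gram matrix. The Gaussian kernel, the landmark count, and the random stand-in features are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # pairwise Gaussian kernel matrix between row-sets a and b
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def nystroem_features(x, landmarks, sigma=1.0, eps=1e-8):
    """Map x to m-dimensional features whose inner products approximate k(x, x')."""
    k_mm = gaussian_kernel(landmarks, landmarks, sigma)
    k_nm = gaussian_kernel(x, landmarks, sigma)
    # symmetric inverse square root of the landmark kernel block
    evals, evecs = np.linalg.eigh(k_mm)
    k_mm_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, eps))) @ evecs.T
    return k_nm @ k_mm_inv_sqrt

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 4))            # stand-in for (truncated) signature features
landmarks = x[rng.choice(len(x), 50, replace=False)]
phi = nystroem_features(x, landmarks)         # 1000 x 50 compressed representation
approx_gram = phi @ phi.T                     # low-rank approximation of the full Gram matrix
```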

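The second and third key points describe the generative flow as a steepest-descent direction for MMD against a moving target proxy, evaluated in a variance-normalised (whitened) signature geometry. The paper works with Marcus-signature features in an RKHS; the PyTorch sketch below is only a minimal stand-in: a particle MMD gradient flow with a Gaussian kernel, where a whitening operator re-estimated from a slowly drifting target is applied to both particle and target features before the MMD is computed. The Gaussian kernel, the toy feature dimension, the drift schedule and the `whitening_matrix` helper are illustrative assumptions, not the authors' AVNSG construction.

```python
import torch

def gaussian_kernel(a, b, sigma=1.0):
    # pairwise Gaussian kernel values between row-sets a and b
    return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # biased estimate of squared Maximum Mean Discrepancy
    return (gaussian_kernel(x, x, sigma).mean()
            - 2.0 * gaussian_kernel(x, y, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean())

def whitening_matrix(f, eps=1e-6):
    # inverse square root of the feature covariance (a crude "precision"-style operator)
    c = f - f.mean(0, keepdim=True)
    cov = c.T @ c / (f.shape[0] - 1)
    evals, evecs = torch.linalg.eigh(cov)
    return evecs @ torch.diag((evals + eps).rsqrt()) @ evecs.T

torch.manual_seed(0)
particles = torch.randn(256, 6, requires_grad=True)    # stand-in generated features
base_target = torch.randn(256, 6) * 1.5 + 0.5          # stand-in target feature sample

step = 0.5
for t in range(200):
    target = base_target + 0.002 * t                   # slowly drifting ("moving") proxy
    w = whitening_matrix(target)                       # re-estimated from the proxy each step
    loss = mmd2(particles @ w, target @ w)
    (grad,) = torch.autograd.grad(loss, particles)
    with torch.no_grad():
        particles -= step * grad                       # steepest-descent step on MMD^2
```
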
Abstract

This paper introduces a novel generative framework for synthesising forward-looking, càdlàg stochastic trajectories that are sequentially consistent with time-evolving path-law proxies, thereby incorporating anticipated structural breaks, regime shifts, and non-autonomous dynamics. By framing path synthesis as a sequential matching problem on restricted Skorokhod manifolds, we develop the Anticipatory Neural Jump-Diffusion (ANJD) flow, a generative mechanism that effectively inverts the time-extended Marcus-sense signature. Central to this approach is the Anticipatory Variance-Normalised Signature Geometry (AVNSG), a time-evolving precision operator that performs dynamic spectral whitening on the signature manifold to ensure contractivity during volatile regime shifts and discrete aleatoric shocks. We provide a rigorous theoretical analysis demonstrating that the joint generative flow constitutes an infinitesimal steepest descent direction for the Maximum Mean Discrepancy functional relative to a moving target proxy. Furthermore, we establish statistical generalisation bounds within the restricted path-space and analyse the Rademacher complexity of the whitened signature functionals to characterise the expressive power of the model under heavy-tailed innovations. The framework is implemented via a scalable numerical scheme involving Nyström-compressed score-matching and an anticipatory hybrid Euler-Maruyama-Marcus integration scheme. Our results demonstrate that the proposed method captures the non-commutative moments and high-order stochastic texture of complex, discontinuous path-laws with high computational efficiency.
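
The abstract closes with an "anticipatory hybrid Euler-Maruyama-Marcus integration scheme". The paper's anticipatory construction is not reproduced here; the sketch below only illustrates the generic hybrid step it refers to: an Euler-Maruyama update for the continuous part, and a Marcus-sense jump implemented by flowing an auxiliary ODE along the jump vector field, which is what keeps the jump update consistent under smooth changes of coordinates. The scalar SDE, its coefficients, the jump schedule, and the `euler_maruyama_marcus` helper are all illustrative assumptions.

```python
import numpy as np

def marcus_jump_map(x, jump, g, substeps=20):
    # Marcus-sense jump: instead of adding g(x) * jump directly, flow the
    # auxiliary ODE dy/ds = g(y) * jump from s = 0 to s = 1 starting at x
    y = x
    h = 1.0 / substeps
    for _ in range(substeps):
        y = y + h * g(y) * jump
    return y

def euler_maruyama_marcus(x0, drift, diffusion, g, jump_times, jump_sizes,
                          T=1.0, n_steps=500, seed=0):
    # one scalar path of dX = drift dt + diffusion dW + g(X) (Marcus) dL
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    jump_at = {int(round(t / dt)): s for t, s in zip(jump_times, jump_sizes)}
    x, path = float(x0), [float(x0)]
    for k in range(n_steps):
        # continuous part: plain Euler-Maruyama increment
        x = x + drift(x) * dt + diffusion(x) * np.sqrt(dt) * rng.standard_normal()
        # discontinuous part: apply the Marcus flow if a shock lands in this step
        if k in jump_at:
            x = marcus_jump_map(x, jump_at[k], g)
        path.append(x)
    return np.array(path)

# toy mean-reverting diffusion with one multiplicative shock at t = 0.5
path = euler_maruyama_marcus(
    x0=1.0,
    drift=lambda x: -0.5 * x,
    diffusion=lambda x: 0.2,
    g=lambda x: x,                     # multiplicative jump coefficient
    jump_times=[0.5], jump_sizes=[0.8],
)
```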
