Generative Path-Law Jump-Diffusion: Sequential MMD-Gradient Flows and Generalisation Bounds in Marcus-Signature RKHS
arXiv cs.LG / 4/8/2026
💬 Opinion · Ideas & Deep Analysis · Models & Research
Key Points
- The paper proposes a new generative framework, the Anticipatory Neural Jump-Diffusion (ANJD) flow, for synthesising forward-looking càdlàg stochastic trajectories that remain sequentially consistent with time-evolving path-law proxies.
- It formulates path synthesis as a sequential matching problem on restricted Skorokhod manifolds and introduces AVNSG (Anticipatory Variance-Normalised Signature Geometry), which dynamically whitens the signature manifold so that the flow stays contractive under regime shifts and discrete shocks (a whitening sketch follows this list).
- The authors provide theory showing that the joint generative flow acts as an infinitesimal steepest-descent direction for a Maximum Mean Discrepancy (MMD) objective relative to a moving target proxy (see the particle-level sketch below).
- They derive statistical generalisation bounds in a restricted path space and analyse Rademacher complexity to characterise expressive power under heavy-tailed innovations (the standard bound template is reproduced below).
- A scalable implementation is presented, combining Nyström-compressed score matching with an anticipatory hybrid Euler–Maruyama–Marcus integration scheme aimed at efficiently capturing non-commutative moments and high-order stochastic structure (an integrator sketch closes the section).
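
The key points do not spell out AVNSG's whitening rule, so the sketch below only illustrates the underlying idea: compute truncated path signatures over a batch of trajectories, then variance-normalise them with a ridge-regularised inverse square root of their empirical covariance. The names `depth2_signature` and `whiten` and the ridge parameter `eps` are illustrative assumptions, not the paper's API.

```python
import numpy as np

def depth2_signature(path):
    """Depth-2 signature of a piecewise-linear path of shape (T, d).

    Level 1 is the total increment; level 2 collects the iterated
    integrals S[i, j] = sum_{s<t} dX_s^i dX_t^j + 0.5 * dX_t^i dX_t^j.
    """
    inc = np.diff(path, axis=0)                       # (T-1, d) increments
    level1 = inc.sum(axis=0)                          # (d,)
    run = np.cumsum(inc, axis=0) - inc                # increments before t
    level2 = run.T @ inc + 0.5 * inc.T @ inc          # (d, d)
    return np.concatenate([level1, level2.ravel()])

def whiten(sigs, eps=1e-6):
    """Variance-normalise a batch of signatures of shape (n, p)."""
    centred = sigs - sigs.mean(axis=0)
    cov = centred.T @ centred / len(sigs)
    vals, vecs = np.linalg.eigh(cov)                  # cov is symmetric PSD
    inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T
    return centred @ inv_sqrt

rng = np.random.default_rng(0)
paths = rng.standard_normal((128, 50, 3)).cumsum(axis=1)   # 128 toy paths
sigs = np.stack([depth2_signature(p) for p in paths])
print(whiten(sigs).std(axis=0))                            # ~1 per coordinate
```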
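
To make the steepest-descent claim concrete, here is a minimal particle discretisation of an MMD gradient flow against a fixed (rather than moving) target under a Gaussian kernel; the bandwidth, step size, and toy data are assumptions, and none of the signature or jump machinery appears.

```python
import numpy as np

def rbf(x, y, bw):
    """Gaussian kernel matrix k(x_i, y_j) = exp(-|x_i - y_j|^2 / (2 bw^2))."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bw * bw))

def mmd_flow_step(x, y, bw=2.0, step=0.2):
    """Move particles x one step down the gradient of MMD^2(x, y).

    For the Gaussian kernel, grad_x k(x, y) = -(x - y) / bw^2 * k(x, y),
    so the flow attracts particles toward the target sample y while
    making them repel one another.
    """
    kxx, kxy = rbf(x, x, bw), rbf(x, y, bw)
    rep = (kxx[:, :, None] * (x[:, None, :] - x[None, :, :])).mean(axis=1)
    att = (kxy[:, :, None] * (x[:, None, :] - y[None, :, :])).mean(axis=1)
    grad = 2.0 * (att - rep) / (bw * bw)     # d MMD^2 / d x_i
    return x - step * grad

rng = np.random.default_rng(1)
x = rng.standard_normal((256, 2))            # source particles at the origin
y = rng.standard_normal((256, 2)) + 1.5      # shifted target sample
for _ in range(300):
    x = mmd_flow_step(x, y)
print(x.mean(axis=0))                        # drifts to roughly [1.5, 1.5]
```

In this toy picture, the paper's moving target proxy would correspond to updating `y` between steps rather than holding it fixed.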
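
The key points do not state the bound itself; for a loss bounded in [0, 1], the standard Rademacher-complexity template that such results typically instantiate reads as follows (generic notation, not the paper's):

```latex
% Empirical Rademacher complexity of a class F on the sample S = (z_1, ..., z_n)
\hat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\!\left[\sup_{f \in \mathcal{F}}
      \frac{1}{n}\sum_{i=1}^{n} \sigma_i\, f(z_i)\right],
  \qquad \sigma_i \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}\{\pm 1\}.

% With probability at least 1 - \delta over the draw of S, uniformly over f:
L(f) \;\le\; \hat{L}_S(f) + 2\,\hat{\mathfrak{R}}_S(\mathcal{F})
       + 3\sqrt{\frac{\log(2/\delta)}{2n}}.
```

Heavy-tailed innovations break the boundedness assumption behind this template, which is presumably why the analysis works in a restricted path space.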
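
As a sketch of the hybrid Euler–Maruyama–Marcus idea: handle the continuous part with an Euler–Maruyama step and, when a jump fires, apply the Marcus correction by flowing along the jump vector field over a fictitious unit time rather than adding the jump linearly. Everything below (`b`, `sigma`, `c`, scalar jump marks, at most one jump per step) is a simplified assumption, not the paper's scheme.

```python
import numpy as np

def marcus_em_step(x, dt, b, sigma, c, rng, jump_rate=0.5, ode_substeps=8):
    """One hybrid Euler-Maruyama / Marcus step.

    Continuous part: standard Euler-Maruyama. Jump part: if a Poisson
    jump fires in [t, t + dt], integrate the Marcus flow
    dphi/ds = c(phi) * dz over s in [0, 1] instead of adding c(x) * dz,
    which preserves the ordinary chain rule across jumps.
    """
    x = x + b(x) * dt + sigma(x) * np.sqrt(dt) * rng.standard_normal(x.shape)
    if rng.random() < jump_rate * dt:          # at most one jump per step
        dz = rng.standard_normal()             # stand-in jump-mark law
        phi, h = x.copy(), 1.0 / ode_substeps
        for _ in range(ode_substeps):          # Euler on the Marcus flow
            phi = phi + c(phi) * dz * h
        x = phi
    return x

rng = np.random.default_rng(2)
x = np.array([1.0])
for _ in range(1000):                          # dt = 1e-3, horizon 1.0
    x = marcus_em_step(x, 1e-3,
                       b=lambda v: -v,         # mean-reverting drift
                       sigma=lambda v: 0.2,    # constant volatility
                       c=lambda v: 0.5 * v,    # multiplicative jumps
                       rng=rng, jump_rate=2.0)
print(x)
```

With the multiplicative coefficient c(x) = 0.5x, the Marcus flow sends x to x * exp(0.5 dz) rather than x * (1 + 0.5 dz), so a positive state stays positive; this chain-rule behaviour is the usual reason for preferring Marcus over Itô jumps.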