Generative models on phase space
arXiv cs.AI / 4/6/2026
Key Points
- The paper studies how deep generative models (notably diffusion and flow matching) can learn and sample from high-dimensional distributions whose data lie on a physically constrained submanifold of the embedding space.
- For high-energy physics events represented as relativistic energy-momentum 4-vectors, it argues that enforcing physical laws only approximately can hurt interpretability and reliability.
- It proposes generative models constructed so that every sampling step stays on the manifold of massless N-particle Lorentz-invariant phase space in the center-of-momentum frame (the standard phase-space measure is written out after this list).
- For diffusion models, it shows that the forward “pure noise” endpoint corresponds to a uniform distribution on phase space, offering a principled baseline for analyzing how particle correlations emerge during reverse denoising; a sketch of sampling that uniform endpoint also follows this list.
- The authors demonstrate learning of few- and many-particle distributions with different singularity structures, and position the work for future interpretability studies on simulated jet data.
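
For context: the bullets above lean on the notion of “Lorentz-invariant phase space” without writing it down. The standard massless N-body measure, with the total momentum pinned to the center-of-momentum frame, is:

```latex
% Massless N-particle Lorentz-invariant phase-space measure.
% The delta function fixes the total momentum to P = (sqrt(s), 0) in the CM frame;
% each factor d^3 p_i / (2 E_i) is the invariant measure on the mass shell p_i^2 = 0.
d\Phi_N(P;\, p_1, \dots, p_N)
  = (2\pi)^4\, \delta^4\!\Big(P - \sum_{i=1}^{N} p_i\Big)
    \prod_{i=1}^{N} \frac{d^3 p_i}{(2\pi)^3\, 2E_i},
\qquad p_i^2 = 0, \quad P = \big(\sqrt{s},\, \vec{0}\big).
```

The “uniform distribution on phase space” in the diffusion bullet means uniform with respect to this measure.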
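The summary does not say how the paper draws from that uniform endpoint. A classic way to sample massless N-particle phase space uniformly in the center-of-momentum frame is the RAMBO algorithm (Kleiss, Stirling, and Ellis, 1986); the sketch below uses it purely for illustration and is not the authors' code. The function name `rambo` and the argument `sqrt_s` are our own choices.

```python
import numpy as np

def rambo(n_particles: int, sqrt_s: float, rng=None):
    """Draw one massless n-particle event uniformly on Lorentz-invariant
    phase space in the center-of-momentum frame (RAMBO algorithm).

    Returns an (n_particles, 4) array of rows (E, px, py, pz) with
    p_i^2 = 0 for every particle and sum_i p_i = (sqrt_s, 0, 0, 0).
    """
    rng = rng or np.random.default_rng()

    # Step 1: isotropic massless momenta q_i with energies q0 ~ Gamma(2, 1),
    # generated as -log(r * r') in the original formulation.
    r = rng.random((n_particles, 4))
    cos_t = 2.0 * r[:, 0] - 1.0
    sin_t = np.sqrt(1.0 - cos_t**2)
    phi = 2.0 * np.pi * r[:, 1]
    e = -np.log(r[:, 2] * r[:, 3])
    q = np.stack(
        [e, e * sin_t * np.cos(phi), e * sin_t * np.sin(phi), e * cos_t],
        axis=1,
    )

    # Step 2: boost and rescale so the total 4-momentum becomes (sqrt_s, 0, 0, 0).
    Q = q.sum(axis=0)
    M = np.sqrt(Q[0] ** 2 - Q[1:] @ Q[1:])  # invariant mass of the q-system
    b = -Q[1:] / M                          # boost vector (gamma^2 - |b|^2 = 1)
    gamma = Q[0] / M
    a = 1.0 / (1.0 + gamma)
    x = sqrt_s / M                          # overall rescaling to sqrt_s

    bq = q[:, 1:] @ b                       # b . q_vec, one value per particle
    p0 = x * (gamma * q[:, 0] + bq)
    pvec = x * (q[:, 1:] + np.outer(q[:, 0], b) + np.outer(a * bq, b))
    return np.concatenate([p0[:, None], pvec], axis=1)

# Sanity check: momentum conservation and masslessness hold by construction.
p = rambo(5, sqrt_s=100.0, rng=np.random.default_rng(0))
print(p.sum(axis=0))                                # ~ [100, 0, 0, 0]
print(p[:, 0] ** 2 - (p[:, 1:] ** 2).sum(axis=1))   # ~ 0 per particle
```

Because every draw already satisfies the mass-shell and momentum-conservation constraints exactly, a diffusion process that starts from this endpoint and denoises along the manifold never produces off-shell intermediate samples, which appears to be the motivation for the manifold-preserving construction described above.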