A Unified Approach to Analysis and Design of Denoising Markov Models

arXiv stat.ML / 4/6/2026


Key Points

  • The paper proposes a rigorous, unified mathematical foundation for “denoising Markov models,” framing both forward (from data to a simple distribution) and backward (for efficient reverse sampling) processes under a common Markovian structure.
  • Using connections to nonequilibrium statistical mechanics and the generalized Doob h-transform, it derives a minimal set of assumptions that enable explicit construction of the backward generator and a unified variational objective that directly minimizes the measure-transport discrepancy.
  • It generalizes and adapts score-matching-style objectives across different forward dynamics, unifying multiple existing continuous and discrete diffusion formulations under a single framework.
  • The authors provide a systematic design recipe for denoising Markov models driven by arbitrary Lévy-type processes, including new examples using geometric Brownian motion and jump processes to handle complex distributions.
  • Overall, the work aims to clarify how the choice of forward process affects both algorithm design and theoretical analysis, potentially guiding more flexible generative modeling beyond standard diffusion setups.
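To make the role of a nonstandard forward process concrete, here is a minimal sketch (not from the paper; variable names and parameter values are my own assumptions) of geometric Brownian motion as a forward noising dynamic. The SDE dX_t = μ X_t dt + σ X_t dW_t admits the exact solution X_t = X_0 exp((μ − σ²/2) t + σ W_t), so conditionally on X_0 the log of the state is Gaussian, and the forward process simply adds Gaussian noise in log-space:

```python
import numpy as np

rng = np.random.default_rng(0)

def gbm_forward(x0, t, mu=0.0, sigma=1.0, rng=rng):
    """Exactly sample X_t | X_0 = x0 under geometric Brownian motion,
    using the closed-form solution of the SDE (no time discretization)."""
    w = rng.standard_normal(x0.shape) * np.sqrt(t)
    return x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * w)

# Positive-valued "data": a bimodal log-normal mixture (an illustrative choice).
x0 = np.exp(rng.normal(loc=rng.choice([-1.0, 1.0], size=100_000), scale=0.2))
xt = gbm_forward(x0, t=5.0)

# In log-space the forward process adds N(0, sigma^2 * t) noise, so the
# noised marginal's log-variance is approximately Var(log x0) + t.
print(np.var(np.log(xt)))
```

Here Var(log x0) ≈ 1.04 (the mixture of means ±1 contributes 1, the scale 0.2 contributes 0.04), so the printed value should be near 6.04, illustrating how such a process transports positive-valued data toward a simple log-Gaussian reference.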

Abstract

Probabilistic generative models based on measure transport, such as diffusion and flow-based models, are often formulated in the language of Markovian stochastic dynamics, where the choice of the underlying process impacts both algorithmic design choices and theoretical analysis. In this paper, we aim to establish a rigorous mathematical foundation for denoising Markov models, a broad class of generative models that postulate a forward process transitioning from the target distribution to a simple, easy-to-sample distribution, alongside a backward process particularly constructed to enable efficient sampling in the reverse direction. Leveraging deep connections with nonequilibrium statistical mechanics and generalized Doob's h-transform, we propose a minimal set of assumptions that ensure: (1) explicit construction of the backward generator, (2) a unified variational objective directly minimizing the measure transport discrepancy, and (3) adaptations of the classical score-matching approach across diverse dynamics. Our framework unifies existing formulations of continuous and discrete diffusion models, identifies the most general form of denoising Markov models under certain regularity assumptions on forward generators, and provides a systematic recipe for designing denoising Markov models driven by arbitrary Lévy-type processes. We illustrate the versatility and practical effectiveness of our approach through novel denoising Markov models employing geometric Brownian motion and jump processes as forward dynamics, highlighting the framework's potential flexibility and capability in modeling complex distributions.
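The "adaptation of the classical score-matching approach" mentioned above can be illustrated with the textbook case the paper generalizes from. The sketch below (my own toy setup, not code from the paper) uses an Ornstein-Uhlenbeck forward process, dX_t = −X_t dt + √2 dW_t, whose transition kernel is X_t | X_0 ~ N(X_0 e^{−t}, 1 − e^{−2t}). Denoising score matching regresses a model onto the conditional score of this kernel; for Gaussian data and a linear score model s(x) = a·x, the least-squares fit should recover the true marginal score coefficient −1/Var(X_t):

```python
import numpy as np

rng = np.random.default_rng(1)

# Ornstein-Uhlenbeck forward process at time t:
# X_t | X_0 ~ N(X_0 * exp(-t), 1 - exp(-2t)).
t = 0.5
m, v = np.exp(-t), 1.0 - np.exp(-2.0 * t)

x0 = 2.0 * rng.standard_normal(200_000)       # toy data ~ N(0, 4)
xt = m * x0 + np.sqrt(v) * rng.standard_normal(x0.shape)

# Denoising score matching target: the score of the transition kernel,
# grad_x log p(x_t | x_0) = -(x_t - m * x_0) / v.
target = -(xt - m * x0) / v

# Fit the simplest linear score model s(x) = a * x by least squares.
a = np.dot(xt, target) / np.dot(xt, xt)

# For Gaussian data the true marginal score is -x / Var(X_t),
# with Var(X_t) = 4 * exp(-2t) + (1 - exp(-2t)).
print(a, -1.0 / (4.0 * m**2 + v))
```

The two printed numbers should agree closely (≈ −0.475 here), showing that minimizing the conditional-score regression loss recovers the marginal score; the paper's contribution, per the abstract, is extending this construction beyond Brownian-driven dynamics to general Lévy-type forward processes.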