High-accuracy sampling for diffusion models and log-concave distributions

arXiv stat.ML · April 28, 2026


Key Points

  • The paper introduces new diffusion-model sampling algorithms that achieve δ-accuracy in only polylog(1/δ) steps when given access to sufficiently accurate (≈Õ(δ)-level) score estimates in L^2.
  • The authors claim an exponential improvement over all prior diffusion sampling results; under minimal data assumptions, the complexity bound depends on the data's intrinsic dimension d_*.
  • Under an additional non-uniform L-Lipschitz condition, the sampling complexity further improves to scale with L rather than the intrinsic dimension.
  • The method also provides the first polylog(1/δ)-complexity sampler for general log-concave distributions using only gradient evaluations, extending the impact beyond diffusion models.
  • Overall, the work advances the theoretical efficiency guarantees for generative modeling by linking sampling speed to score-estimation accuracy and structural properties (intrinsic dimension / Lipschitzness).
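The paper's new log-concave sampler is not spelled out in this summary, but the standard gradient-only baseline it improves on — the unadjusted Langevin algorithm (ULA), whose step count scales polynomially rather than polylogarithmically in 1/δ — can be sketched for a toy target. The target density, step size, and chain counts below are illustrative choices, not from the paper:

```python
import numpy as np

# Unadjusted Langevin algorithm (ULA) for a log-concave target exp(-f),
# here f(x) = x^2 / 2, i.e. the standard normal N(0, 1).
# Each iteration uses exactly one gradient evaluation of f.

def grad_f(x):
    return x  # gradient of f(x) = x^2 / 2

def ula(n_chains=10_000, n_steps=2_000, h=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_chains) * 3.0  # deliberately poor initialization
    for _ in range(n_steps):
        # Euler-Maruyama step of the Langevin SDE  dX = -grad_f(X) dt + sqrt(2) dW
        x = x - h * grad_f(x) + np.sqrt(2 * h) * rng.standard_normal(n_chains)
    return x

samples = ula()
print(samples.mean(), samples.std())  # near 0 and 1, up to O(h) discretization bias
```

ULA's discretization bias forces the step size h, and hence the iteration count, to scale polynomially in the target accuracy δ; the claimed contribution here is a gradient-only sampler whose step count instead scales as polylog(1/δ).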

Abstract

We present algorithms for diffusion model sampling which obtain δ-error in polylog(1/δ) steps, given access to Õ(δ)-accurate score estimates in L². This is an exponential improvement over all previous results. Specifically, under minimal data assumptions, the complexity is Õ(d_* · polylog(1/δ)), where d_* is the intrinsic dimension of the data. Further, under a non-uniform L-Lipschitz condition, the complexity reduces to Õ(L · polylog(1/δ)). Our approach also yields the first polylog(1/δ)-complexity sampler for general log-concave distributions using only gradient evaluations.
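The abstract does not specify the polylog(1/δ) algorithm, but the setting it accelerates — reverse-SDE diffusion sampling driven by score estimates — can be illustrated with a toy 1-D example in which the score is available in closed form (an Ornstein-Uhlenbeck forward process and a Gaussian data distribution; all numeric choices below are illustrative, and a trained diffusion model would supply an L²-accurate score estimate instead):

```python
import numpy as np

# Toy reverse-diffusion sampler. Forward noising: OU process dX = -X dt + sqrt(2) dW,
# which carries the data law N(MU, SIGMA^2) toward N(0, 1) as t grows.
# Every time-t marginal is Gaussian, so grad log p_t is exact here.

MU, SIGMA = 2.0, 0.5  # data distribution N(MU, SIGMA^2)

def score(x, t):
    m = MU * np.exp(-t)                                   # mean of p_t
    v = SIGMA**2 * np.exp(-2 * t) + 1 - np.exp(-2 * t)    # variance of p_t
    return -(x - m) / v

def reverse_sample(n=20_000, T=5.0, steps=500, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / steps
    x = rng.standard_normal(n)  # start from the prior N(0, 1), close to p_T
    for k in range(steps):
        t = T - k * dt
        # Reverse-SDE drift (Anderson): f(x) - g^2 * score, with f(x) = -x, g^2 = 2
        drift = -x - 2.0 * score(x, t)
        x = x - dt * drift + np.sqrt(2 * dt) * rng.standard_normal(n)
    return x

samples = reverse_sample()
print(samples.mean(), samples.std())  # near MU = 2.0 and SIGMA = 0.5
```

With an Euler-Maruyama discretization like this, the error decays only polynomially in the step count; the paper's claim is that, given Õ(δ)-accurate scores, a suitably designed sampler reaches δ-error in polylog(1/δ) steps.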