Query Lower Bounds for Diffusion Sampling

arXiv cs.LG / 4/14/2026


Key Points

  • The paper studies theoretical limits on accelerating diffusion model sampling by reducing the number of score-function evaluations needed per generated sample.
  • It proves the first score-query lower bounds for diffusion sampling: for d-dimensional targets with polynomial-accuracy score estimates (ε = d^{-O(1)}), any sampling algorithm needs Ω̃(√d) adaptive score queries.
  • The results imply a structural requirement that samplers must effectively search over Ω̃(√d) distinct noise levels.
  • The authors use these lower bounds to formally explain why multiscale noise schedules are necessary and not merely a heuristic choice in practical diffusion samplers.
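To make the query counting concrete, here is a minimal toy sampler sketch (illustrative only, not the paper's construction): a probability-flow-ODE Euler sampler that queries a score function once per noise level, with a geometric (multiscale) schedule of roughly √d · log d levels to match the Ω̃(√d) scaling. The target here is a standard Gaussian, whose noised score is known in closed form; everything else (schedule endpoints, step rule) is an assumption for illustration.

```python
import numpy as np

def gaussian_score(x, sigma):
    # Closed-form score of N(0, I) convolved with noise level sigma:
    # grad log p_sigma(x) = -x / (1 + sigma^2). Used as a stand-in for a
    # learned score estimate; purely illustrative.
    return -x / (1.0 + sigma**2)

def euler_sampler(d, n_levels, rng):
    """Toy reverse-diffusion sampler (variance-exploding probability-flow ODE,
    Euler steps) that queries the score once per noise level and counts queries."""
    # Multiscale (geometric) noise schedule; endpoints are arbitrary choices.
    sigmas = np.geomspace(10.0, 1e-2, n_levels)
    x = rng.standard_normal(d) * sigmas[0]  # start from pure noise
    queries = 0
    for i in range(n_levels - 1):
        s = gaussian_score(x, sigmas[i])
        queries += 1
        # Euler step of dx/dsigma = -sigma * score(x, sigma)
        x = x + (sigmas[i + 1] - sigmas[i]) * (-sigmas[i] * s)
    return x, queries

d = 100
# ~ sqrt(d) * polylog(d) noise levels, mirroring the Omega-tilde(sqrt(d)) bound
n_levels = int(np.ceil(np.sqrt(d) * np.log(d)))
x, q = euler_sampler(d, n_levels, np.random.default_rng(0))
print(q)  # one score query per transition between adjacent noise levels
```

The point of the sketch is only that the number of score queries equals the number of noise-level transitions, so a schedule with Ω̃(√d) levels incurs Ω̃(√d) queries; the paper's contribution is the converse direction, that no sampler can do with fewer.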

Abstract

Diffusion models generate samples by iteratively querying learned score estimates. A rapidly growing literature focuses on accelerating sampling by minimizing the number of score evaluations, yet the information-theoretic limits of such acceleration remain unclear. In this work, we establish the first score-query lower bounds for diffusion sampling. We prove that for d-dimensional distributions, given access to score estimates with polynomial accuracy ε = d^{-O(1)} (in any L^p sense), any sampling algorithm requires Ω̃(√d) adaptive score queries. In particular, our proof shows that any sampler must search over Ω̃(√d) distinct noise levels, providing a formal explanation for why multiscale noise schedules are necessary in practice.