DRiffusion: Draft-and-Refine Process Parallelizes Diffusion Models with Ease

arXiv cs.LG / March 30, 2026


Key Points

  • The paper introduces DRiffusion, a parallel sampling framework for diffusion models that reduces slow iterative sampling latency for interactive use cases.
  • DRiffusion generates multiple draft states for future timesteps using skip transitions, computes the associated noises in parallel, and then refines outputs with the standard denoising process.
  • Theoretical analysis shows acceleration rates of 1/n (conservative mode) or 2/(n+1) (aggressive mode) using n devices.
  • Experiments report 1.4×–3.7× speedups across multiple diffusion models with minimal quality degradation, with FID/CLIP largely matching and PickScore/HPSv2.1 dropping only slightly.
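The draft-then-parallel-refine loop described in the key points can be sketched as follows. This is a toy, hypothetical illustration: `skip_transition`, `denoise`, the scalar state, and the refinement rule are placeholders I invented, not the paper's implementation, and threads stand in for the n devices.

```python
# Toy sketch of one draft-and-refine iteration (all dynamics are placeholders).
from concurrent.futures import ThreadPoolExecutor

def skip_transition(state, k):
    """Cheap skip transition: jump ahead to a draft state k steps in the future."""
    return state - 0.5 * k  # placeholder dynamics

def denoise(state, t):
    """Stand-in for one standard denoising evaluation at timestep t."""
    return state * 0.9 + 0.1 * t  # placeholder dynamics

def draft_and_refine(state, t, n_devices):
    # 1. Draft: generate draft states for the next n timesteps via skip transitions.
    drafts = [skip_transition(state, k) for k in range(1, n_devices + 1)]
    # 2. Parallel: evaluate the denoiser on each draft concurrently
    #    (one draft per device; a thread pool emulates the devices here).
    timesteps = range(t - 1, t - 1 - n_devices, -1)
    with ThreadPoolExecutor(max_workers=n_devices) as pool:
        outputs = list(pool.map(denoise, drafts, timesteps))
    # 3. Refine: fold the parallel outputs back through a standard sequential update.
    refined = state
    for out in outputs:
        refined = 0.5 * (refined + out)
    return refined
```

The payoff is in step 2: the n denoiser evaluations, normally sequential, run concurrently, while the cheap drafting and refinement stay serial.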

Abstract

Diffusion models have achieved remarkable success in generating high-fidelity content but suffer from slow, iterative sampling, resulting in high latency that limits their use in interactive applications. We introduce DRiffusion, a parallel sampling framework that parallelizes diffusion inference through a draft-and-refine process. DRiffusion employs skip transitions to generate multiple draft states for future timesteps and computes their corresponding noises in parallel, which are then used in the standard denoising process to produce refined results. Theoretically, our method achieves an acceleration rate of 1/n or 2/(n+1), depending on whether the conservative or aggressive mode is used, where n denotes the number of devices. Empirically, DRiffusion attains 1.4×–3.7× speedups across multiple diffusion models while incurring minimal degradation in generation quality: on the MS-COCO dataset, both FID and CLIP remain largely on par with those of the original model, while PickScore and HPSv2.1 show only minor average drops of 0.17 and 0.43, respectively. These results verify that DRiffusion delivers substantial acceleration while preserving perceptual quality.
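The two theoretical rates quoted in the abstract are simple functions of the device count n, and can be computed directly. This hypothetical helper only evaluates the formulas as stated; whether the rate denotes a latency fraction or some other normalized quantity is defined in the paper itself, not here.

```python
def acceleration_rate(n, mode):
    """Acceleration rates as quoted in the abstract for n devices:
    1/n in conservative mode, 2/(n+1) in aggressive mode.
    (Interpretation of the rate is left to the paper's definition.)"""
    if mode == "conservative":
        return 1 / n
    if mode == "aggressive":
        return 2 / (n + 1)
    raise ValueError(f"unknown mode: {mode}")

# e.g. with n = 4 devices: conservative rate 0.25, aggressive rate 0.4
```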