Adaptive Moments are Surprisingly Effective for Plug-and-Play Diffusion Sampling
arXiv cs.LG / 3/18/2026
📰 News · Models & Research
Key Points
- The authors propose using adaptive moment estimation to stabilize noisy likelihood scores during guided diffusion sampling.
- The approach is plug-and-play and simple, yet achieves state-of-the-art results on image restoration and class-conditional generation, outperforming more complex and costly methods.
- Empirical analysis on synthetic and real data shows that mitigating gradient noise via adaptive moments improves how well guided samples align with the target distribution.
- The work suggests broader applicability and potential efficiency gains for diffusion-based sampling pipelines in practical AI tasks.
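The core idea in the key points, smoothing noisy likelihood gradients with Adam-style adaptive moments before using them for guidance, can be sketched as below. This is a minimal illustration of the general technique, not the paper's exact update rule; the function name and hyperparameter defaults are assumptions.

```python
import numpy as np

def adam_smooth_gradients(noisy_grads, beta1=0.9, beta2=0.999, eps=1e-8):
    """Smooth a sequence of noisy guidance gradients with adaptive
    moment estimation: exponential moving averages of the gradient
    (first moment) and its square (second moment), with bias
    correction, as in the Adam optimizer. Hypothetical sketch, not
    the paper's exact method."""
    m = np.zeros_like(noisy_grads[0])  # first-moment EMA (mean estimate)
    v = np.zeros_like(noisy_grads[0])  # second-moment EMA (uncentered variance)
    smoothed = []
    for t, g in enumerate(noisy_grads, start=1):
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
        # Normalized, denoised guidance direction used at this step.
        smoothed.append(m_hat / (np.sqrt(v_hat) + eps))
    return smoothed

# Demo: a constant true gradient corrupted by Gaussian noise.
rng = np.random.default_rng(0)
true_grad = np.ones(4)
grads = [true_grad + 0.5 * rng.standard_normal(4) for _ in range(200)]
out = adam_smooth_gradients(grads)
```

Averaging over past steps trades a small bias (lag) for a large reduction in per-step gradient variance, which is why such a simple plug-in can stabilize guided sampling.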