Normalizing Flows with Iterative Denoising
arXiv cs.CV / 4/23/2026
Key Points
- The paper proposes an iterative variant of TARFlow, called iTARFlow, to further advance Normalizing Flow (NF) generative modeling for images.
- Unlike diffusion models, iTARFlow keeps a fully end-to-end, likelihood-based training objective throughout training.
- For sampling, it combines autoregressive generation with an iterative denoising step inspired by diffusion-style procedures.
- Experiments show iTARFlow achieves competitive results on ImageNet at 64×64, 128×128, and 256×256 resolutions, suggesting it can be a strong alternative to other generative approaches.
- The authors also study the artifacts produced by iTARFlow and provide insights aimed at guiding future improvements, with code released on GitHub.
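The sampling recipe in the third bullet — one autoregressive generation pass followed by a diffusion-inspired iterative denoising stage — can be pictured as a two-stage loop. The sketch below is purely illustrative and not the paper's algorithm: `ar_flow_sample`, `denoise_step`, the toy conditioning, and all constants are hypothetical stand-ins, since the summary does not specify iTARFlow's actual update rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar_flow_sample(n_dims):
    # Hypothetical stand-in for an autoregressive flow's inverse pass:
    # each dimension is drawn conditioned on previously generated ones.
    x = np.zeros(n_dims)
    for i in range(n_dims):
        context = x[:i].sum()  # toy conditioning signal
        x[i] = rng.normal(loc=0.1 * context, scale=1.0)
    return x

def denoise_step(x, strength=0.5):
    # Hypothetical denoising refinement: blend the sample with a
    # smoothed copy of itself, loosely mimicking a diffusion-style update.
    smoothed = np.convolve(x, np.ones(3) / 3, mode="same")
    return (1 - strength) * x + strength * smoothed

def itarflow_style_sample(n_dims=16, n_denoise_iters=4):
    x = ar_flow_sample(n_dims)        # stage 1: autoregressive generation
    for _ in range(n_denoise_iters):  # stage 2: iterative denoising
        x = denoise_step(x)
    return x

sample = itarflow_style_sample()
print(sample.shape)
```

The key structural point this illustrates is that, unlike a diffusion model, the likelihood-based flow does the generation in a single pass; the denoising iterations only refine the result at sampling time.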