Normalizing Flows with Iterative Denoising

arXiv cs.CV / 4/23/2026

📰 News · Models & Research

Key Points

  • The paper proposes an iterative variant of TARFlow, called iTARFlow, to further advance Normalizing Flow (NF) generative modeling for images.
  • Unlike diffusion models, iTARFlow keeps a fully end-to-end, likelihood-based training objective throughout training.
  • For sampling, it combines autoregressive generation with an iterative denoising step inspired by diffusion-style procedures.
  • Experiments show iTARFlow achieves competitive results on ImageNet at 64×64, 128×128, and 256×256 resolution, suggesting it can be a strong alternative to other generative approaches.
  • The authors also study the artifacts produced by iTARFlow and provide insights aimed at guiding future improvements, with code released on GitHub.
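The two-stage sampling procedure described above (autoregressive generation followed by iterative denoising) can be sketched as follows. This is a hypothetical illustration of the general pattern, not the paper's actual implementation: `ar_predict` and `denoise_step` are stand-in functions, and the real models are learned neural networks operating on image tokens.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar_predict(prefix):
    """Stand-in autoregressive model: predicts the next value's
    mean and scale from the tokens generated so far (hypothetical)."""
    mean = 0.5 * (prefix[-1] if prefix else 0.0)
    return mean, 1.0

def denoise_step(x, step_size=0.1):
    """Stand-in for one iterative denoising update; a learned
    denoiser would replace this placeholder shrinkage rule."""
    return x - step_size * x

def sample(num_tokens=16, denoise_iters=5):
    # Stage 1: autoregressive generation, one token at a time.
    tokens = []
    for _ in range(num_tokens):
        mean, scale = ar_predict(tokens)
        tokens.append(mean + scale * rng.standard_normal())
    x = np.array(tokens)
    # Stage 2: iterative refinement of the full sample,
    # in the spirit of diffusion-style denoising.
    for _ in range(denoise_iters):
        x = denoise_step(x)
    return x

x = sample()
print(x.shape)  # (16,)
```

The key structural point is that denoising here is a sampling-time refinement only; training remains a single likelihood objective, unlike diffusion models where the denoising process defines the training loss itself.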

Abstract

Normalizing Flows (NFs) are a classical family of likelihood-based methods that have recently received renewed attention. Recent efforts such as TARFlow have shown that NFs can achieve promising performance on image modeling tasks, making them viable alternatives to other methods such as diffusion models. In this work, we further advance the state of Normalizing Flow generative models by introducing iterative TARFlow (iTARFlow). Unlike diffusion models, iTARFlow maintains a fully end-to-end, likelihood-based objective during training. During sampling, it performs autoregressive generation followed by an iterative denoising procedure inspired by diffusion-style methods. Through extensive experiments, we show that iTARFlow achieves competitive performance across ImageNet resolutions of 64×64, 128×128, and 256×256, demonstrating its potential as a strong generative model and advancing the frontier of Normalizing Flows. In addition, we analyze the characteristic artifacts produced by iTARFlow, offering insights that may guide future improvements. Code is available at https://github.com/apple/ml-itarflow.
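For context, the "fully end-to-end, likelihood-based objective" refers to the standard normalizing-flow training criterion: maximizing the exact log-likelihood given by the change-of-variables formula. This is general NF background, not a formula specific to iTARFlow:

```latex
% For an invertible flow f mapping data x to latent z = f(x),
% with a simple base density p_Z (e.g. a standard Gaussian):
\log p_X(x) = \log p_Z\bigl(f(x)\bigr)
            + \log \left| \det \frac{\partial f(x)}{\partial x} \right|
```

Training maximizes this quantity directly over the data, which is what distinguishes NFs (and iTARFlow) from diffusion models, whose objective is a bound or score-matching surrogate tied to the noising process.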