Project and Generate: Divergence-Free Neural Operators for Incompressible Flows

arXiv cs.LG · March 26, 2026


Key Points

  • The paper argues that common learning-based fluid dynamics models can produce physically invalid, unstable flows because they operate in unconstrained function spaces where incompressibility is not enforced.
  • It proposes a unified framework that enforces the incompressible continuity equation as a hard constraint for both deterministic prediction and generative modeling.
  • For deterministic models, it introduces a differentiable spectral Leray projection based on the Helmholtz-Hodge decomposition to restrict outputs to divergence-free velocity fields.
  • For generative models, it shows that post-hoc projection is not enough when the prior is incompatible, so it constructs a divergence-free Gaussian reference measure using a curl-based pushforward to keep probability flows consistent.
  • Experiments on 2D Navier–Stokes show exact incompressibility up to discretization error and improved stability and physical realism versus prior approaches.

Abstract

Learning-based models for fluid dynamics often operate in unconstrained function spaces, leading to physically inadmissible, unstable simulations. While penalty-based methods offer soft regularization, they provide no structural guarantees, resulting in spurious divergence and long-term collapse. In this work, we introduce a unified framework that enforces the incompressible continuity equation as a hard, intrinsic constraint for both deterministic and generative modeling. First, to project deterministic models onto the divergence-free subspace, we integrate a differentiable spectral Leray projection grounded in the Helmholtz-Hodge decomposition, which restricts the regression hypothesis space to physically admissible velocity fields. Second, to generate physically consistent distributions, we show that simply projecting model outputs is insufficient when the prior is incompatible. To address this, we construct a divergence-free Gaussian reference measure via a curl-based pushforward, ensuring the entire probability flow remains subspace-consistent by construction. Experiments on 2D Navier–Stokes equations demonstrate exact incompressibility up to discretization error and substantially improved stability and physical consistency.