Divergence is Uncertainty: A Closed-Form Posterior Covariance for Flow Matching

arXiv cs.LG · May 5, 2026


Key Points

  • The paper addresses an open challenge in flow matching generative modeling: how to quantify the uncertainty of generated samples efficiently and accurately.
  • It proves an exact, closed-form “divergence-uncertainty identity” showing that the trace of the posterior covariance over clean data equals the divergence of the learned velocity field, up to a known time-dependent factor and an additive constant.
  • A matrix-level version of the identity is also provided, with both forms depending only on the velocity field’s Jacobian, enabling exact computation without retraining or model changes (a code sketch follows this list).
  • For one-step flow-matching generators (e.g., MeanFlow), the identity delivers end-to-end generation uncertainty in a single forward pass, avoiding expensive multi-step variance propagation.
  • Experiments on MNIST demonstrate that the uncertainty maps align with semantically meaningful regions (digit boundaries) and that uncertainty scores correlate with prediction error while using orders of magnitude less compute than ensemble-based or Monte Carlo dropout approaches.
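
To make the scalar identity concrete, here is a minimal sketch under one common set of assumptions: the linear (conditional-OT) path x_t = (1 − t)·x0 + t·x1 with Gaussian source x0 ~ N(0, I). Under that convention, a Tweedie-style calculation yields tr Cov(x1 | x_t) = ((1 − t)³/t)·div v(x_t, t) + d·(1 − t)²/t, with d the data dimension; the paper’s exact prefactor and additive constant may differ depending on its path convention, and the `velocity(x, t)` interface below is hypothetical. The divergence is estimated with Hutchinson probes so the cost stays at a few backward passes even for image-sized inputs.

```python
# Minimal sketch, NOT the paper's exact statement. Assumptions: linear
# (conditional-OT) path x_t = (1 - t) * x0 + t * x1 with source x0 ~ N(0, I),
# under which a Tweedie-style identity gives, for t in (0, 1),
#     tr Cov(x1 | x_t) = ((1 - t)**3 / t) * div_x v(x_t, t) + d * (1 - t)**2 / t
# where d is the data dimension. `velocity(x, t)` is a hypothetical interface
# to a pre-trained flow matching velocity network; t is a scalar float here.
import torch

def hutchinson_divergence(velocity, x, t, n_probes=8):
    """Unbiased Monte Carlo estimate of div_x v(x, t), one value per batch item."""
    x = x.requires_grad_(True)
    div = torch.zeros(x.shape[0], device=x.device)
    for _ in range(n_probes):
        eps = torch.randn_like(x)  # Gaussian probes; Rademacher also works
        v = velocity(x, t)         # velocity output has the same shape as x
        # eps^T (dv/dx) via a vector-Jacobian product, then dot with eps
        (vjp,) = torch.autograd.grad(v, x, grad_outputs=eps)
        div = div + (vjp * eps).flatten(1).sum(dim=1)
    return div / n_probes

def posterior_cov_trace(velocity, x, t):
    """tr Cov(x1 | x_t) per batch item, under the linear-path assumption above."""
    d = x[0].numel()  # data dimension
    div_v = hutchinson_divergence(velocity, x, t)
    return ((1 - t) ** 3 / t) * div_v + d * (1 - t) ** 2 / t
```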

Abstract

Flow matching has become a leading framework for generative modeling, but quantifying the uncertainty of its samples remains an open problem. Existing approaches retrain the model with auxiliary variance heads, maintain costly ensembles, or propagate approximate covariance through many integration steps, trading off training cost, inference cost, or accuracy. We show that none of these trade-offs is necessary. We prove that, for any pre-trained flow matching velocity field, the trace of the posterior covariance over the clean data given the current state equals, in closed form, the divergence of the velocity field, up to a known time-dependent prefactor and an additive constant. We call this the “divergence-uncertainty identity” for flow matching. The matrix-level form of the identity is similarly closed-form, depending solely on the velocity Jacobian. Because the identity is exact and post-hoc, it is computable on any pre-trained flow matching model, with no retraining and no architectural modification. For one-step generators such as MeanFlow, the same identity yields the exact end-to-end generation uncertainty in a single forward pass, eliminating the multi-step variance propagation required by all prior methods. Experiments on MNIST confirm that the resulting per-pixel uncertainty maps are semantically meaningful, concentrating on digit boundaries where inter-sample variation is highest, and that the scalar uncertainty score tracks actual prediction error, all at roughly 10,000× less total compute than ensembling or Monte Carlo dropout.
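
For the matrix-level form, the same linear-path assumptions give Cov(x1 | x_t) = ((1 − t)²/t)·((1 − t)·J_v(x_t, t) + I), with J_v the velocity Jacobian. Below is a minimal sketch using an exact Jacobian, practical for low-dimensional data; the `velocity` interface is again hypothetical.

```python
# Minimal sketch of the matrix-level form under the same linear-path assumption:
#     Cov(x1 | x_t) = ((1 - t)**2 / t) * ((1 - t) * J_v(x_t, t) + I)
# where J_v is the d x d Jacobian of the velocity field. Exact Jacobians are
# affordable for low-dimensional data; `velocity` is a hypothetical pre-trained
# network, here assumed to accept a single flat sample of shape (d,).
import torch
from torch.func import jacrev

def posterior_covariance(velocity, x, t):
    """Full d x d posterior covariance Cov(x1 | x_t) for one flat sample x."""
    jac = jacrev(lambda z: velocity(z, t))(x)  # (d, d) velocity Jacobian
    eye = torch.eye(x.numel(), dtype=x.dtype, device=x.device)
    return ((1 - t) ** 2 / t) * ((1 - t) * jac + eye)
```

Per-pixel uncertainty maps like those reported on MNIST would correspond to the diagonal of such a matrix, and for a one-step generator the Jacobian is evaluated at the single sampling state, which is consistent with the single-forward-pass claim.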