Generative Modeling under Non-Monotonic MAR Missingness via Approximate Wasserstein Gradient Flows
arXiv stat.ML / 4/7/2026
Key Points
- The paper introduces FLOWGEM, a principled iterative generative method to produce complete datasets from MAR-missing data without relying on ad-hoc imputation.
- FLOWGEM is designed to minimize the expected KL divergence between the observed data distribution and the distribution of generated samples across different missingness patterns, drawing motivation from convergence results for ignorable maximum likelihood estimation.
- To achieve this optimization, the method uses a discretized particle evolution based on Wasserstein Gradient Flows, with the velocity field approximated via a local linear estimator of the density ratio.
- Experiments including simulation studies and real-data benchmarks indicate FLOWGEM reaches state-of-the-art performance, notably improving results for non-monotonic MAR mechanisms.
- Overall, the work positions FLOWGEM as a theoretically grounded and practically competitive alternative to existing imputation approaches, bridging theory and empirical performance.
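The particle evolution described above can be illustrated with a toy sketch. This is not the paper's algorithm: it uses a 1-D complete-data setting, Gaussian kernel density estimates in place of FLOWGEM's local linear density-ratio estimator, and a finite-difference gradient, purely to show how discretized Wasserstein-gradient-flow steps move generated particles toward a target distribution by following the (negative) gradient of the log density ratio.

```python
import numpy as np

def kde_log_density(x, samples, h=0.3):
    # Gaussian kernel density estimate of log-density at points x.
    d = (x[:, None] - samples[None, :]) / h
    return np.log(np.exp(-0.5 * d**2).mean(axis=1) + 1e-12)

def wgf_step(particles, data, h=0.3, eps=1e-3, step=0.1):
    # One discretized Wasserstein-gradient-flow step for the KL objective:
    # move each particle along -grad log(q/p), where q is the current
    # particle density and p the data density (both estimated by KDE here,
    # standing in for the paper's local linear density-ratio estimator).
    def log_ratio(x):
        return kde_log_density(x, particles, h) - kde_log_density(x, data, h)
    # Central finite difference approximates the gradient of the log ratio.
    grad = (log_ratio(particles + eps) - log_ratio(particles - eps)) / (2 * eps)
    return particles - step * grad

rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, 500)       # observed samples, mean 1
particles = rng.normal(0.0, 1.0, 500)  # generated samples, initially off-target
for _ in range(200):
    particles = wgf_step(particles, data)
print(f"final particle mean: {particles.mean():.2f}")  # drifts toward the data mean
```

As the particle distribution approaches the data distribution, the log ratio flattens and the velocity field vanishes, so the iteration stabilizes; FLOWGEM applies this idea per missingness pattern rather than to a single complete dataset.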