SubFlow: Sub-mode Conditioned Flow Matching for Diverse One-Step Generation
arXiv cs.LG / 4/15/2026
Key Points
- The paper identifies that existing one-step flow-matching models can suffer severe diversity degradation: MSE-trained class-conditional flows effectively learn a frequency-weighted mean over intra-class sub-modes, averaging out rare but valid variations (a short derivation sketch follows this list).
- It proposes SubFlow (Sub-mode Conditioned Flow Matching), which decomposes each class into fine-grained semantic sub-modes via clustering and conditions the flow on sub-mode indices to avoid this “averaging distortion” (see the code sketch after this list).
- By making each conditioned sub-distribution approximately unimodal, SubFlow targets individual modes more accurately and restores fuller mode coverage even in a single inference step.
- SubFlow is designed to be plug-and-play, integrating into existing one-step frameworks like MeanFlow and Shortcut Models without architectural changes.
- Experiments on ImageNet-256 show improved generation diversity (higher Recall) at competitive image quality (FID) when SubFlow is plugged into these one-step frameworks, supporting the claimed broad compatibility.
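Why MSE training yields this averaging is a standard regression fact: the squared-error-optimal velocity field is a conditional expectation, so when class $c$ is a mixture of sub-modes $k$ with weights $\pi_k$, the learned field blends the sub-mode velocities by their posterior frequency. A sketch of that argument, with notation assumed here rather than taken from the paper:

$$
v^{*}(x, t \mid c) = \mathbb{E}\big[x_1 - x_0 \,\big|\, x_t = x,\, c\big] = \sum_{k} w_k(x, t)\, v^{*}(x, t \mid c, k),
\qquad
w_k(x, t) = \frac{\pi_k\, p_k(x_t = x)}{\sum_{j} \pi_j\, p_j(x_t = x)}.
$$

Sub-modes with small $\pi_k$ barely move the blend, which is exactly the loss of rare-but-valid variations described above; conditioning on $k$ removes the mixture from the regression target entirely.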
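Below is a minimal PyTorch sketch of the recipe the key points describe: cluster each class into sub-modes, then train a standard flow-matching loss conditioned on the joint (class, sub-mode) index. Everything here (`SubFlowNet`, `assign_submodes`, k-means on precomputed features) is an illustrative assumption, not the paper's exact implementation:

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def assign_submodes(features, labels, num_classes, k):
    """Cluster each class's feature vectors into k sub-modes.
    (Assumption: plain k-means on precomputed features; this summary
    does not specify the paper's clustering space.)"""
    sub_ids = torch.zeros(len(labels), dtype=torch.long)
    for c in range(num_classes):
        idx = (labels == c).nonzero(as_tuple=True)[0]
        if len(idx) == 0:
            continue
        km = KMeans(n_clusters=min(k, len(idx)), n_init=10)
        sub_ids[idx] = torch.from_numpy(km.fit_predict(features[idx].numpy())).long()
    return sub_ids

class SubFlowNet(nn.Module):
    """Toy velocity network conditioned on a joint (class, sub-mode) id."""
    def __init__(self, dim, num_classes, k, hidden=256):
        super().__init__()
        self.k = k
        self.cond = nn.Embedding(num_classes * k, hidden)  # one row per (class, sub-mode)
        self.net = nn.Sequential(
            nn.Linear(dim + 1 + hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t, class_id, sub_id):
        emb = self.cond(class_id * self.k + sub_id)
        return self.net(torch.cat([x, t[:, None], emb], dim=-1))

def flow_matching_loss(model, x1, class_id, sub_id):
    """Standard linear-path flow-matching loss; the only change from a
    class-conditional baseline is the extra sub_id conditioning."""
    x0 = torch.randn_like(x1)                      # noise endpoint
    t = torch.rand(x1.size(0))                     # uniform time in [0, 1]
    xt = (1 - t[:, None]) * x0 + t[:, None] * x1   # straight-line interpolant
    target = x1 - x0                               # its constant velocity
    return ((model(xt, t, class_id, sub_id) - target) ** 2).mean()

# Toy usage: 4 classes, 3 sub-modes each, 16-dim samples.
feats = torch.randn(512, 16)
labels = torch.randint(0, 4, (512,))
subs = assign_submodes(feats, labels, num_classes=4, k=3)
loss = flow_matching_loss(SubFlowNet(16, 4, 3), feats, labels, subs)
```

At sampling time, a sub-mode id would be drawn first (e.g., by empirical cluster frequency) and the single generation step conditioned on it, so rare sub-modes are targeted explicitly instead of being averaged away.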