Riemannian MeanFlow for One-Step Generation on Manifolds
arXiv cs.LG / 3/12/2026
Key Points
- We propose Riemannian MeanFlow (RMF), which extends MeanFlow to manifold-valued generation by accommodating velocities in location-dependent tangent spaces.
- RMF defines an average-velocity field via parallel transport and derives a Riemannian MeanFlow identity that links average and instantaneous velocities for intrinsic supervision.
- To enable practical optimization, the RMF objective is decomposed into two terms and trained with conflict-aware multi-task learning to mitigate gradient interference.
- The framework supports conditional generation via classifier-free guidance and achieves competitive one-step sampling on spheres, tori, and SO(3), with an improved quality-efficiency trade-off and reduced sampling cost.
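To make the one-step sampling idea concrete, here is a minimal sketch on the unit sphere S^2: noise is drawn on the manifold, a learned average-velocity field (represented below by a placeholder callable `u_bar`, a hypothetical stand-in for the trained network) is projected onto the tangent space at each point, and a single exponential-map step produces the sample. All function names and the interface of `u_bar(x, t, r)` are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: move from point x along tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_tangent(x, v):
    """Project an ambient vector v onto the tangent space of the sphere at x."""
    return v - np.dot(x, v) * x

def one_step_sample(u_bar, rng, n=8):
    """One-step generation: push manifold noise along the average velocity.

    u_bar(x, t, r) is assumed to approximate the average velocity over the
    time interval [r, t]; a single call with (t, r) = (1, 0) replaces an
    entire ODE integration.
    """
    x0 = rng.normal(size=(n, 3))
    x0 /= np.linalg.norm(x0, axis=1, keepdims=True)  # noise on the sphere
    samples = []
    for x in x0:
        v = project_tangent(x, u_bar(x, 1.0, 0.0))  # tangent average velocity
        samples.append(sphere_exp(x, v))            # one exponential-map step
    return np.array(samples)
```

Because the exponential map always returns a point on the sphere, the output stays on the manifold by construction, which is the property that makes intrinsic one-step sampling attractive compared with Euclidean generation followed by projection.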