Manifold-Optimal Guidance: A Unified Riemannian Control View of Diffusion Guidance
arXiv cs.CV / 3/13/2026
Key Points
- MOG identifies a geometric mismatch in classifier-free guidance that can cause sampling trajectories to drift off the data manifold.
- It reframes diffusion guidance as a local optimal control problem and delivers a closed-form, geometry-aware Riemannian update that corrects off-manifold drift without retraining.
- Auto-MOG introduces a dynamic energy-balancing schedule that adaptively calibrates guidance strength and eliminates the need for manual hyperparameter tuning.
- The approach imposes virtually no additional computational overhead per sampling step.
- Experiments report improved sample fidelity and condition alignment relative to standard guidance baselines.
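The paper's closed-form Riemannian update is not reproduced in this summary. As an illustrative sketch only, the two ideas above can be approximated with common heuristics: splitting the classifier-free guidance residual into components parallel and orthogonal to a crude manifold-normal estimate (here, hypothetically, the unconditional noise prediction), and picking the guidance weight so the guidance term's energy stays at a fixed fraction of the base prediction's energy. All function names and the `target_ratio` parameter are illustrative assumptions, not the paper's API.

```python
import numpy as np

def cfg_direction(eps_uncond, eps_cond, w):
    # Standard classifier-free guidance (CFG) noise prediction:
    # eps_uncond + w * (eps_cond - eps_uncond)
    return eps_uncond + w * (eps_cond - eps_uncond)

def projected_guidance(eps_uncond, eps_cond, w):
    # Hypothetical geometry-aware variant: treat the (normalized)
    # unconditional prediction as a crude off-manifold normal, amplify
    # only the tangential part of the guidance residual, and leave the
    # normal part unscaled so the trajectory is not pushed off-manifold.
    g = eps_cond - eps_uncond
    n = eps_uncond / (np.linalg.norm(eps_uncond) + 1e-12)
    g_par = np.dot(g, n) * n        # component along the normal estimate
    g_tan = g - g_par               # tangential component
    return eps_uncond + g_par + w * g_tan

def energy_balanced_weight(eps_uncond, eps_cond, target_ratio=0.2):
    # Hypothetical Auto-MOG-style schedule: choose w so that the
    # guidance term's norm is target_ratio times the base prediction's,
    # removing the manual guidance-scale hyperparameter.
    g = eps_cond - eps_uncond
    return target_ratio * np.linalg.norm(eps_uncond) / (np.linalg.norm(g) + 1e-12)
```

At `w = 1` the projected variant coincides with plain CFG, since `g_par + g_tan = g`; larger weights then amplify only the tangential component.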