CoGR-MoE: Concept-Guided Expert Routing with Consistent Selection and Flexible Reasoning for Visual Question Answering
arXiv cs.CV / 4/21/2026
Key Points
- The paper introduces CoGR-MoE, a Mixture-of-Experts framework for Visual Question Answering that aims to stabilize expert routing while keeping reasoning flexible.
- During training, CoGR-MoE uses the semantics of the answer options to guide expert selection, addressing the inconsistency caused by unstable routing across similar question types.
- After routing, it reweights selected experts using option features to produce discriminative, option-level representations.
- The method leverages these option-level representations for option comparison and further improves them using contrastive learning, achieving strong results across multiple VQA tasks.
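The pipeline described above (top-k routing, option-conditioned reweighting, option comparison) can be illustrated with a minimal numeric sketch. This is not the paper's implementation: the linear experts, the gate, and the affinity-based reweighting are all simplified stand-ins for the mechanisms the key points name, with made-up dimensions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, n_experts, k, n_options = 16, 8, 2, 4

# Hypothetical fused question-image feature and answer-option embeddings.
h = rng.normal(size=d)
options = rng.normal(size=(n_options, d))

# Toy expert parameters: each expert is a simple linear map (illustrative only).
W_experts = rng.normal(size=(n_experts, d, d))
W_gate = rng.normal(size=(n_experts, d))

# 1) Routing: gate on the input and keep the top-k experts.
gate_logits = W_gate @ h                      # (n_experts,)
topk = np.argsort(gate_logits)[-k:]           # indices of selected experts
gate = softmax(gate_logits[topk])             # renormalized top-k weights

# 2) Option-conditioned reweighting: scale each selected expert's
#    contribution by its affinity with each option embedding, yielding
#    one discriminative representation per answer option.
expert_out = np.stack([W_experts[e] @ h for e in topk])   # (k, d)
affinity = softmax(options @ expert_out.T, axis=-1)       # (n_options, k)
option_repr = (affinity * gate) @ expert_out              # (n_options, d)

# 3) Option comparison: score each option against its own representation.
scores = np.einsum("od,od->o", option_repr, options)      # (n_options,)
print("predicted option:", int(scores.argmax()))
```

In the paper these option-level representations are additionally refined with a contrastive objective during training; the sketch stops at the scoring step.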