Region-Graph Optimal Transport Routing for Mixture-of-Experts Whole-Slide Image Classification
arXiv cs.CV / 4/9/2026
Key Points
- The paper introduces ROAM, a spatially aware Mixture-of-Experts (MoE) aggregator for Multiple Instance Learning (MIL) in gigapixel whole-slide image (WSI) classification, aiming to better handle pathological heterogeneity.
- ROAM routes region (spatial) tokens to expert subnetworks using capacity-constrained entropic optimal transport (via Sinkhorn), enforcing balanced expert utilization without relying on extra load-balancing loss terms.
- It further incorporates graph-regularised Sinkhorn iterations that diffuse routing assignments across a spatial region graph, encouraging neighbouring regions to route to the same experts coherently.
- Experiments on four WSI benchmarks (using frozen foundation-model patch embeddings) show ROAM is competitive with strong MIL and MoE baselines, and it reports external NSCLC generalisation AUC of 0.845 ± 0.019 on TCGA-CPTAC.
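The capacity-constrained routing described above can be illustrated with a minimal Sinkhorn sketch. This is not the paper's implementation: the function name `sinkhorn_route`, the entropic temperature, and the equal-capacity marginals are illustrative assumptions; the graph-regularisation step is only indicated in a comment.

```python
import numpy as np

def sinkhorn_route(scores, n_iters=50, eps=0.1):
    """Balanced token-to-expert assignment via entropic optimal transport.

    scores: (T, E) router affinities for T region tokens and E experts.
    Returns a (T, E) transport plan whose columns each sum to 1/E,
    i.e. every expert receives an equal share of routing mass
    (an assumed equal-capacity setting, not taken from the paper).
    """
    T, E = scores.shape
    K = np.exp(scores / eps)          # Gibbs kernel from affinities
    r = np.full(T, 1.0 / T)           # each token carries equal mass
    c = np.full(E, 1.0 / E)           # equal expert capacity constraint
    u = np.ones(T)
    for _ in range(n_iters):          # alternating marginal scalings
        v = c / (K.T @ u)
        u = r / (K @ v)
    v = c / (K.T @ u)
    # A graph-regularised variant could smooth the plan between
    # iterations, e.g. P <- A_norm @ P with a normalised region
    # adjacency A_norm (hypothetical placement of the diffusion step).
    return (u[:, None] * K) * v[None, :]

rng = np.random.default_rng(0)
P = sinkhorn_route(rng.normal(size=(8, 4)))
```

Because the column scaling is applied last, the expert marginals are satisfied exactly: each of the four experts ends up with 1/4 of the total routing mass, which is how balanced utilisation falls out of the transport formulation rather than from an auxiliary load-balancing loss.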