IMA-MoE: An Interpretable Modality-Aware Mixture-of-Experts Framework for Characterizing the Neurobiological Signatures of Binge Eating Disorder
arXiv cs.CV · April 21, 2026
Key Points
- The paper introduces IMA-MoE, an interpretable modality-aware mixture-of-experts framework aimed at characterizing binge eating disorder (BED) using biological mechanisms rather than symptom-only criteria.
- IMA-MoE integrates heterogeneous multimodal data—neuroimaging, behavioral, hormonal, and demographic measures—by representing each measure as a token to model cross-modal dependencies while maintaining modality-specific information.
- To improve transparency, the method adds a token-importance mechanism that quantifies how much each measure contributes to the model’s BED vs. healthy-control predictions.
- On the large-scale ABCD dataset, IMA-MoE outperforms baseline approaches in distinguishing BED from healthy controls and uncovers sex-specific patterns, with hormonal measures playing a larger predictive role for females.
- Overall, the study suggests that interpretable, data-driven multimodal modeling could support more biologically informed and potentially more personalized interventions for BED and related neuropsychiatric conditions.
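The modality-aware token design and importance mechanism described above can be sketched in a few lines. The following is an illustrative toy implementation, not the paper's actual architecture: measure names, dimensions, and the linear experts are all assumptions, and the weights are random rather than learned. Each measure becomes one token, a soft gate routes tokens across modality experts, and a learned importance score weights tokens before the BED-vs.-control prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative measures and modality assignments (hypothetical, not from the paper).
MEASURES = ["fmri_roi_1", "fmri_roi_2", "impulsivity", "estradiol", "age"]
D, N_EXPERTS = 8, 4  # token dimension; one expert per modality

W_embed = rng.normal(size=(len(MEASURES), D))    # per-measure token projection
W_expert = rng.normal(size=(N_EXPERTS, D, D))    # one linear "expert" per modality
w_gate = rng.normal(size=(D, N_EXPERTS))         # soft router over experts
w_imp = rng.normal(size=D)                       # token-importance scorer
w_out = rng.normal(size=D)                       # BED vs. healthy-control head

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def forward(x):
    """x: raw measure values, shape (len(MEASURES),). Returns (p_bed, importance)."""
    tokens = x[:, None] * W_embed                           # (T, D): one token per measure
    gates = softmax(tokens @ w_gate, axis=-1)               # (T, E): soft modality routing
    expert_out = np.einsum("td,edk->tek", tokens, W_expert) # each expert's view of each token
    mixed = np.einsum("te,tek->tk", gates, expert_out)      # gated mixture-of-experts output
    imp = softmax(mixed @ w_imp)                            # token-importance weights (sum to 1)
    pooled = imp @ mixed                                    # importance-weighted pooling
    p_bed = 1.0 / (1.0 + np.exp(-pooled @ w_out))           # sigmoid classification
    return p_bed, imp

p, imp = forward(rng.normal(size=len(MEASURES)))
```

Because the importance weights form a distribution over the input measures, they directly support the kind of interpretation reported in the paper, e.g. comparing the average weight assigned to hormonal tokens across male and female subgroups.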