SpecMoE: Spectral Mixture-of-Experts Foundation Model for Cross-Species EEG Decoding
arXiv cs.LG · March 18, 2026
📰 News · Signals & Early Trends · Models & Research
Key Points
- SpecMoE introduces a foundation model for cross-species EEG decoding using a Gaussian-smoothed masking scheme on short-time Fourier transform maps to jointly mask time, frequency, and time–frequency domains.
- The architecture, SpecHi-Net, is a U-shaped hierarchical encoder–decoder; three independent expert models are trained on partitioned data and fused via SpecMoE's learned spectral gating mechanism.
- The model achieves state-of-the-art performance across EEG tasks such as sleep staging, emotion recognition, motor imagery classification, abnormal signal detection, and drug effect prediction, with strong cross-species generalization to murine EEG data.
- By mitigating bias toward high-frequency oscillations and enabling robust cross-subject and cross-species decoding, the work has implications for neuroscience research and neurotechnology development.
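The summary does not spell out how the Gaussian-smoothed masking is implemented; the sketch below is an illustrative assumption of what jointly masking time, frequency, and time–frequency regions of an STFT map could look like (the `smoothed_mask` helper, mask ratios, and smoothing parameters are hypothetical, not taken from the paper):

```python
import numpy as np
from scipy.signal import stft
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Toy EEG-like signal: 4 seconds sampled at 256 Hz.
fs = 256
x = rng.standard_normal(4 * fs)

# Short-time Fourier transform map, shape (frequency bins, time frames).
f, t, Z = stft(x, fs=fs, nperseg=128)
S = np.abs(Z)

def smoothed_mask(shape, ratio=0.5, sigma=2.0, rng=rng):
    """Random binary mask blurred with a Gaussian kernel, then re-thresholded,
    so masked regions form smooth contiguous patches rather than
    salt-and-pepper noise (hypothetical stand-in for the paper's scheme)."""
    hard = (rng.random(shape) < ratio).astype(float)
    soft = gaussian_filter(hard, sigma=sigma)
    return soft > soft.mean()

# Joint masking over the three domains named in the abstract:
freq_mask = smoothed_mask((S.shape[0], 1))  # frequency rows, broadcast over time
time_mask = smoothed_mask((1, S.shape[1]))  # time columns, broadcast over frequency
tf_mask = smoothed_mask(S.shape)            # local time-frequency patches

masked = S.copy()
masked[np.broadcast_to(freq_mask, S.shape)] = 0.0
masked[np.broadcast_to(time_mask, S.shape)] = 0.0
masked[tf_mask] = 0.0
```

Smoothing the random mask before thresholding is one plausible way to produce contiguous masked regions, which would force a reconstruction objective to rely on broader spectral context instead of interpolating isolated bins.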