SpecMoE: Spectral Mixture-of-Experts Foundation Model for Cross-Species EEG Decoding
arXiv cs.LG · March 18, 2026
Key Points
- SpecMoE introduces a foundation model for cross-species EEG decoding using a Gaussian-smoothed masking scheme on short-time Fourier transform maps to jointly mask time, frequency, and time–frequency domains.
- The architecture, SpecHi-Net, is a U-shaped hierarchical encoder–decoder; SpecMoE trains three independent expert models on partitioned data and fuses their outputs with a learned spectral gating mechanism.
- The model achieves state-of-the-art performance across EEG tasks such as sleep staging, emotion recognition, motor imagery classification, abnormal signal detection, and drug effect prediction, with strong cross-species generalization to murine EEG data.
- By mitigating bias toward high-frequency oscillations and enabling robust cross-subject and cross-species decoding, the work has implications for neuroscience research and neurotechnology development.
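To make the masking idea above concrete, here is a minimal NumPy sketch of a Gaussian-smoothed masking scheme over an STFT magnitude map. This is an illustrative reconstruction, not the paper's implementation: the function names, masking ratios, and smoothing width are all assumptions. It zeroes random time columns, frequency rows, and joint time–frequency cells, then blurs the binary mask so masked regions have soft edges.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    # normalized 1-D Gaussian kernel of length 2*radius + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth2d(mask, sigma):
    # separable Gaussian blur: 1-D convolution along rows, then columns
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, mask)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out

def gaussian_smoothed_mask(n_freq, n_time, p_time=0.1, p_freq=0.1,
                           p_tf=0.05, sigma=1.5, seed=None):
    """Soft mask over an STFT map (hypothetical ratios): whole time
    columns, whole frequency rows, and random T-F cells are zeroed,
    then the binary mask is Gaussian-blurred and clipped to [0, 1]."""
    rng = np.random.default_rng(seed)
    mask = np.ones((n_freq, n_time))
    mask[:, rng.random(n_time) < p_time] = 0.0          # time-domain masking
    mask[rng.random(n_freq) < p_freq, :] = 0.0          # frequency-domain masking
    mask[rng.random((n_freq, n_time)) < p_tf] = 0.0     # joint time-frequency masking
    return np.clip(smooth2d(mask, sigma), 0.0, 1.0)

# stand-in |STFT| map: 64 frequency bins x 128 time frames
spec = np.abs(np.random.default_rng(0).normal(size=(64, 128)))
mask = gaussian_smoothed_mask(64, 128, seed=0)
masked_spec = spec * mask
```

The soft edges are the point of the Gaussian smoothing: a hard binary mask creates sharp spectral discontinuities, whereas the blurred mask attenuates regions gradually across time and frequency.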
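The spectral gating fusion can likewise be sketched in a few lines. This is a toy stand-in under stated assumptions: the paper's gate is learned end-to-end, while here a fixed linear map over hypothetical per-band powers produces softmax weights that combine three experts' logits.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def band_powers(spec, bands):
    # mean power in each frequency band (spectrogram rows = frequency bins)
    return np.array([spec[lo:hi].mean() for lo, hi in bands])

def spectral_gate_fuse(expert_logits, spec, W, b, bands):
    """Weight each expert by a gate computed from the input's spectral
    profile, then combine the experts' logits with those weights."""
    feats = band_powers(spec, bands)
    gate = softmax(W @ feats + b)                   # (n_experts,) weights, sum to 1
    fused = np.tensordot(gate, expert_logits, 1)    # (n_classes,) fused logits
    return fused, gate

# toy setup: 3 experts, 5 classes, 64-bin spectrogram, 4 coarse bands
rng = np.random.default_rng(1)
bands = [(0, 8), (8, 16), (16, 32), (32, 64)]
spec = np.abs(rng.normal(size=(64, 128))) ** 2
expert_logits = rng.normal(size=(3, 5))
W, b = rng.normal(size=(3, len(bands))), np.zeros(3)

fused, gate = spectral_gate_fuse(expert_logits, spec, W, b, bands)
```

Conditioning the gate on band powers rather than raw features is one plausible way a "spectral" gate could down-weight experts biased toward high-frequency oscillations on inputs dominated by low-frequency activity.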