WaveMoE: A Wavelet-Enhanced Mixture-of-Experts Foundation Model for Time Series Forecasting
arXiv cs.LG / 4/14/2026
Key Points
- The paper introduces WaveMoE, a time-series forecasting foundation model that enhances standard token modeling by incorporating explicit frequency-domain information using wavelets.
- WaveMoE uses a dual-path architecture that processes both time-series tokens and wavelet tokens aligned on the same temporal axis, linking them through a shared expert routing mechanism.
- The shared routing enables consistent expert specialization across the two representations while allowing efficient scaling of model capacity via a mixture-of-experts design.
- Preliminary experiments on 16 diverse benchmarks suggest that WaveMoE improves forecasting performance, particularly by using wavelet-domain representations to better capture periodicity and localized high-frequency dynamics.
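To make the dual-path idea concrete, the sketch below is a minimal, hypothetical illustration of the architecture described above: a series is patched into time-domain tokens, a Haar transform produces wavelet tokens aligned on the same temporal axis, and a single router gates both the time-path and wavelet-path experts. The function names, patch size, Haar choice, and top-level shapes are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_tokens(x, patch=4):
    """Split a 1-D series into patches. Each patch yields a time token
    (the raw values) and a wavelet token (Haar approximation + detail
    coefficients), so both streams share one token per temporal position.
    Illustrative assumption: the paper's tokenizer may differ."""
    n = len(x) // patch
    patches = x[: n * patch].reshape(n, patch)
    approx = (patches[:, 0::2] + patches[:, 1::2]) / np.sqrt(2)  # low-pass
    detail = (patches[:, 0::2] - patches[:, 1::2]) / np.sqrt(2)  # high-pass
    return patches, np.concatenate([approx, detail], axis=1)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def shared_moe(time_tok, wave_tok, n_experts=4):
    """One router scores each temporal position; the time-domain and
    wavelet-domain experts reuse the same gating weights, which is one
    plausible reading of 'shared expert routing' in the summary."""
    d = time_tok.shape[1]
    W_router = rng.normal(size=(d, n_experts)) / np.sqrt(d)
    gates = softmax(time_tok @ W_router)  # (n_tokens, n_experts), rows sum to 1
    E_time = rng.normal(size=(n_experts, d, d)) / np.sqrt(d)  # time-path experts
    E_wave = rng.normal(size=(n_experts, d, d)) / np.sqrt(d)  # wavelet-path experts
    out_t = np.einsum("ne,edk,nd->nk", gates, E_time, time_tok)
    out_w = np.einsum("ne,edk,nd->nk", gates, E_wave, wave_tok)
    return out_t, out_w, gates

# Toy series: a noisy sinusoid, 64 samples -> 16 aligned token pairs.
x = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.normal(size=64)
t_tok, w_tok = haar_tokens(x, patch=4)
out_t, out_w, gates = shared_moe(t_tok, w_tok)
print(t_tok.shape, w_tok.shape, gates.shape)  # (16, 4) (16, 4) (16, 4)
```

Because the gating tensor is computed once and applied to both expert banks, an expert that specializes in, say, high-frequency patches does so consistently in both domains, while capacity scales by adding experts rather than widening every layer.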