Mixture of Sequence: Theme-Aware Mixture-of-Experts for Long-Sequence Recommendation
arXiv cs.AI / 4/25/2026
Key Points
- Sequential recommendation has improved click-through rate (CTR) prediction, but long sequences remain difficult because users' interests shift over time, introducing irrelevant or misleading signals.
- The paper analyzes long-session behavior and identifies a pattern called “session hopping,” where interests are stable within sessions but can change drastically across sessions and sometimes reappear later.
- It proposes Mixture of Sequence (MoS), a model-agnostic mixture-of-experts framework that uses theme-aware routing to segment user history into theme-consistent subsequences and filter out misleading information.
- MoS also adds a multi-scale fusion mechanism with three expert types to capture global trends, short-term behaviors, and theme-specific semantic patterns, improving accuracy while reducing computational cost.
- Experiments on recommendation tasks show MoS reaches state-of-the-art performance with fewer FLOPs than other MoE methods, and the code is released on GitHub.
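The theme-aware routing and multi-scale fusion described above can be sketched in a toy form. This is a minimal illustration, not the paper's implementation: here "experts" are plain mean-pools over different slices of the history, theme assignment is nearest-centroid cosine similarity, and the fusion is a simple average (the actual MoS experts and fusion weights are learned modules). All function names and the `short_window` parameter are hypothetical.

```python
import numpy as np

def theme_aware_route(item_embs, theme_embs):
    """Assign each history item to its nearest theme centroid
    (hypothetical stand-in for the paper's learned router)."""
    a = item_embs / np.linalg.norm(item_embs, axis=1, keepdims=True)
    b = theme_embs / np.linalg.norm(theme_embs, axis=1, keepdims=True)
    sims = a @ b.T                # (num_items, num_themes) cosine similarities
    return sims.argmax(axis=1)    # theme id per item

def mixture_of_sequence(item_embs, theme_embs, short_window=3):
    """Toy MoS-style fusion of three expert views of a user's history:
    a global expert (long-range trend), a short-term expert (recent
    behavior), and a theme expert (items sharing the current theme)."""
    themes = theme_aware_route(item_embs, theme_embs)
    global_expert = item_embs.mean(axis=0)                  # whole history
    short_expert = item_embs[-short_window:].mean(axis=0)   # recent items
    current_theme = themes[-1]                              # most recent item's theme
    theme_expert = item_embs[themes == current_theme].mean(axis=0)
    # Uniform averaging; the paper learns the fusion instead.
    return (global_expert + short_expert + theme_expert) / 3.0
```

Restricting the theme expert to the subsequence matching the current theme is what filters out "session hopping" noise in this sketch: items from unrelated past sessions never reach that expert's pool.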