AdaMamba: Adaptive Frequency-Gated Mamba for Long-Term Time Series Forecasting

arXiv cs.AI / 4/28/2026

📰 News · Models & Research

Key Points

  • The paper presents AdaMamba, a new framework for long-term time series forecasting that combines frequency-domain analysis with Mamba state-space modeling.
  • It addresses real-world cross-domain heterogeneity by learning input-dependent frequency bases and integrating adaptive frequency gating directly into the Mamba update process.
  • AdaMamba uses an interactive patch encoding module to capture inter-variable interaction dynamics (see the sketch after this list) and introduces a unified time-frequency forgetting gate to calibrate state transitions.
  • Experiments on seven public LTSF benchmarks and two domain-specific datasets show consistent gains over existing state-of-the-art methods while keeping computational efficiency competitive.
  • The authors provide an open-source implementation via the referenced GitHub repository, enabling replication and further experimentation.
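
The patch encoding module is only named above. As a concrete starting point, here is a minimal PyTorch sketch of what an "interactive patch encoder" could look like, assuming patch-wise tokenization of each variable followed by attention across the variable axis. All class names, shapes, and hyperparameters here are illustrative assumptions, not the paper's implementation; see the linked repository for the authors' code.

```python
import torch
import torch.nn as nn

class InteractivePatchEncoder(nn.Module):
    """Sketch: tokenize each variable's series into patches, embed them,
    and mix information across variables with attention at each patch step."""

    def __init__(self, patch_len=16, d_model=64, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        self.var_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, n_vars, seq_len); seq_len assumed divisible by patch_len
        b, v, t = x.shape
        patches = x.unfold(-1, self.patch_len, self.patch_len)   # (b, v, n_patches, patch_len)
        tokens = self.embed(patches)                             # (b, v, n_patches, d_model)
        n_p = tokens.shape[2]
        # attend across the variable axis at each patch position
        q = tokens.permute(0, 2, 1, 3).reshape(b * n_p, v, -1)   # (b*n_patches, n_vars, d_model)
        mixed, _ = self.var_attn(q, q, q)
        return mixed.reshape(b, n_p, v, -1).permute(0, 2, 1, 3)  # (b, n_vars, n_patches, d_model)

# Usage: 8 series, 7 variables, length 96 -> tokens of shape (8, 7, 6, 64)
enc = InteractivePatchEncoder()
print(enc(torch.randn(8, 7, 96)).shape)
```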

Abstract

Accurate long-term time series forecasting (LTSF) requires capturing complex long-range dependencies and dynamic periodic patterns. Recent advances in frequency-domain analysis offer a global perspective for uncovering temporal characteristics. However, real-world time series often exhibit pronounced cross-domain heterogeneity, where variables that appear synchronized in the time domain can differ substantially in the frequency domain. Existing frequency-based LTSF methods often rely on implicit assumptions of cross-domain homogeneity, which limits their ability to adapt to such intricate variability. To effectively integrate frequency-domain analysis with temporal dependency learning, we propose AdaMamba, a novel framework that endogenizes adaptive, context-aware frequency analysis within the Mamba state-space update process. Specifically, AdaMamba introduces an interactive patch encoding module to capture inter-variable interaction dynamics. We then develop an adaptive frequency-gated state-space module that generates input-dependent frequency bases and generalizes the conventional temporal forgetting gate into a unified time-frequency forgetting gate. This allows dynamic calibration of state transitions based on learned frequency-domain importance while preserving Mamba's capability in modeling long-range dependencies. Extensive experiments on seven public LTSF benchmarks and two domain-specific datasets demonstrate that AdaMamba consistently outperforms state-of-the-art methods in forecasting accuracy while maintaining competitive computational efficiency. The code of AdaMamba is available at https://github.com/XDjiang25/AdaMamba.
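
To make the "unified time-frequency forgetting gate" idea concrete, here is a hedged PyTorch sketch: a per-step decay gate that combines a Mamba-style input-dependent temporal score with a learned score over the window's frequency spectrum, then drives a toy diagonal state-space recurrence. Everything here (the names, the magnitude pooling, the additive combination of scores) is an assumption made for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

class TimeFreqForgetGate(nn.Module):
    """Sketch of a unified time-frequency forgetting gate: blends a
    per-step temporal score (as in Mamba's selective discretization)
    with a learned importance score over the window's spectrum."""

    def __init__(self, d_model=64):
        super().__init__()
        self.time_proj = nn.Linear(d_model, d_model)  # input-dependent temporal score
        self.freq_proj = nn.Linear(d_model, d_model)  # frequency-importance score

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        spec = torch.fft.rfft(x, dim=1)                 # global frequency view of the window
        freq_feat = spec.abs().mean(dim=1)              # pooled magnitudes: (batch, d_model)
        freq_score = self.freq_proj(freq_feat).unsqueeze(1)
        time_score = self.time_proj(x)                  # per-step score: (batch, seq_len, d_model)
        return torch.sigmoid(time_score + freq_score)   # gate in (0, 1), same shape as x

def gated_scan(x, gate, B=1.0):
    """Toy diagonal recurrence: h_t = gate_t * h_{t-1} + B * x_t."""
    h = torch.zeros_like(x[:, 0])
    out = []
    for t in range(x.shape[1]):
        h = gate[:, t] * h + B * x[:, t]
        out.append(h)
    return torch.stack(out, dim=1)

# Usage: gate and scan a batch of 4 windows of length 96 with 64 channels
gate = TimeFreqForgetGate(64)
x = torch.randn(4, 96, 64)
h = gated_scan(x, gate(x))  # (4, 96, 64)
```

In a real Mamba block this gate would modulate the discretized state transition inside a hardware-efficient selective scan; the explicit Python loop above is used only to keep the recurrence readable.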