Baguan-TS: A Sequence-Native In-Context Learning Model for Time Series Forecasting with Covariates
arXiv cs.LG / 3/19/2026
Key Points
- Baguan-TS introduces a sequence-native in-context learning framework for time series forecasting, built on a 3D Transformer that jointly attends over the temporal, variable, and context axes.
- It improves calibration and training stability through a retrieval-based local calibration strategy, and mitigates output oversmoothing with a deliberate context-overfitting approach.
- On public benchmarks with covariates, Baguan-TS consistently outperforms baselines, achieving higher win rates and significant improvements in both point and probabilistic forecasting.
- Real-world energy datasets demonstrate robustness and substantial forecasting gains, indicating practical potential in energy forecasting and other covariate-rich domains.
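The core architectural idea above, attention applied jointly across the temporal, variable, and context axes, can be illustrated with a minimal axis-factorized attention sketch. This is an illustrative assumption, not the paper's implementation: the tensor layout `(context, variable, time, dim)`, the sequential per-axis factorization, and the single-head attention with tied query/key/value are all hypothetical choices made for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def axis_attention(x, axis):
    """Single-head dot-product attention along one axis of x (last dim = features)."""
    x = np.moveaxis(x, axis, -2)                         # bring attended axis next to features
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(x.shape[-1])
    out = softmax(scores, axis=-1) @ x                   # weighted mix along that axis
    return np.moveaxis(out, -2, axis)                    # restore original layout

rng = np.random.default_rng(0)
C, V, T, d = 4, 3, 8, 16          # hypothetical: contexts, variables, time steps, model dim
x = rng.standard_normal((C, V, T, d))

# "3D" attention sketch: attend along time (axis 2), then variable (axis 1),
# then context (axis 0), with residual connections.
for ax in (2, 1, 0):
    x = x + axis_attention(x, ax)

print(x.shape)  # → (4, 3, 8, 16)
```

A full model would use learned query/key/value projections and multiple heads per axis; factorizing attention per axis keeps the cost linear in the number of axes rather than quadratic in the full flattened sequence length.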