Baguan-TS: A Sequence-Native In-Context Learning Model for Time Series Forecasting with Covariates
arXiv cs.LG / 3/19/2026
Key Points
- Baguan-TS introduces a sequence-native in-context learning framework for time series forecasting, built on a 3D Transformer that jointly attends over the temporal, variable, and context axes.
- It tackles calibration and training stability with a retrieval-based local calibration strategy and mitigates output oversmoothing via a context-overfitting approach.
- On public benchmarks with covariates, Baguan-TS consistently outperforms baselines, achieving higher win rates and significant improvements in both point and probabilistic forecasting.
- Real-world energy datasets demonstrate robustness and substantial forecasting gains, indicating practical potential in energy forecasting and other covariate-rich domains.
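The summary above does not specify how attention over three axes is implemented. A common way to keep such attention tractable is to factorize it, attending along one axis at a time with residual connections. The sketch below illustrates that idea on a (context, time, variable, dim) tensor; the function names (`attend_3d`, `axis_attention`) and the factorized design are illustrative assumptions, not Baguan-TS's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def axis_attention(x, axis):
    """Single-head self-attention along one axis of a 4D tensor
    shaped (context, time, variable, dim). The target axis is moved
    next to the feature dim, attended over, then moved back."""
    xt = np.moveaxis(x, axis, -2)                    # (..., L, d)
    d = xt.shape[-1]
    scores = xt @ xt.swapaxes(-1, -2) / np.sqrt(d)   # (..., L, L)
    out = softmax(scores, axis=-1) @ xt              # (..., L, d)
    return np.moveaxis(out, -2, axis)

def attend_3d(x):
    """Hypothetical factorized 3D attention: attend over the context
    axis (0), then time (1), then variable (2), with residuals."""
    for axis in (0, 1, 2):
        x = x + axis_attention(x, axis)
    return x

rng = np.random.default_rng(0)
# 4 in-context examples, 16 time steps, 3 covariates/variables, dim 8.
x = rng.normal(size=(4, 16, 3, 8))
y = attend_3d(x)
print(y.shape)  # (4, 16, 3, 8)
```

Factorizing per axis reduces the attention cost from quadratic in the full 3D grid size to quadratic in each axis length separately, which is why this pattern is common in multi-axis Transformers.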