EnTransformer: A Deep Generative Transformer for Multivariate Probabilistic Forecasting
arXiv cs.LG · March 13, 2026
📰 News · Models & Research
Key Points
- EnTransformer introduces a deep generative forecasting framework that combines engression with Transformer-based sequence modeling to learn conditional distributions without restrictive parametric assumptions.
- The approach injects stochastic noise into model representations and optimizes an energy-based scoring objective to directly learn the conditional predictive distribution for multivariate time series.
- It enables generating coherent multivariate forecast trajectories while preserving Transformers' ability to model long-range temporal dependencies and cross-series interactions.
- The authors evaluate EnTransformer on benchmark datasets including Electricity, Traffic, Solar, Taxi, KDD-Cup, and Wikipedia, showing well-calibrated probabilistic forecasts that outperform baseline models.
- This work advances reliable uncertainty quantification in multivariate probabilistic forecasting for domains such as energy systems and transportation networks.
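The core training idea described above — drawing several noise-perturbed forecast samples and scoring them against the observation with an energy-based objective — can be sketched in a minimal, self-contained form. This is an illustrative stand-in, not the authors' implementation: the `sample_forecasts` sampler is a hypothetical placeholder for the noise-injected Transformer, and only the empirical energy score follows the standard definition.

```python
import numpy as np

def energy_score(samples, y):
    """Empirical energy score for one observation y given m forecast draws.
    ES = mean_i ||x_i - y|| - (1 / (2 m^2)) * sum_{i,j} ||x_i - x_j||.
    It is a proper scoring rule: lower is better, zero when all draws equal y.
    samples: (m, d) array of model draws; y: (d,) observed vector."""
    m = samples.shape[0]
    term1 = np.mean(np.linalg.norm(samples - y, axis=1))
    diffs = samples[:, None, :] - samples[None, :, :]          # (m, m, d)
    term2 = np.linalg.norm(diffs, axis=-1).sum() / (2 * m * m)
    return term1 - term2

# Hypothetical engression-style sampler: a deterministic point prediction
# (standing in for the Transformer readout) plus injected Gaussian noise,
# so repeated calls yield a cloud of forecast trajectories.
rng = np.random.default_rng(0)

def sample_forecasts(point_pred, m=100, noise_scale=0.1):
    eps = rng.normal(scale=noise_scale, size=(m, point_pred.shape[0]))
    return point_pred + eps

y = np.array([1.0, 2.0, 3.0])          # toy multivariate observation
draws = sample_forecasts(y, m=100)     # noisy samples centered on y
loss = energy_score(draws, y)          # small positive value
```

In training, this scalar would be averaged over the batch and minimized by gradient descent, which pushes the sample distribution toward the true conditional distribution without assuming any parametric family for it.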