EnTransformer: A Deep Generative Transformer for Multivariate Probabilistic Forecasting
arXiv cs.LG · March 13, 2026
📰 News · Models & Research
Key Points
- EnTransformer introduces a deep generative forecasting framework that combines engression with Transformer-based sequence modeling to learn conditional distributions without restrictive parametric assumptions.
- The approach injects stochastic noise into model representations and optimizes an energy-based scoring objective to directly learn the conditional predictive distribution for multivariate time series.
- It enables generating coherent multivariate forecast trajectories while preserving Transformers' ability to model long-range temporal dependencies and cross-series interactions.
- The authors evaluate EnTransformer on standard benchmarks, including Electricity, Traffic, Solar, Taxi, KDD-Cup, and Wikipedia, reporting well-calibrated probabilistic forecasts and consistent gains over strong baseline models.
- This work advances reliable uncertainty quantification in multivariate probabilistic forecasting for domains such as energy systems and transportation networks.
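The energy-based scoring objective referenced above is, in engression-style training, typically the energy score: given forecast draws X, X′ from the model's conditional distribution and an observation y, the model minimizes E‖X − y‖ − ½·E‖X − X′‖, which is a strictly proper scoring rule. A minimal NumPy sketch of the sample-based estimator (the function name and implementation are ours for illustration, not the paper's code):

```python
import numpy as np

def energy_score(samples: np.ndarray, y: np.ndarray) -> float:
    """Sample-based estimate of the energy score (lower is better).

    samples: (m, d) array of m multivariate forecast draws from the model
    y:       (d,)   observed target vector
    Estimates E||X - y|| - 0.5 * E||X - X'|| using the empirical draws.
    """
    m = samples.shape[0]
    # First term: mean distance from each draw to the observation.
    term1 = np.mean(np.linalg.norm(samples - y, axis=1))
    # Second term: mean pairwise distance between distinct draws
    # (the diagonal contributes zero, so divide by m*(m-1)).
    diffs = samples[:, None, :] - samples[None, :, :]
    term2 = np.linalg.norm(diffs, axis=-1).sum() / (m * (m - 1))
    return term1 - 0.5 * term2

rng = np.random.default_rng(0)
y = np.zeros(3)
sharp = rng.normal(0.0, 0.1, size=(64, 3))  # concentrated near the truth
wide = rng.normal(0.0, 2.0, size=(64, 3))   # overdispersed ensemble
score_sharp = energy_score(sharp, y)
score_wide = energy_score(wide, y)
```

Because the score is strictly proper, a forecast ensemble concentrated near the realized outcome scores better than an overdispersed one, which is what allows a generative model trained this way to learn the full conditional distribution rather than just a point forecast.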
Related Articles
Data Augmentation Using GANs
Dev.to
Speculative Policy Orchestration: A Latency-Resilient Framework for Cloud-Robotic Manipulation
arXiv cs.RO
Automatic Debiased Machine Learning for Smooth Functionals of Nonparametric M-Estimands
arXiv stat.ML
Preference-Guided Debiasing for No-Reference Enhancement Image Quality Assessment
arXiv cs.CV
Model Selection and Parameter Estimation of Multi-dimensional Gaussian Mixture Model
arXiv stat.ML