A Physics-Aware Framework for Short-Term GPU Power Forecasting of AI Data Centers
arXiv cs.AI / 5/7/2026
💬 Opinion · Developer Stack & Infrastructure · Models & Research
Key Points
- The paper presents PI-DLinear, a physics-informed time-series model for short-term (5–80 minute) GPU power forecasting in AI data centers with highly variable workloads.
- It models power demand using a multi-node lumped thermal RC network consistent with Newton’s law of cooling, linking power consumption to GPU compute, memory utilization, and temperature via newly derived time-dependent ODEs.
- Trained and evaluated on real AI data center data, PI-DLinear delivers more accurate forecasts than existing state-of-the-art models, including transformer-based and non-transformer approaches.
- The forecasts are also designed to remain physically consistent during events like power throttling and load transients, not just statistically accurate.
- Reported gains over SOTA range from 0.782% to 39.08% in MSE, 0.993% to 51.82% in MAE, and 0.370% to 22.28% in RMSE across different look-back and prediction windows.
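The paper's multi-node thermal RC network is built on Newton's law of cooling; its exact ODEs are not reproduced here, but a minimal single-node sketch illustrates the idea. Assuming a thermal capacitance `C` (J/K), a thermal resistance `R` (K/W) to ambient, and a GPU power input `P(t)`, the node temperature obeys `C * dT/dt = P(t) - (T - T_amb)/R`. All parameter values below are illustrative assumptions, not figures from the paper.

```python
def simulate_rc_node(power, t_amb=30.0, C=50.0, R=0.1, dt=1.0, t0=30.0):
    """Forward-Euler integration of the lumped RC thermal ODE
    C * dT/dt = P(t) - (T - t_amb) / R   (Newton's law of cooling).

    power : sequence of power draws (W), one sample per dt seconds
    C, R  : illustrative thermal capacitance (J/K) and resistance (K/W)
    Returns the temperature trajectory, starting from t0 (deg C).
    """
    temps = [t0]
    T = t0
    for p in power:
        # Net heat flow: electrical power in, convective loss to ambient out
        dT = (p - (T - t_amb) / R) / C
        T += dT * dt
        temps.append(T)
    return temps

# A constant 300 W load settles at the steady state T = t_amb + P*R = 60 C,
# with time constant R*C = 5 s under these assumed parameters.
trajectory = simulate_rc_node([300.0] * 100)
```

This is the forward direction (power to temperature); the paper's physics-informed loss would instead constrain the forecaster so that its predicted power, compute, memory-utilization, and temperature trajectories stay consistent with such ODEs.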