PowerModelsGAT-AI: Physics-Informed Graph Attention for Multi-System Power Flow with Continual Learning
arXiv cs.LG / March 19, 2026
Key Points
- PowerModelsGAT-AI presents a physics-informed graph attention network for real-time AC power flow that predicts bus voltages and generator injections, offering a fast alternative to Newton-Raphson solvers, which slow down under stressed operating conditions.
- It employs bus-type-aware masking and learned-weight balancing across multiple loss terms, including a power-mismatch penalty, to handle heterogeneous bus types and objectives.
- The model is evaluated on 14 benchmark systems (4–6,470 buses), with a single unified model trained jointly on 13 of them under N-2 outage scenarios, achieving an average NMSE of 0.89% for voltage magnitudes and R^2 > 0.99 for voltage angles.
- In continual learning experiments, experience replay and elastic weight consolidation nearly eliminate forgetting when adapting to a new 1,354-bus system, keeping base-system error increases below 2%.
- Interpretability analyses show attention weights correlate with physical parameters (susceptance r = 0.38; thermal limits r = 0.22), indicating the model captures established power-flow relationships.
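The bus-type-aware masking can be illustrated with standard AC power-flow conventions. This is a minimal sketch of one plausible mechanism, not the paper's exact implementation: each bus type fixes a different subset of quantities, so only the unknowns at each bus should contribute to the training loss.

```python
import numpy as np

# Standard AC power-flow bus types: 0 = PQ (load), 1 = PV (generator),
# 2 = slack. PQ buses have unknown |V| and angle; PV buses have |V|
# specified, leaving only the angle unknown; the slack bus fixes both.
def output_mask(bus_types):
    """Return an (n_bus, 2) boolean mask selecting which predicted
    quantities (|V|, angle) count toward the training loss."""
    t = np.asarray(bus_types)
    vm_unknown = t == 0   # |V| is a target only at PQ buses
    va_unknown = t != 2   # angle is a target everywhere but the slack
    return np.stack([vm_unknown, va_unknown], axis=1)

# Example: a 4-bus toy system with one slack, one PV, and two PQ buses.
mask = output_mask([2, 1, 0, 0])
```

Applying such a mask before the loss means the network is never penalized for quantities that are boundary conditions rather than unknowns.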
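The summary does not specify how the learned-weight balancing across loss terms works; a common scheme it may resemble is homoscedastic-uncertainty weighting, where each objective (voltage magnitude, angle, injections, power mismatch) gets a learnable log-variance. A hedged sketch under that assumption:

```python
import numpy as np

def balanced_loss(losses, log_vars):
    """Combine per-objective losses with learned log-variance weights.

    Each term is scaled by exp(-s_i) and regularized by +s_i, so the
    optimizer can down-weight noisy objectives but cannot drive any
    weight to zero for free. Both arguments are 1-D arrays of equal
    length; in training, `log_vars` would be trainable parameters.
    """
    losses = np.asarray(losses, dtype=float)
    log_vars = np.asarray(log_vars, dtype=float)
    return float(np.sum(np.exp(-log_vars) * losses + log_vars))

# With all s_i = 0 the combined loss reduces to the plain sum of terms,
# e.g. |V| error, angle error, injection error, and power mismatch.
total = balanced_loss([0.10, 0.05, 0.20, 0.02], np.zeros(4))
```

The power-mismatch term is what makes the network "physics-informed": it penalizes violations of the AC power-balance equations rather than only the supervised regression error.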
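The reported accuracy metrics are easy to reproduce. Note that NMSE has several conventions; the sketch below normalizes the MSE by the target variance (the paper may use a different normalizer), under which NMSE and R^2 are complements.

```python
import numpy as np

def nmse(y_true, y_pred):
    # Normalized MSE: mean squared error divided by the variance of
    # the targets (one common convention).
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean((y_pred - y_true) ** 2) / np.var(y_true))

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - residual SS / total SS.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

Under this convention an NMSE of 0.89% for voltage magnitudes corresponds to R^2 of about 0.991 on that output.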
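Elastic weight consolidation, one of the two continual-learning mechanisms, adds a quadratic penalty that anchors important parameters near their values from the base-system training. A minimal sketch assuming the usual diagonal-Fisher approximation (function and argument names are illustrative, not from the paper):

```python
import numpy as np

def ewc_penalty(params, anchor_params, fisher_diag, lam=1.0):
    """Elastic weight consolidation penalty.

    Penalizes movement of each parameter away from its value after
    training on the base systems (`anchor_params`), weighted by the
    diagonal Fisher information, which estimates how important each
    parameter was for the earlier task.
    """
    params = np.asarray(params, dtype=float)
    anchor = np.asarray(anchor_params, dtype=float)
    fisher = np.asarray(fisher_diag, dtype=float)
    return 0.5 * lam * float(np.sum(fisher * (params - anchor) ** 2))

# During fine-tuning on the new 1,354-bus system, the objective would be
# new_task_loss + ewc_penalty(...), optionally mixed with minibatches
# replayed from a buffer of stored base-system samples.
```

Experience replay and EWC are complementary: replay rehearses old data directly, while EWC constrains the weights when old data is unavailable or limited.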
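The interpretability numbers (r = 0.38 for susceptance, r = 0.22 for thermal limits) are Pearson correlations between per-edge attention weights and physical line parameters. A sketch of how such a correlation would be computed (the function name is illustrative):

```python
import numpy as np

def attention_parameter_correlation(attn_weights, line_params):
    # Pearson correlation between per-edge attention weights and a
    # physical line parameter (e.g. susceptance or thermal limit),
    # both given as 1-D arrays aligned on the same edge ordering.
    a = np.asarray(attn_weights, dtype=float)
    p = np.asarray(line_params, dtype=float)
    return float(np.corrcoef(a, p)[0, 1])
```

A moderate positive r suggests the attention mechanism allocates more weight to strongly coupled (high-susceptance) lines, consistent with standard power-flow intuition.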