PowerModelsGAT-AI: Physics-Informed Graph Attention for Multi-System Power Flow with Continual Learning
arXiv cs.LG · March 19, 2026
News · Models & Research
Key Points
- PowerModelsGAT-AI presents a physics-informed graph attention network for real-time AC power flow that predicts bus voltages and generator injections, addressing slow Newton-Raphson solvers under stressed conditions.
- It employs bus-type-aware masking and learned-weight balancing across multiple loss terms, including a power-mismatch penalty, to handle heterogeneous bus types and objectives.
- The model is evaluated on 14 benchmark systems (4–6,470 buses), with a single unified model trained across 13 of them under N-2 contingency outages, achieving an average NMSE of 0.89% for voltage magnitudes and R^2 > 0.99 for voltage angles.
- In continual learning experiments, experience replay and elastic weight consolidation nearly eliminate forgetting when adapting to a new 1,354-bus system, keeping base-system error increases below 2%.
- Interpretability analyses show attention weights correlate with physical parameters (susceptance r = 0.38; thermal limits r = 0.22), indicating the model captures established power-flow relationships.
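The power-mismatch penalty mentioned above is the standard physics-informed term for AC power flow: predicted voltages should reproduce the scheduled complex injections through the bus admittance matrix. The paper's exact formulation isn't shown here; the following is a minimal NumPy sketch of how such a penalty is typically computed (function and variable names are illustrative, not from the paper).

```python
import numpy as np

def power_mismatch_penalty(Y, V, S_sched):
    """Mean squared AC power mismatch (illustrative sketch).

    Y       : (n, n) complex bus admittance matrix
    V       : (n,) complex predicted bus voltages
    S_sched : (n,) complex scheduled injections P + jQ
    """
    # Injections implied by the predicted voltages: S_i = V_i * conj((Y V)_i)
    S_calc = V * np.conj(Y @ V)
    dS = S_sched - S_calc          # complex power mismatch per bus
    return float(np.mean(np.abs(dS) ** 2))
```

For a two-bus line with a flat voltage profile and zero scheduled injections, the mismatch (and hence the penalty) is zero, which is the sanity check one would run before wiring such a term into a training loss.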
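The continual-learning result relies on elastic weight consolidation, whose core is a quadratic penalty that anchors parameters important to the base task (Kirkpatrick et al.). Assuming the standard diagonal-Fisher form (the paper's exact variant is not reproduced here), a minimal NumPy sketch looks like this:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Elastic weight consolidation penalty (illustrative sketch).

    theta      : current parameters, flattened to one array
    theta_star : parameters saved after training on the base systems
    fisher     : diagonal Fisher information estimate (same shape as theta)
    lam        : consolidation strength (hypothetical hyperparameter name)
    """
    # Parameters with large Fisher values are "important" to the old task
    # and are pulled back toward their base-task values.
    return 0.5 * lam * float(np.sum(fisher * (theta - theta_star) ** 2))
```

Combined with experience replay on stored base-system samples, this is the mechanism the paper credits for keeping base-system error increases below 2% after adapting to the new 1,354-bus system.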