Transferable Physics-Informed Representations via Closed-Form Head Adaptation

arXiv cs.LG / 4/24/2026


Key Points

  • The paper addresses a key limitation of physics-informed neural networks (PINNs): poor generalization to new PDEs when few or no training examples are available.
  • It introduces Pi-PINN, a transferable learning framework that learns a shared physics-informed embedding and then solves both known and unseen PDE instances using closed-form head adaptation with a least-squares pseudoinverse under PDE constraints.
  • The authors study how data-driven multi-task losses and physics-informed losses interact, offering guidance for designing stronger PINN training objectives.
  • Experiments on Poisson’s, Helmholtz’s, and Burgers’ equations show Pi-PINN produces predictions 100–1000× faster than a typical PINN, with 10–100× lower relative error than a typical data-driven model, even with as few as two training samples for unseen cases.
  • Overall, the work suggests that transferable representations plus closed-form head adaptation can substantially improve PINN efficiency and cross-PDE generalization for scientific and engineering applications.

Abstract

Physics-informed neural networks (PINNs) have garnered significant interest for their potential in solving partial differential equations (PDEs) that govern a wide range of physical phenomena. By incorporating physical laws into the learning process, PINN models have demonstrated the ability to learn physical outcomes reasonably well. However, current PINN approaches struggle to predict or solve new PDEs effectively when there is a lack of training examples, indicating they do not generalize well to unseen problem instances. In this paper, we present a transferable learning approach for PINNs premised on a fast Pseudoinverse PINN framework (Pi-PINN). Pi-PINN learns a transferable physics-informed representation in a shared embedding space and enables rapid solving of both known and unknown PDE instances via closed-form head adaptation using a least-squares-optimal pseudoinverse under PDE constraints. We further investigate the synergies between data-driven multi-task learning loss and physics-informed loss, providing insights into the design of more performant PINNs. We demonstrate the effectiveness of Pi-PINN on various PDE problems, including Poisson's equation, Helmholtz equation, and Burgers' equation, achieving fast and accurate physics-informed solutions without requiring any data for unseen instances. Pi-PINN produces predictions 100-1000 times faster than a typical PINN, and with 10-100 times lower relative error than a typical data-driven model, even with only two training samples. Overall, our findings highlight the potential of transferable representations with closed-form head adaptation to enhance the efficiency and generalization of PINNs across PDE families and scientific and engineering applications.
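The core mechanism, solving for a new linear output head in closed form under a PDE constraint, can be illustrated with a minimal sketch. The paper's learned shared embedding is not public, so this toy stands in fixed sine features for the embedding and uses a 1D Poisson problem (-u'' = f on [0, 1] with zero boundary values); since the head is linear in the features, the PDE residual is linear in the head weights, and a least-squares solve (the pseudoinverse step) recovers them without gradient descent. All names and the feature choice here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Stand-in for a learned shared embedding: phi_k(x) = sin(k*pi*x).
# These features already satisfy the zero Dirichlet boundary conditions,
# and their Laplacian is known analytically: -phi_k'' = (k*pi)^2 * sin(k*pi*x).
K = 8                                   # embedding dimension (assumed)
ks = np.arange(1, K + 1)
xs = np.linspace(0.01, 0.99, 50)        # interior collocation points

# PDE constraint as a linear system in the head weights w:
# row i, column k holds -phi_k''(x_i), so A @ w approximates -u''(x_i).
A = (ks * np.pi) ** 2 * np.sin(np.pi * np.outer(xs, ks))

# Target instance: f(x) = (3*pi)^2 * sin(3*pi*x), whose exact solution
# is u(x) = sin(3*pi*x), i.e. the third feature with weight 1.
b = (3 * np.pi) ** 2 * np.sin(3 * np.pi * xs)

# Closed-form head adaptation: least-squares (pseudoinverse) solve,
# replacing iterative PINN training for this instance.
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Evaluate the adapted head on fresh points and compare to the exact solution.
x_test = np.linspace(0, 1, 201)
u_pred = np.sin(np.pi * np.outer(x_test, ks)) @ w
u_true = np.sin(3 * np.pi * x_test)
print(float(np.abs(u_pred - u_true).max()))
```

Because the head is the only instance-specific part, adapting to a new right-hand side f amounts to rebuilding `b` and re-running the single `lstsq` call, which is where the reported speedup over per-instance PINN training would come from.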