Chebyshev-Augmented One-Shot Transfer Learning for PINNs on Nonlinear Differential Equations
arXiv cs.LG / 5/5/2026
Key Points
- Physics-Informed Neural Networks (PINNs) often require retraining for each new set of forcing terms, boundary/initial conditions, or parameters, limiting reuse.
- The paper extends one-shot transfer learning (OTL) to nonlinear differential equations by using Chebyshev polynomial surrogates to approximate smooth weakly nonlinear terms within a chosen solution range.
- It turns nonlinearity into a polynomial form that can be handled via a perturbative decomposition into linear subproblems, enabling closed-form output-layer adaptation.
- A multi-head PINN is trained to learn a reusable latent representation tied to the dominant linear operator, while new problem instances are solved through a sequence of closed-form linear solves without retraining.
- Experiments across ODE and PDE benchmarks—including non-polynomial and singular nonlinearities and a reaction-diffusion PDE—show accurate and fast online adaptation in many-query settings.
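The two core steps above can be sketched in a few lines, independently of any specific PINN architecture. The snippet below is an illustrative sketch, not the paper's code: it (1) fits a low-degree Chebyshev polynomial surrogate to a smooth nonlinearity over a chosen solution range, and (2) mimics the closed-form output-layer adaptation, where freezing the hidden layers makes the network output linear in the output weights, so a new linear subproblem reduces to a single least-squares solve. The feature matrix `H` and forcing term `g` are random stand-ins for the frozen hidden features and a new problem instance.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Step 1: Chebyshev surrogate for a smooth nonlinearity f(u) = sin(u),
# valid on a chosen solution range [a, b] (here [-1, 1] for simplicity).
a, b = -1.0, 1.0
deg = 7  # a low degree suffices for smooth, weakly nonlinear terms
u_fit = np.linspace(a, b, 200)
coeffs = C.chebfit(u_fit, np.sin(u_fit), deg)  # coefficients of T_0..T_deg

u_test = np.linspace(a, b, 50)
surrogate_err = np.max(np.abs(C.chebval(u_test, coeffs) - np.sin(u_test)))
print(f"max surrogate error on range: {surrogate_err:.2e}")

# Step 2 (illustrative): one-shot output-layer adaptation. With hidden
# layers frozen, the PINN output is u(x) = H(x) @ w, linear in the output
# weights w. A linear subproblem L[u] = g at collocation points then
# yields a linear system in w, solved in closed form -- no retraining.
rng = np.random.default_rng(0)
n_pts, width = 100, 32
H = rng.standard_normal((n_pts, width))    # stand-in for frozen features
g = rng.standard_normal(n_pts)             # stand-in for a new forcing term
w, *_ = np.linalg.lstsq(H, g, rcond=None)  # closed-form output-layer solve
residual = np.linalg.norm(H @ w - g)
print(f"collocation residual: {residual:.3f}")
```

In the paper's setting, the perturbative decomposition would repeat step 2 across a short sequence of such linear solves, one per order of the polynomialized nonlinearity, all reusing the same frozen latent representation.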