PINNs in More General Geometry

arXiv cs.LG / 4/29/2026


Key Points

  • Physics-informed neural networks (PINNs) use loss functions derived from differential equations or conditions, turning geometric and physical constraints into an optimization objective.
  • The article argues that many tasks in differential geometry can be formulated as minimization of a differential functional, enabling direct mapping of geometric problem-solving into AI loss minimization.
  • It presents guiding principles for designing PINN architectures for more general geometry settings and explains why this approach is a good fit.
  • The work includes summaries of three related studies that sit at the intersection of PINNs and computational string geometry.
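The core idea in the bullets above can be made concrete with a minimal sketch (assuming JAX; the network size, ODE, and collocation points are illustrative choices, not taken from the paper): the differential condition u'(x) = u(x) with u(0) = 1 is encoded directly as a loss, so minimizing the loss is the same as solving the differential problem.

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Tiny fully connected network mapping a scalar x to a scalar u(x).
    (w1, b1), (w2, b2) = params
    h = jnp.tanh(w1 * x + b1)       # hidden layer
    return jnp.dot(w2, h) + b2      # scalar output

def pinn_loss(params, xs):
    # Residual of the differential condition u' - u = 0, evaluated at
    # collocation points via automatic differentiation, plus a boundary term.
    u = lambda x: mlp(params, x)
    du = jax.vmap(jax.grad(u))(xs)  # u'(x) by autodiff
    residual = du - jax.vmap(u)(xs)
    bc = u(0.0) - 1.0               # boundary condition u(0) = 1
    return jnp.mean(residual ** 2) + bc ** 2

# Random initialisation of a width-16 network.
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
width = 16
params = [
    (jax.random.normal(k1, (width,)), jnp.zeros(width)),
    (jax.random.normal(k2, (width,)) / width, jnp.zeros(())),
]

xs = jnp.linspace(0.0, 1.0, 32)     # collocation points in [0, 1]
loss_grad = jax.jit(jax.value_and_grad(pinn_loss))

init_loss, _ = loss_grad(params, xs)
lr = 0.05
for _ in range(500):                # plain gradient descent on the PINN loss
    loss, grads = loss_grad(params, xs)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
final_loss, _ = loss_grad(params, xs)
```

A geometric functional (e.g. an energy whose minimisers are the desired geometric objects) slots into the same template: replace `pinn_loss` with the discretised functional and the optimizer drives the network toward its minimiser.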

Abstract

Neural architectures trained with losses inspired by differential conditions are the basis for PINN models. Since many constructions in differential geometry may be framed as minimisation of a differential functional, these functionals can be coded as loss functions, aligning the AI loss-minimisation goal with that of solving the geometric problem. This contribution to the Recent Progress in Computational String Geometry workshop proceedings introduces the defining principles of the PINN architecture, motivates why they are well suited to problems in differential geometry, and demonstrates their use via summaries of three works at this intersection.