Python library supporting Discrete Variational Formulations and training solutions with Collocation-based Robust Variational Physics Informed Neural Networks (DVF-CRVPINN)

arXiv cs.LG / 4/20/2026


Key Points

  • The paper presents a Python programming environment for solving PDEs using discrete weak formulations, including discrete domains, discrete functions on point sets, and Kronecker-delta-based test functions.
  • It introduces a discrete neural-network representation that predicts solution values on discrete points and uses discrete finite-difference derivatives integrated into automatic differentiation.
  • As a proof of concept, the authors train on the 2D Stokes equations by minimizing a discrete weak residual with the Adamax optimizer, computing discrete gradients through automatic differentiation.
  • The work includes a rigorous mathematical treatment that establishes well-posedness and robustness of the loss function, aiming for robust, numerically controlled training by tying the loss to the true error.
  • The library functionality is also demonstrated on the Laplace equation formulation, showing the approach generalizes beyond the Stokes case.
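
To make the key points concrete, here is a minimal NumPy sketch (hypothetical code, not the library's actual API) of the central idea: with Kronecker-delta test functions, phi_j(x_i) = delta_ij, the discrete weak form collapses to a pointwise finite-difference residual, illustrated on the 1D Laplace problem -u'' = f.

```python
import numpy as np

def discrete_weak_residual(u, f, h):
    """Residual <-Laplace_h u - f, delta_j> at each interior grid point.

    Testing against the Kronecker delta delta_j simply picks out the
    pointwise finite-difference residual at point j.
    """
    lap = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2  # central 2nd difference
    return -lap - f[1:-1]

def loss(u, f, h):
    """Sum-of-squares loss over the discrete weak residuals."""
    r = discrete_weak_residual(u, f, h)
    return float(np.sum(r**2))

# Manufactured solution: u = sin(pi x) solves -u'' = pi^2 sin(pi x).
x = np.linspace(0.0, 1.0, 101)
h = x[1] - x[0]
u_exact = np.sin(np.pi * x)
f = np.pi**2 * np.sin(np.pi * x)
print(loss(u_exact, f, h))  # small; only O(h^2) truncation error remains
```

For the exact solution the loss is not identically zero but shrinks at the finite-difference truncation rate, which is the sense in which the loss tracks the numerical error.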

Abstract

We explore the possibility of solving Partial Differential Equations (PDEs) using discrete weak formulations. We propose a programming environment for defining a discrete computational domain, introducing discrete functions defined over a set of points, constructing discrete inner products, and introducing discrete weak formulations employing Kronecker delta test functions. Building on this setup, we propose a discrete neural network representation, training the solution function defined over a discrete set of points and employing discrete finite difference derivatives in the automatic differentiation procedures. As a challenging computational model example, we focus on the Stokes equations in two dimensions, defined over a discrete set of points. We train the solution using the discrete weak residual and the Adamax algorithm with discrete automatic differentiation of the discrete gradients. In addition to introducing the Python environment, we provide a rigorous mathematical formulation based on discrete weak formulations, proving the well-posedness and robustness of the loss function. The solution of the discrete weak formulations is based on neural network training employing a robust loss function that is related to the true error. In this way, we have robust control of the numerical error during the training of the neural networks. Besides the Stokes formulation, we also explain the functionality of the proposed library using the Laplace problem formulation.
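
The abstract names Adamax as the training algorithm. As a hedged illustration (this is the standard Adamax update rule of Kingma and Ba, not the paper's training code), here is a self-contained NumPy sketch minimizing a toy sum-of-squares loss:

```python
import numpy as np

def adamax_minimize(grad, theta0, lr=0.02, beta1=0.9, beta2=0.999,
                    eps=1e-8, steps=2000):
    """Minimize a function via Adamax, given its gradient function."""
    theta = theta0.astype(float).copy()
    m = np.zeros_like(theta)   # first-moment (momentum) estimate
    u = np.zeros_like(theta)   # exponentially weighted infinity norm
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1.0 - beta1) * g
        u = np.maximum(beta2 * u, np.abs(g))
        # Bias-corrected step; division by u gives per-parameter scaling.
        theta -= (lr / (1.0 - beta1**t)) * m / (u + eps)
    return theta

# Toy residual-style loss L(theta) = sum(theta**2), with gradient 2*theta.
theta = adamax_minimize(lambda th: 2.0 * th, np.array([1.0, -2.0, 3.0]))
print(float(np.sum(theta**2)))  # should end far below the initial loss of 14.0
```

In the paper's setting the loss would instead be the discrete weak residual of the Stokes (or Laplace) problem, with gradients obtained by automatic differentiation of the discrete finite-difference operators.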