WGFINNs: Weak formulation-based GENERIC formalism informed neural networks

arXiv cs.LG / 4/6/2026


Key Points

  • The paper introduces WGFINNs, a weak-formulation extension of GENERIC-formalism informed neural networks designed to improve scientific equation discovery from noisy observations.
  • Unlike prior strong-form approaches that are highly sensitive to measurement noise, WGFINNs retain exact satisfaction of GENERIC degeneracy and symmetry conditions while boosting robustness.
  • The method adds a state-wise weighted loss and a residual-based attention mechanism to address scale imbalance across state variables.
  • Theoretical results show the strong-form estimator can diverge as the time step decreases under noise, while the weak-form estimator remains accurate under specific test-function conditions.
  • Numerical experiments indicate WGFINNs outperform baseline GFINNs across a range of noise levels, yielding more accurate predictions and more reliable recovery of physical quantities.
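The noise-sensitivity contrast in the points above can be sketched numerically. The snippet below is illustrative, not the paper's implementation: a strong-form residual differentiates the noisy data directly, amplifying noise like O(σ/Δt), while a weak-form residual integrates the data against a smooth test function that vanishes at the endpoints, so no derivative of the data is ever taken. The toy ODE dx/dt = -x and the test function φ are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
x_clean = np.exp(-t)                          # trajectory of dx/dt = -x
x_noisy = x_clean + 1e-2 * rng.standard_normal(t.size)

def f(x):
    """Candidate right-hand side of the governing equation."""
    return -x

# Strong form: a finite-difference quotient of the noisy data.
# The noise in the quotient scales like sigma / dt.
strong_residual = (x_noisy[1:] - x_noisy[:-1]) / dt - f(x_noisy[:-1])

# Weak form: multiply dx/dt = f(x) by a test function phi with
# phi(0) = phi(1) = 0 and integrate by parts:
#   -∫ phi' x dt - ∫ phi f(x) dt = 0 for the true dynamics.
phi = np.sin(np.pi * t) ** 2
dphi = 2 * np.pi * np.sin(np.pi * t) * np.cos(np.pi * t)
weak_residual = np.sum(-dphi * x_noisy - phi * f(x_noisy)) * dt

# Noise blows up the strong-form residual, while the weak-form
# integral averages it out.
print(np.mean(strong_residual ** 2))   # large
print(weak_residual ** 2)              # small
```

Shrinking dt makes the strong-form residual worse (the difference quotient divides the same noise by a smaller step), which mirrors the paper's divergence result; the weak-form integral is unaffected.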

Abstract

Data-driven discovery of governing equations from noisy observations remains a fundamental challenge in scientific machine learning. While GENERIC formalism informed neural networks (GFINNs) provide a principled framework that enforces the laws of thermodynamics by construction, their reliance on strong-form loss formulations makes them highly sensitive to measurement noise. To address this limitation, we propose weak formulation-based GENERIC formalism informed neural networks (WGFINNs), which integrate the weak formulation of dynamical systems with the structure-preserving architecture of GFINNs. WGFINNs significantly enhance robustness to noisy data while retaining exact satisfaction of GENERIC degeneracy and symmetry conditions. We further incorporate a state-wise weighted loss and a residual-based attention mechanism to mitigate scale imbalance across state variables. Theoretical analysis quantifies the differences between the strong-form and weak-form estimators: the strong-form estimator diverges as the time step decreases in the presence of noise, whereas the weak-form estimator can remain accurate on noisy data provided the test functions satisfy certain conditions. Numerical experiments demonstrate that WGFINNs consistently outperform GFINNs across varying noise levels, achieving more accurate predictions and reliable recovery of physical quantities.
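For context on the "degeneracy and symmetry conditions" the abstract refers to: the GENERIC formalism writes the dynamics as dz/dt = L∇E + M∇S with L skew-symmetric, M symmetric positive semi-definite, and the degeneracies L∇S = 0 and M∇E = 0, which together guarantee energy conservation and entropy production. The sketch below demonstrates these conditions with a simple projection construction; it is an assumption for illustration (GFINNs use a learned parameterization, not this projector), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
grad_E = rng.standard_normal(n)   # stand-in for the energy gradient ∇E(z)
grad_S = rng.standard_normal(n)   # stand-in for the entropy gradient ∇S(z)

def projector(v):
    """Orthogonal projector onto the complement of span{v}."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - np.outer(v, v)

# Skew-symmetric L with ∇S in its kernel: L∇S = 0 by construction.
A = rng.standard_normal((n, n))
L = projector(grad_S) @ (A - A.T) @ projector(grad_S)

# Symmetric PSD M with ∇E in its kernel: M∇E = 0 by construction.
PB = projector(grad_E) @ rng.standard_normal((n, n))
M = PB @ PB.T

dzdt = L @ grad_E + M @ grad_S    # GENERIC dynamics dz/dt = L∇E + M∇S

# First law: dE/dt = ∇E·(L∇E) + ∇E·(M∇S) = 0 + 0.
dE = grad_E @ dzdt
# Second law: dS/dt = ∇S·(L∇E) + ∇S·(M∇S) = 0 + (PSD form) ≥ 0.
dS = grad_S @ dzdt
```

Because the conditions hold exactly for any parameter values, noise in the data can degrade accuracy but never break thermodynamic consistency, which is the structural property WGFINNs inherit from GFINNs.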
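The state-wise weighted loss and residual-based attention mentioned above can be sketched as follows. This is a plausible reading, not the paper's exact scheme: each state variable's residuals are rescaled by the inverse of their typical magnitude (so a small-scale state is not drowned out by a large-scale one), and a softmax over samples up-weights the hardest residuals. All arrays and weight formulas here are hypothetical.

```python
import numpy as np

# Toy residuals, shape (samples, states): state 0 lives on a much
# smaller scale than state 1, so an unweighted loss would ignore it.
residuals = np.abs(np.array([
    [1e-4, 2.0],
    [3e-4, 1.0],
    [2e-4, 4.0],
]))

# State-wise weighting: normalize each state by its mean residual scale.
state_w = 1.0 / residuals.mean(axis=0)
scaled = residuals * state_w

# Residual-based attention: softmax over samples, per state, so
# samples with large (rescaled) residuals dominate the loss.
attn = np.exp(scaled) / np.exp(scaled).sum(axis=0, keepdims=True)

loss = np.sum(attn * scaled ** 2)
```

After rescaling, both states contribute at comparable magnitudes, and within each state the attention weights concentrate on the worst-fit samples.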