AI Navigate

A Family of Adaptive Activation Functions for Mitigating Failure Modes in Physics-Informed Neural Networks

arXiv cs.LG / 3/20/2026


Key Points

  • The paper introduces a family of adaptive wavelet-based activation functions to mitigate failure modes in Physics-Informed Neural Networks (PINNs).
  • These activations combine trainable wavelets with either trainable or fixed hyperbolic tangent and softplus components to boost training stability and expressive power.
  • Five distinct activation functions are developed and evaluated across four PDE classes, showing improved robustness and accuracy compared with traditional activations.
  • Direct comparisons with baseline PINNs, PINNsFormer, and other deep learning models demonstrate the proposed approach's generality and effectiveness.
  • The work highlights potential broader impact for PINN-based scientific computing by integrating wavelet theory into activation design.
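The points above describe activations that blend a trainable wavelet with hyperbolic tangent and softplus components. The paper's exact formulas are not reproduced here, but a minimal sketch of one plausible member of such a family — a Morlet-style wavelet mixed with tanh and softplus, with trainable frequency, scale, and mixing weights — might look like the following. All parameter names and the specific combination are illustrative assumptions, not the paper's definition:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.logaddexp(0.0, x)

def adaptive_wavelet_activation(x, omega=1.0, scale=1.0, a=0.5, b=0.3, c=0.2):
    """Illustrative adaptive activation (assumed form, not the paper's).

    Blends a Morlet-style wavelet with tanh and softplus terms. In a PINN,
    omega (frequency), scale, and the mixing weights a, b, c would be
    trainable parameters optimized jointly with the network weights.
    """
    wavelet = np.cos(omega * x) * np.exp(-((scale * x) ** 2) / 2.0)
    return a * wavelet + b * np.tanh(x) + c * softplus(x)
```

In a framework such as PyTorch, the scalars above would typically be registered as `nn.Parameter` objects inside each layer so that gradient descent adapts the activation shape during training, which is the general mechanism the paper's "adaptive" activations rely on.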

Abstract

Physics-Informed Neural Networks (PINNs) are a powerful and flexible learning framework that has gained significant attention in recent years and has demonstrated strong performance across a wide range of scientific and engineering problems. In parallel, wavelets have been extensively used as efficient computational tools due to their strong approximation capabilities. Motivated by the common failure modes observed in standard PINNs, this work introduces a novel family of adaptive wavelet-based activation functions. The proposed activation functions significantly improve training stability and expressive power by combining trainable wavelet functions with either trainable or fixed hyperbolic tangent and softplus functions. Five distinct activation functions are developed within the PINN framework and systematically evaluated across four representative classes of partial differential equations (PDEs). Comprehensive comparisons demonstrate improved robustness and accuracy relative to traditional activation functions. Furthermore, the proposed approach is validated through direct comparisons with baseline PINNs, transformer-based architectures such as PINNsFormer, and other deep learning models, highlighting its effectiveness and generality.