Robustness Verification of Polynomial Neural Networks

arXiv stat.ML / 4/20/2026


Key Points

  • The paper studies robustness verification for neural networks using metric algebraic geometry, showing that for polynomial neural networks, robustness certification reduces to computing the distance from an input point to the algebraic decision boundary.
  • It introduces the Euclidean distance (ED) degree as an intrinsic complexity measure, analyzes the ED discriminant, and further proposes a parameter discriminant that flags parameter values where the ED degree decreases.
  • The authors derive ED-degree formulas for multiple polynomial neural network architectures and characterize the expected number of real critical points in the infinite-width limit.
  • They provide both symbolic elimination methods to compute these geometric quantities and homotopy-continuation approaches aimed at exact robustness certification.
  • Experiments on lightning self-attention modules indicate that their decision boundaries have a strictly smaller ED degree than generic cubic hypersurfaces with the same ambient dimension, suggesting architectural structure can reduce verification complexity.
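The ED degree mentioned above counts the complex critical points of the (squared) Euclidean distance from a generic point to a variety. The paper works with network decision boundaries; as a toy stand-in, here is a hedged sketch with SymPy on a plain ellipse, where the ED degree of a smooth conic is known to be 4. The curve, the query point, and all variable names are illustrative choices, not taken from the paper.

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x**2 / 4 + y**2 - 1                 # toy "decision boundary": an ellipse
u = (sp.Integer(1), sp.Rational(1, 2))  # a generic query point

# Critical-point conditions for the distance from u to {f = 0}:
# the point lies on the curve and (x - u) is parallel to grad f.
fx, fy = sp.diff(f, x), sp.diff(f, y)
crit_eqs = [f, (x - u[0]) * fy - (y - u[1]) * fx]

# Solve over the complex numbers; the solution count is the ED degree.
crit = sp.solve(crit_eqs, [x, y], dict=True)
print(len(crit))  # 4 for a smooth conic
```

For a generic query point only some of these critical points are real; the ED discriminant studied in the paper is exactly the locus of query points where that real count changes.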

Abstract

We study robustness verification of neural networks via metric algebraic geometry. For polynomial neural networks, certifying a robustness radius amounts to computing the distance to the algebraic decision boundary. We use the Euclidean distance (ED) degree as an intrinsic measure of the complexity of this problem, analyze the associated ED discriminant, and introduce a parameter discriminant that detects parameter values at which the ED degree drops. We derive formulas for the ED degree for several network architectures and characterize the expected number of real critical points in the infinite-width limit. We develop symbolic elimination methods to compute these quantities and homotopy-continuation methods for exact robustness certification. Finally, experiments on lightning self-attention modules reveal decision boundaries with strictly smaller ED degree than generic cubic hypersurfaces of the same ambient dimension.
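The certification step described in the abstract, computing the distance from an input to the algebraic decision boundary, can be sketched on the same kind of toy curve: find all critical points of the distance, keep the real ones, and take the smallest distance as the certified robustness radius. This is a hypothetical SymPy sketch on an ellipse, not the authors' homotopy-continuation implementation; the input point and tolerance are arbitrary choices.

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x**2 / 4 + y**2 - 1                        # toy decision boundary (ellipse)
ux, uy = sp.Rational(1, 2), sp.Rational(1, 4)  # input point to certify

# Critical points of the distance: on the boundary, with the offset
# vector (x - u) parallel to grad f.
fx, fy = sp.diff(f, x), sp.diff(f, y)
crit = sp.solve([f, (x - ux) * fy - (y - uy) * fx], [x, y], dict=True)

# Keep the real critical points (filtering numerically).
real_pts = []
for s in crit:
    xv, yv = complex(s[x].evalf()), complex(s[y].evalf())
    if abs(xv.imag) < 1e-9 and abs(yv.imag) < 1e-9:
        real_pts.append((xv.real, yv.real))

# The minimum distance over real critical points is a certified radius:
# no point of the decision boundary lies closer to the input.
radius = min(((px - 0.5) ** 2 + (py - 0.25) ** 2) ** 0.5
             for px, py in real_pts)
print(radius)
```

The paper's homotopy-continuation methods play the role of `sp.solve` here: they track all complex critical points, whose count is bounded by the ED degree, and the certified radius is then read off from the real ones.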