SOC-ICNN: From Polyhedral to Conic Geometry for Learning Convex Surrogate Functions

arXiv cs.LG / April 27, 2026

Key Points

  • The paper argues that classical ReLU-based input convex neural networks (ICNNs) correspond to optimal value functions of linear programming, limiting them to piecewise-linear polyhedral representations.
  • It introduces SOC-ICNN, which generalizes the optimization interpretation from LP to second-order cone programming (SOCP) by incorporating positive semidefinite curvature and Euclidean-norm conic primitives (a minimal layer sketch follows this list).
  • The authors prove that SOC-ICNNs strictly expand the representational capacity of ReLU-ICNNs while keeping the forward-pass computational complexity asymptotically unchanged.
  • Experiments show SOC-ICNN improves function approximation accuracy and maintains competitive performance on downstream decision-making tasks.
  • The implementation is publicly released at https://github.com/Kanyooo/SOC-ICNN.
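
To make the construction concrete, here is a minimal PyTorch sketch, written for this summary rather than taken from the authors' repository: a classical ReLU-ICNN layer next to a conic variant that adds a positive-semidefinite quadratic term and smoothed Euclidean-norm primitives. All names (`ICNNLayer`, `ConicLayer`, `n_cones`, etc.) and shapes are illustrative assumptions; the paper's actual architecture may differ.

```python
# Illustrative sketch only -- not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICNNLayer(nn.Module):
    """ReLU-ICNN layer: z' = relu(W_x x + W_z z).

    Convexity in x is preserved because W_z is kept nonnegative and ReLU is
    convex and nondecreasing; stacking such layers yields a piecewise-linear
    convex function of x (the polyhedral class discussed above).
    """

    def __init__(self, x_dim: int, z_dim: int, out_dim: int):
        super().__init__()
        self.Wx = nn.Linear(x_dim, out_dim)              # unconstrained in x
        self.Wz_raw = nn.Parameter(0.1 * torch.randn(out_dim, z_dim))

    def forward(self, x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        Wz = F.softplus(self.Wz_raw)                     # enforce W_z >= 0
        return F.relu(self.Wx(x) + z @ Wz.T)


class ConicLayer(nn.Module):
    """Hypothetical SOC-style layer adding two convex, non-polyhedral terms:

      * quad(x)   = ||L^T x||^2 = x^T (L L^T) x, a PSD quadratic;
      * cone_i(x) = sqrt((u_i^T x)^2 + s_i^2), the Euclidean norm of the
        2-vector (u_i^T x, s_i), i.e. a smoothed second-order cone primitive.

    Mixing them with nonnegative weights keeps the output convex in x.
    """

    def __init__(self, x_dim: int, z_dim: int, out_dim: int, n_cones: int = 4):
        super().__init__()
        self.base = ICNNLayer(x_dim, z_dim, out_dim)
        self.L = nn.Parameter(0.1 * torch.randn(x_dim, x_dim))
        self.U = nn.Parameter(0.1 * torch.randn(n_cones, x_dim))
        self.s = nn.Parameter(torch.ones(n_cones))
        self.mix_raw = nn.Parameter(torch.zeros(out_dim, n_cones))
        self.quad_raw = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        quad = (x @ self.L).pow(2).sum(dim=-1, keepdim=True)  # (B, 1)
        cone = torch.sqrt((x @ self.U.T) ** 2 + self.s ** 2)  # (B, n_cones)
        return (self.base(x, z)
                + quad * F.softplus(self.quad_raw)            # nonneg mixing
                + cone @ F.softplus(self.mix_raw).T)


# Usage: both layers map (x, z) -> z' with the same O(dim^2) matmul cost.
x, z = torch.randn(8, 3), torch.randn(8, 16)
print(ConicLayer(x_dim=3, z_dim=16, out_dim=16)(x, z).shape)  # (8, 16)
```

Nonnegativity of the mixing weights is what preserves convexity in x under composition, for both the polyhedral and the conic terms; it also illustrates why the extra primitives leave the forward-pass cost asymptotically unchanged.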

Abstract

Classical ReLU-based Input Convex Neural Networks (ICNNs) are equivalent to the optimal value functions of Linear Programming (LP). This intrinsic structural equivalence restricts their representational capacity to piecewise-linear polyhedral functions. To overcome this representational bottleneck, we propose the SOC-ICNN, an architecture that generalizes the underlying optimization class from LP to Second-Order Cone Programming (SOCP). By explicitly injecting positive semi-definite curvature and Euclidean norm-based conic primitives, our formulation introduces native smooth curvature into the representation while preserving a rigorous optimization-theoretic interpretation. We formally prove that SOC-ICNNs strictly expand the representational space of ReLU-ICNNs without increasing the asymptotic order of forward-pass complexity. Extensive experiments demonstrate that SOC-ICNN substantially improves function approximation, while delivering competitive downstream decision quality. The code is available at https://github.com/Kanyooo/SOC-ICNN.
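
For context, the LP-to-SOCP step can be spelled out with standard facts (the notation below is chosen for this summary and may not match the paper's). A ReLU-ICNN realizes a convex piecewise-linear function, i.e., a finite maximum of affine pieces, and such a maximum is the optimal value of a linear program over the probability simplex:

$$
f_{\mathrm{ReLU}}(x) \;=\; \max_{1 \le i \le m} \bigl( a_i^{\top} x + b_i \bigr)
\;=\; \max_{\lambda \in \Delta_m} \sum_{i=1}^{m} \lambda_i \bigl( a_i^{\top} x + b_i \bigr),
\qquad
\Delta_m = \Bigl\{ \lambda \ge 0 \;:\; \textstyle\sum_{i} \lambda_i = 1 \Bigr\}.
$$

The maximization over $\lambda$ is an LP, so $f_{\mathrm{ReLU}}$ is polyhedral. By contrast, a second-order cone primitive such as $g(x) = \lVert A x + b \rVert_2 + c^{\top} x + d$ is convex yet smoothly curved, so the optimal value functions of SOCPs strictly contain the polyhedral class, which is the expansion the paper formalizes.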