SOC-ICNN: From Polyhedral to Conic Geometry for Learning Convex Surrogate Functions
arXiv cs.LG / 4/27/2026
Key Points
- The paper argues that classical ReLU-based input convex neural networks (ICNNs) correspond to optimal value functions of linear programming, limiting them to piecewise-linear polyhedral representations.
- It introduces SOC-ICNN, which generalizes the optimization interpretation from LP to second-order cone programming (SOCP) by incorporating positive semidefinite curvature and Euclidean-norm conic primitives.
- The authors prove that SOC-ICNNs strictly expand the representational capacity of ReLU-ICNNs while keeping the forward-pass computational complexity asymptotically unchanged.
- Experiments show SOC-ICNN improves function approximation accuracy and maintains competitive performance on downstream decision-making tasks.
- The authors publicly release their implementation on GitHub.
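The convexity argument behind the bullets above can be illustrated with a minimal sketch. This is not the paper's actual architecture: the function name, parameter layout, and the specific way the norm terms enter are assumptions. It shows a standard ICNN forward pass (nonnegative recurrent weights, convex nondecreasing ReLU) augmented with Euclidean-norm terms applied to affine maps of the input, the kind of SOCP-representable primitive the summary describes. Each ingredient preserves convexity in the input: a nonnegative combination of convex functions is convex, a norm of an affine map is convex, and ReLU applied to a convex function stays convex.

```python
import numpy as np

def relu(u):
    return np.maximum(u, 0.0)

def soc_icnn_forward(x, params):
    """Forward pass of a hypothetical SOC-flavored ICNN (illustrative sketch).

    Convexity in x is preserved because:
      - recurrent weights Wz are clipped to be elementwise nonnegative, so
        pre-activations are nonnegative combinations of convex functions
        plus an affine input passthrough;
      - each later layer adds Euclidean-norm terms ||A x + c||_2 applied
        directly to affine maps of the input (norm of affine is convex);
      - ReLU is convex and nondecreasing, so relu(convex) is convex.
    """
    # First layer: convex (affine map of x followed by ReLU).
    z = relu(params[0]["Wx"] @ x + params[0]["b"])
    for p in params[1:]:
        Wz = np.maximum(p["Wz"], 0.0)  # enforce nonnegativity
        pre = Wz @ z + p["Wx"] @ x + p["b"]
        # Conic primitive: one Euclidean norm of an affine map of x per unit.
        cone = np.array([np.linalg.norm(A @ x + c)
                         for A, c in zip(p["A"], p["c"])])
        z = relu(pre + cone)
    return z.sum()  # scalar convex surrogate value
```

A quick sanity check of convexity is the midpoint inequality f((x1+x2)/2) ≤ (f(x1)+f(x2))/2, which this construction satisfies for arbitrary weights by the composition rules noted in the docstring.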