A Complete Symmetry Classification of Shallow ReLU Networks
arXiv cs.LG / 4/16/2026
Key Points
- The paper argues that a neural network's parameter space cannot in general be identified with its function space: distinct parameter settings can realize the same function, which motivates classifying the symmetries among function-equivalent parameters.
- It frames this through the “neuromanifold,” the quotient space obtained by identifying function-equivalent parameters, whose geometric properties can affect optimization dynamics.
- Prior symmetry classifications typically required the activation function to be analytic, which excluded the important case of ReLU.
- By leveraging ReLU’s non-differentiability, the authors provide a complete symmetry classification for shallow ReLU networks; two familiar instances of such symmetries are checked numerically in the sketch below.
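
For intuition, two well-known parameter symmetries of shallow ReLU networks are easy to verify numerically: permuting hidden units, and positively rescaling a unit's incoming weights and bias while dividing its outgoing weight by the same factor (valid because ReLU is positively homogeneous: relu(αz) = α·relu(z) for α > 0). The sketch below is an illustration in generic notation, not the paper's construction; all names and shapes are assumptions.

```python
# Minimal sketch (illustrative, not the paper's notation) of two parameter
# symmetries of a shallow ReLU network f(x) = sum_i v_i * relu(w_i . x + b_i).
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def shallow_relu(x, W, b, v):
    # x: (d,), W: (h, d), b: (h,), v: (h,) -> scalar output
    return v @ relu(W @ x + b)

d, h = 3, 5
W = rng.normal(size=(h, d))
b = rng.normal(size=h)
v = rng.normal(size=h)
x = rng.normal(size=d)

out = shallow_relu(x, W, b, v)

# Symmetry 1: permuting hidden units only reorders the terms of the sum.
perm = rng.permutation(h)
out_perm = shallow_relu(x, W[perm], b[perm], v[perm])

# Symmetry 2: per-unit positive rescaling, since relu(a*z) = a*relu(z) for a > 0.
a = rng.uniform(0.5, 2.0, size=h)  # one positive scale per hidden unit
out_scale = shallow_relu(x, a[:, None] * W, a * b, v / a)

assert np.allclose(out, out_perm) and np.allclose(out, out_scale)
print("permutation and positive-scaling symmetries leave f(x) unchanged")
```

Note the restriction to positive scales: for α < 0, relu(αz) ≠ α·relu(z), which is one way ReLU's non-smoothness constrains the symmetry group.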