The Role of Symmetry in Optimizing Overparameterized Networks

arXiv cs.LG, April 29, 2026


Key Points

  • The paper studies how weight-space symmetries in neural networks change under overparameterization, shedding light on why overparameterized networks are easier to optimize.
  • It shows that the added symmetries act as a form of diagonal preconditioning on the Hessian, enabling better-conditioned minima within each equivalence class of functionally identical solutions.
  • The authors also prove that overparameterization increases the “probability mass” of global minima near typical initializations, making good solutions easier to reach.
  • Teacher–student experiments confirm the theory: increasing network width reduces the Hessian trace, improves condition numbers, and speeds up convergence.
  • Overall, the work frames overparameterization and width growth as a geometric transformation of the loss landscape that promotes optimization.
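The symmetry the key points refer to can be made concrete with a toy two-layer ReLU network. The sketch below (a generic illustration in numpy, not code from the paper) shows the classic rescaling symmetry: scaling a hidden unit's incoming weights by α and its outgoing weight by 1/α leaves the computed function unchanged, yet moves the parameters to a point with different weight norms, and hence different local curvature. Moving along such symmetry directions is what lets an equivalence class of identical functions contain both well- and badly-conditioned minima.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer ReLU network: f(x) = W2 @ relu(W1 @ x)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(1, 8))

def f(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0.0)

x = rng.normal(size=(4,))

# Rescaling symmetry: by positive homogeneity of ReLU, scaling hidden
# unit 0's incoming weights by alpha and its outgoing weight by
# 1/alpha leaves the network's input-output function unchanged.
alpha = 3.0
W1s = W1.copy(); W1s[0] *= alpha
W2s = W2.copy(); W2s[:, 0] /= alpha

same = np.allclose(f(W1, W2, x), f(W1s, W2s, x))

# The parameter-space geometry, however, is not preserved: the squared
# weight norm (a coarse proxy for curvature scale) differs between the
# two functionally identical points.
norm_before = np.sum(W1**2) + np.sum(W2**2)
norm_after = np.sum(W1s**2) + np.sum(W2s**2)

print(same)                        # function is unchanged
print(norm_before != norm_after)  # geometry is not
```

The norm difference is (α² − 1)‖W1[0]‖² + (1/α² − 1)‖W2[:, 0]‖², so for α ≠ 1 the two points generically sit at different places in the loss landscape's curvature profile even though they implement the same function.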

Abstract

Overparameterization is central to the success of deep learning, yet the mechanisms by which it improves optimization remain incompletely understood. We analyze weight-space symmetries in neural networks and show that overparameterization introduces additional symmetries that benefit optimization in two distinct ways. First, we prove that these symmetries act as a form of diagonal preconditioning on the Hessian, enabling the existence of better-conditioned minima within each equivalence class of functionally identical solutions. Second, we show that overparameterization increases the probability mass of global minima near typical initializations, making these favorable solutions more reachable. Teacher-student network experiments validate our theoretical predictions: as width increases, the Hessian trace decreases, condition numbers improve, and convergence accelerates. Our analysis provides a unified framework for understanding overparameterization and width growth as a geometric transformation of the loss landscape.
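The abstract's claim that overparameterization "introduces additional symmetries" can be illustrated by function-preserving widening. The sketch below (a Net2Net-style neuron split, used here only as an assumed illustration of the setting, not the paper's construction) duplicates a hidden unit and halves its outgoing weight, so the wider network computes exactly the same function while its weight space gains a whole continuum of equivalent points: any split of the outgoing weight into a·w and (1 − a)·w works.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(W1, W2, x):
    # Two-layer ReLU network: f(x) = W2 @ relu(W1 @ x)
    return W2 @ np.maximum(W1 @ x, 0.0)

W1 = rng.normal(size=(8, 4))   # 8 hidden units
W2 = rng.normal(size=(1, 8))
x = rng.normal(size=(4,))

# Function-preserving widening: duplicate hidden unit 0 and split its
# outgoing weight in half between the original and the copy.
W1_wide = np.vstack([W1, W1[0:1]])       # now 9 hidden units
W2_wide = np.hstack([W2, W2[:, 0:1]])    # copy the outgoing weight
W2_wide[:, 0] *= 0.5
W2_wide[:, 8] *= 0.5

same = np.allclose(f(W1, W2, x), f(W1_wide, W2_wide, x))
print(same)  # wider network computes the identical function
```

Every choice of split coefficient yields another parameter vector with the same loss, which is exactly the kind of enlarged symmetry orbit the paper argues gives optimization more room to find well-conditioned, reachable minima as width grows.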