On-Average Stability of Multipass Preconditioned SGD and Effective Dimension

arXiv cs.LG / 3/13/2026

Key Points

  • The paper studies how the geometry of population risk curvature, gradient noise, and preconditioning jointly affect generalization in multipass PSGD.
  • It shows that when these geometries do not align, an aggressive preconditioning choice can improve one aspect while amplifying instability along the other, leading to suboptimal statistical behavior.
  • It introduces a new on-average stability analysis for multipass SGD that accounts for correlations from data reuse and connects generalization to an effective dimension.
  • It derives excess risk bounds that depend on the effective dimension and demonstrates that a poorly chosen preconditioner can harm both optimization and generalization.
  • It provides matching instance-dependent lower bounds to complement the upper bounds, highlighting a tight characterization of the trade-offs.
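To make the object of study concrete, here is a minimal sketch of multipass preconditioned SGD on a toy least-squares problem. The function name, step size, and identity preconditioner are illustrative choices, not from the paper; the point is only the update form w ← w − lr · P · ∇ℓᵢ(w) and the data reuse across passes.

```python
import numpy as np

def psgd_multipass(X, y, P, lr=0.1, passes=5, seed=0):
    """Multipass preconditioned SGD on least squares:
    w <- w - lr * P @ grad_i(w), cycling over the same
    data for several passes (the data reuse the paper's
    stability analysis has to handle)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(passes):            # each pass reuses every sample
        for i in rng.permutation(n):   # shuffle within each pass
            g = (X[i] @ w - y[i]) * X[i]  # per-sample squared-loss gradient
            w -= lr * P @ g
    return w

# Toy noiseless linear model.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
w_star = np.array([1.0, -2.0, 0.5])
y = X @ w_star

w_id = psgd_multipass(X, y, P=np.eye(3))  # identity P recovers plain SGD
```

Swapping in a non-identity positive-definite `P` reshapes the update geometry, which is exactly the degree of freedom whose interaction with curvature and noise the paper analyzes.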

Abstract

We study how trade-offs between the population risk curvature, the geometry of the gradient noise, and preconditioning affect the generalisation ability of multipass Preconditioned Stochastic Gradient Descent (PSGD). Many practical optimisation heuristics implicitly navigate this trade-off in different ways -- for instance, some aim to whiten gradient noise, while others aim to align updates with the expected loss curvature. When the geometry of the population risk curvature and the geometry of the gradient noise do not match, an aggressive choice that improves one aspect can amplify instability along the other, leading to suboptimal statistical behaviour. In this paper we employ on-average algorithmic stability to connect the generalisation of PSGD to an effective dimension that depends on these sources of curvature. While existing techniques for on-average stability of SGD are limited to a single pass, as a first contribution we develop a new on-average stability analysis for multipass SGD that handles the correlations induced by data reuse. This allows us to derive excess risk bounds that depend on the effective dimension. In particular, we show that an improperly chosen preconditioner can yield suboptimal effective-dimension dependence in both optimisation and generalisation. Finally, we complement our upper bounds with matching, instance-dependent lower bounds.
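The abstract does not spell out its effective dimension, so as a hedged illustration here is the standard ridge-style quantity d_eff(λ) = tr(H(H + λI)⁻¹) for a curvature matrix H; the paper's precise definition, and how the preconditioner enters it, may differ.

```python
import numpy as np

def effective_dimension(H, lam):
    """Ridge-style effective dimension tr(H @ inv(H + lam*I)):
    counts how many eigendirections of the curvature matrix H
    are large relative to the scale lam."""
    eigs = np.linalg.eigvalsh(H)          # H assumed symmetric PSD
    return float(np.sum(eigs / (eigs + lam)))

# Spectrum with 2 dominant directions out of 10: d_eff is close to 2,
# far below the ambient dimension 10.
H = np.diag([10.0, 5.0] + [0.01] * 8)
d_eff = effective_dimension(H, lam=1.0)
```

Under a preconditioner P, the relevant curvature becomes a rescaled matrix such as P^{1/2} H P^{1/2}; a P that is poorly matched to H (or to the noise geometry) can inflate this count, which is one way to read the paper's claim that a bad preconditioner hurts both optimisation and generalisation.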