Information-Theoretic Generalization Bounds for Stochastic Gradient Descent with Predictable Virtual Noise

arXiv cs.LG / 5/4/2026

💬 Opinion · Models & Research

Key Points

  • The paper derives information-theoretic generalization bounds for stochastic gradient descent (SGD) by relating the expected generalization error to the mutual information between the learned parameters and the training data (see the baseline bound sketched after this list).
  • It improves prior “virtual noise” proof techniques by introducing predictable, history-adaptive virtual perturbations whose covariance can depend on past SGD history while remaining independent of current/future randomness.
  • The new bounds rest on a conditional relative-entropy (conditional Gaussian) argument, replacing the fixed sensitivity and gradient-deviation terms with conditional adaptive counterparts and adding an output-sensitivity penalty from the accumulated perturbation covariance.
  • When adaptive covariance is data-dependent, the authors decouple local Gaussian smoothing from a global comparison to a reference kernel, adding a KL-based “covariance-comparison” cost for using an admissible but different reference geometry.
  • Under certain admissible covariance synchronization rules, the framework recovers existing fixed-noise-style bounds, while also extending virtual perturbation analysis to SGD settings with path-dependent geometries without changing the SGD algorithm.
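For context, the classical mutual-information bound that this line of work refines (in the style of Xu and Raginsky) states that, for a loss that is σ-sub-Gaussian under the data distribution, a learning algorithm with output W trained on a sample S of n points satisfies

$$ \bigl|\mathbb{E}[\mathrm{gen}(S, W)]\bigr| \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W; S)}, $$

where I(W; S) is the mutual information between the learned parameters and the training sample. Virtual-noise arguments make this quantity tractable by smoothing W with Gaussian noise in the analysis only, leaving the actual optimization untouched.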

Abstract

Information-theoretic generalization bounds analyze stochastic optimization by relating expected generalization error to the mutual information between learned parameters and training data. Virtual perturbation analyses of SGD add auxiliary Gaussian noise only in the proof, making mutual information tractable while leaving the actual SGD trajectory unchanged. Existing bounds, however, typically require perturbation covariances to be fixed independently of the optimization history, limiting their ability to represent geometries induced by moving gradient statistics, preconditioners, curvature proxies, and other pathwise information. We introduce predictable history-adaptive virtual perturbations, where the perturbation covariance at each iteration may depend on the past real SGD history but not on current or future randomness. This predictability enables a conditional Gaussian relative-entropy argument and yields generalization bounds for SGD with adaptive virtual-noise geometry. The bounds replace fixed sensitivity and gradient-deviation terms with conditional adaptive counterparts, include an output-sensitivity penalty from accumulated perturbation covariance, and reduce the deviation term to a conditional variance only under conditional unbiasedness. Since adaptive covariances may be data-dependent, we separate local Gaussian smoothing from global reference-kernel comparison. The resulting bound includes a covariance-comparison cost measuring the KL price of using an admissible reference geometry different from the actual adaptive covariance. Fixed-noise-style bounds are recovered under admissible synchronization, such as deterministic, public, or prefix-observable covariance rules. The framework recovers fixed isotropic and geometry-aware bounds as special cases while extending virtual perturbation analysis to history-dependent SGD without modifying the algorithm.
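As a concrete illustration of the predictability constraint, the following minimal sketch (our own, not the paper's construction; the covariance rule, the 0.01 scale, and the momentum beta are hypothetical choices) runs plain SGD on least squares while maintaining an analysis-only diagonal virtual-noise covariance computed from a running gradient second moment, i.e., from the history strictly before the current step:

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, T, lr, beta = 5, 200, 100, 0.1, 0.9
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def grad(w, idx):
    """Minibatch least-squares gradient."""
    xb, yb = X[idx], y[idx]
    return xb.T @ (xb @ w - yb) / len(idx)

w = np.zeros(d)
second_moment = np.ones(d)   # running gradient second moment: a statistic of the past
virtual_covs = []            # analysis-only covariance schedule, never injected

for t in range(T):
    # Predictable covariance: computed from history strictly before step t,
    # so it is independent of the current minibatch draw. The diagonal,
    # preconditioner-like shape and the 0.01 scale are hypothetical choices.
    sigma_t = 0.01 / np.sqrt(second_moment + 1e-8)
    virtual_covs.append(np.diag(sigma_t))

    # Real SGD step: no virtual noise is added to the iterate.
    idx = rng.choice(n, size=10, replace=False)
    g = grad(w, idx)
    w -= lr * g

    # Update the history statistic only *after* the covariance was fixed,
    # preserving predictability.
    second_moment = beta * second_moment + (1 - beta) * g ** 2

# The accumulated covariance feeds the output-sensitivity penalty in the
# bound; the trajectory of w is exactly that of plain SGD.
print("final iterate norm:", np.linalg.norm(w))
print("accumulated virtual-covariance trace:", sum(np.trace(C) for C in virtual_covs))
```

The iterate path here is exactly that of unmodified SGD; only the recorded covariance schedule, which would feed the accumulated output-sensitivity penalty in the bound, depends on the trajectory.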
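The covariance-comparison cost described in the abstract plausibly reduces, per step, to the standard KL divergence between two Gaussians with matched means; writing Σ_t for the adaptive covariance and Σ̄ for the admissible reference covariance (notation ours, not the paper's), the familiar closed form in dimension d is

$$ \mathrm{KL}\!\left(\mathcal{N}(\mu, \Sigma_t)\,\middle\|\,\mathcal{N}(\mu, \bar{\Sigma})\right) = \frac{1}{2}\left[\operatorname{tr}\!\left(\bar{\Sigma}^{-1}\Sigma_t\right) - d + \ln\frac{\det \bar{\Sigma}}{\det \Sigma_t}\right]. $$

This is the price of analyzing the adaptive geometry through a fixed reference kernel; when Σ_t = Σ̄ the cost vanishes, consistent with the recovery of fixed-noise-style bounds under admissible synchronization.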