Gaussian Approximation and Multiplier Bootstrap for Stochastic Gradient Descent

arXiv stat.ML / 5/5/2026

💬 Opinion · Models & Research

Key Points

  • The paper proves the non-asymptotic validity of the multiplier bootstrap for constructing confidence sets from Stochastic Gradient Descent (SGD) iterates, under suitable regularity conditions (a toy sketch of the procedure follows this list).
  • The analysis avoids approximating the limiting covariance of the Polyak–Ruppert averaged SGD iterates, which is what makes explicit, non-asymptotic approximation rates possible.
  • It yields accuracy bounds in convex distance (i.e., uniformly over convex sets) of order up to 1/√n, which can be faster than the rate supported by the Polyak–Juditsky central limit theorem.
  • The work is presented as the first fully non-asymptotic error bound for bootstrap approximations in SGD, relying on Gaussian approximation theory for nonlinear statistics.
  • The analysis framework extends known results by connecting bootstrap-based inference in SGD to modern Gaussian approximation techniques for independent random variables.

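To make the procedure concrete, here is a minimal sketch of a multiplier bootstrap for averaged SGD on a toy linear-regression problem. Everything in it, the squared loss, the step-size schedule, the exponential multiplier weights, and the number of bootstrap trajectories B, is an illustrative assumption rather than the paper's exact construction; the paper's contribution is the non-asymptotic guarantee for this kind of scheme, not the code below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative, not the paper's setting): linear regression
# with squared loss, so averaged SGD targets theta_star.
d, n, B = 5, 20_000, 200
theta_star = rng.normal(size=d)

theta = np.zeros(d)          # main SGD iterate
boot = np.zeros((B, d))      # B multiplier-bootstrap iterates
theta_bar = np.zeros(d)      # Polyak-Ruppert average of theta
boot_bar = np.zeros((B, d))  # running averages of the bootstrap iterates

for t in range(1, n + 1):
    # Fresh data point (x, y) with y = <x, theta_star> + noise.
    x = rng.normal(size=d)
    y = x @ theta_star + rng.normal()
    # Decaying step size eta_t = c * t^{-gamma}, gamma in (1/2, 1);
    # the constants here are an arbitrary illustrative choice.
    eta = 0.1 * t ** -0.75

    # Main SGD step on the squared loss 0.5 * (x @ theta - y)^2.
    theta -= eta * (x @ theta - y) * x

    # Multiplier-bootstrap steps: the same data point, but each trajectory
    # rescales its gradient by an i.i.d. weight with mean 1 and variance 1
    # (exponential weights are one standard choice).
    w = rng.exponential(size=B)
    boot -= eta * (w * (boot @ x - y))[:, None] * x

    # Online Polyak-Ruppert averaging of both the main and bootstrap runs.
    theta_bar += (theta - theta_bar) / t
    boot_bar += (boot - boot_bar) / t

# 95% confidence ball for theta_star centered at theta_bar: its radius is
# the empirical quantile of the bootstrap deviations around theta_bar.
radius = np.quantile(np.linalg.norm(boot_bar - theta_bar, axis=1), 0.95)
print("radius:", radius,
      "covered:", np.linalg.norm(theta_bar - theta_star) <= radius)
```

The key design point is that the bootstrap trajectories reuse the same data stream as the main run, so no resampling or covariance estimation is needed; only the cheap random weights are drawn fresh at each step.
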
Abstract

In this paper, we establish the non-asymptotic validity of the multiplier bootstrap procedure for constructing confidence sets using the Stochastic Gradient Descent (SGD) algorithm. Under appropriate regularity conditions, our approach avoids the need to approximate the limiting covariance of Polyak–Ruppert SGD iterates, which allows us to derive approximation rates in convex distance of order up to 1/√n. Notably, this rate can be faster than the one that can be proven via the Polyak–Juditsky central limit theorem. To our knowledge, this provides the first fully non-asymptotic bound on the accuracy of bootstrap approximations in SGD algorithms. Our analysis builds on Gaussian approximation results for nonlinear statistics of independent random variables.
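
Fixing notation only, and hedging on the exact constants, conditions, and centering used in the paper, a bound "in convex distance of order up to 1/√n" has the schematic form

```latex
\sup_{A \in \mathcal{C}} \left|
  \mathbb{P}\!\left( \sqrt{n}\,(\bar{\theta}_n - \theta^{\star}) \in A \right)
  - \mathbb{P}^{\,b}\!\left( \sqrt{n}\,(\bar{\theta}_n^{\,b} - \bar{\theta}_n) \in A \,\middle|\, \text{data} \right)
\right| \le \frac{C}{\sqrt{n}},
```

where 𝒞 is the class of convex subsets of ℝ^d, θ̄_n is the Polyak–Ruppert average, θ̄_n^b is its multiplier-bootstrap counterpart, and ℙ^b is the conditional probability over the bootstrap weights given the data. Because the bound is non-asymptotic, it controls the bootstrap's coverage error at every finite n, rather than only in the limit as in the Polyak–Juditsky central limit theorem.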