Inference of Online Newton Methods with Nesterov's Accelerated Sketching

arXiv stat.ML / April 28, 2026


Key Points

  • The paper targets streaming decision-making by improving uncertainty quantification in online optimization, where inference for first-order methods is expensive: maintaining the required covariance matrices costs O(d^2) time and memory per update.
  • It proposes an online Newton method that combines Hessian averaging with approximate Newton directions computed by a sketch-and-project solver under Nesterov's acceleration, matching the O(d^2) per-step complexity of first-order methods instead of the O(d^3) cost of solving Newton systems exactly (see the sketch after this list).
  • The authors analyze uncertainty arising from both random data and randomized computation, proving global almost-sure convergence and establishing asymptotic normality of the last iterate, with a limiting covariance characterized by a Lyapunov equation.
  • They further provide a fully online covariance estimator with non-asymptotic convergence guarantees, and connect the resulting uncertainty quantification to that of the exact Newton method and of sketched Newton methods without Nesterov's acceleration.
  • Experiments on regression models show that the proposed approach outperforms baselines for online inference.
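
As a rough illustration of the recipe these bullets describe (not the paper's exact algorithm), below is a minimal NumPy sketch on streaming linear regression: an online Newton loop with damped Hessian averaging, whose direction is computed by a sketch-and-project inner solver with a fixed Nesterov-style momentum `beta` standing in for the paper's tuned acceleration schedule. The step sizes, sketch dimension, and inner-iteration budget are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sketch_and_project(H, g, p0, sketch_dim=5, n_iters=50, beta=0.4):
    """Approximately solve H p = g (H symmetric positive definite) by
    sketch-and-project, with a fixed heuristic extrapolation `beta` in
    place of the paper's tuned Nesterov acceleration schedule."""
    d = H.shape[0]
    p, p_prev = p0.copy(), p0.copy()
    for _ in range(n_iters):
        y = p + beta * (p - p_prev)                 # momentum extrapolation
        S = rng.standard_normal((d, sketch_dim))    # Gaussian sketch
        r = S.T @ (H @ y - g)                       # sketched residual
        # project y (in the H-norm) onto {p : S^T H p = S^T g}
        step = S @ np.linalg.solve(S.T @ H @ S, r)
        p_prev, p = p, y - step
    return p

# Streaming linear regression: per-sample Hessian a_t a_t^T, online averaging.
d, T = 20, 2000
x_star = rng.standard_normal(d)
x = np.zeros(d)
H_bar = np.eye(d)                                   # averaged Hessian estimate
for t in range(1, T + 1):
    a = rng.standard_normal(d)
    b = a @ x_star + 0.1 * rng.standard_normal()
    grad = (a @ x - b) * a                          # per-sample gradient
    H_bar += (np.outer(a, a) - H_bar) / (t + 1)     # damped Hessian averaging
    p = sketch_and_project(H_bar, -grad, np.zeros(d))
    x += p / t**0.6                                 # decaying step size

print("estimation error:", np.linalg.norm(x - x_star))
```

Note the cost profile the paper exploits: each inner iteration touches H only through sketched products, so the per-step work stays at O(d^2) rather than the O(d^3) of a direct Newton solve.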

Abstract

Reliable decision-making with streaming data requires principled uncertainty quantification of online methods. While first-order methods enable efficient iterate updates, their inference procedures still require maintaining suitable (covariance) matrices, incurring O(d^2) time and memory complexity, and they are sensitive to ill-conditioning and noise heterogeneity. This costly inference task offers an opportunity for more robust second-order methods, which are, however, bottlenecked by solving Newton systems at O(d^3) complexity. In this paper, we address this gap by studying an online Newton method with Hessian averaging, where the Newton direction at each step is computed approximately by a sketch-and-project solver with Nesterov's acceleration, matching the O(d^2) complexity of first-order methods. For the proposed method, we quantify the uncertainty arising from both random data and randomized computation. Under standard smoothness and moment conditions, we establish global almost-sure convergence, prove asymptotic normality of the last iterate with a limiting covariance characterized by a Lyapunov equation, and develop a fully online covariance estimator with non-asymptotic convergence guarantees. We also connect the resulting uncertainty quantification to that of exact and sketched Newton methods without Nesterov's acceleration. Extensive experiments on regression models demonstrate the superiority of the proposed method for online inference.
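
The inferential payoff in the abstract is that the last iterate is asymptotically normal with a limiting covariance characterized by a Lyapunov equation. Here is a minimal plug-in sketch of how such a characterization turns into Wald-type confidence intervals, assuming hypothetical estimates `A_hat` (drift matrix) and `G_hat` (gradient-noise covariance) are available from the run; the paper's fully online estimator and the exact form of its Lyapunov equation differ.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

d, n = 5, 10_000
rng = np.random.default_rng(1)

# Hypothetical plug-in estimates of the method-dependent drift matrix and
# the gradient-noise covariance (stand-ins, not the paper's construction).
A_hat = np.diag(rng.uniform(1.0, 3.0, size=d))
G_hat = np.eye(d)

# Solve the Lyapunov equation A Sigma + Sigma A^T = G for Sigma.
Sigma = solve_continuous_lyapunov(A_hat, G_hat)

# Wald-type 95% intervals from the CLT sqrt(n) (x_n - x*) -> N(0, Sigma).
x_n = rng.standard_normal(d) / np.sqrt(n)        # stand-in for the last iterate
half_width = 1.96 * np.sqrt(np.diag(Sigma) / n)
for j in range(d):
    print(f"coordinate {j}: {x_n[j]:+.4f} +/- {half_width[j]:.4f}")
```

The practical point is that once the Lyapunov solution (or an online estimate of it) is available, per-coordinate intervals cost only a diagonal read of Sigma, so inference can keep pace with the streaming updates.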