Stochastic simultaneous optimistic optimization

arXiv stat.ML / 4/28/2026


Key Points

  • The paper studies how to globally maximize a noisy objective function when only a finite number of evaluations are allowed.
  • It assumes only a weak condition: the function is locally smooth, with respect to some semi-metric, around one of its global maxima.
  • The proposed algorithm, StoSOO, follows an optimistic strategy: it hierarchically partitions the domain and builds upper confidence bounds over the cells to decide which point to sample next.
  • A finite-time analysis shows that StoSOO performs almost as well as the best specifically tuned algorithms, even though it does not require knowledge of the semi-metric.
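The upper confidence bound attached to a cell (h, i) of the partition (depth h, index i) takes, up to the exact constants chosen in the paper, the familiar mean-plus-exploration form. Here \hat{\mu}_{h,i}(t) is the empirical mean of the T_{h,i}(t) noisy evaluations collected at the cell's representative point, n is the evaluation budget, k a per-cell sampling cap, and \delta a confidence parameter:

```latex
b_{h,i}(t) \;=\; \hat{\mu}_{h,i}(t) \;+\; \sqrt{\frac{\log(nk/\delta)}{2\,T_{h,i}(t)}}
```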

Abstract

We study the problem of global maximization of a function f given a finite number of evaluations perturbed by noise. We consider a very weak assumption on the function, namely that it is locally smooth (in some precise sense) with respect to some semi-metric, around one of its global maxima. Compared to previous works on bandits in general spaces (Kleinberg et al., 2008; Bubeck et al., 2011a) our algorithm does not require the knowledge of this semi-metric. Our algorithm, StoSOO, follows an optimistic strategy to iteratively construct upper confidence bounds over the hierarchical partitions of the function domain to decide which point to sample next. A finite-time analysis of StoSOO shows that it performs almost as well as the best specifically-tuned algorithms even though the local smoothness of the function is not known.
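As an illustration, here is a minimal sketch of the StoSOO loop in Python, restricted to the domain [0, 1] with ternary cell splitting. The parameter defaults follow the spirit of the paper's recommendations, but the exact constants, the recommendation rule (best empirical mean rather than deepest expanded node), and all names are simplifying assumptions, not the authors' implementation:

```python
import math

def stosoo(f, n, k=None, h_max=None, delta=None):
    """Simplified StoSOO sketch on [0, 1] with ternary partitioning.

    f: noisy objective; n: total evaluation budget.
    k, h_max, delta: defaults are illustrative choices in the spirit of
    the paper, not its exact constants.
    """
    if k is None:
        k = max(1, int(n / math.log(n) ** 3))   # per-cell sampling cap
    if h_max is None:
        h_max = int(math.sqrt(n / k))           # maximum tree depth
    if delta is None:
        delta = 1.0 / math.sqrt(n)              # confidence parameter

    # each cell (h, i) covers [i * 3**-h, (i + 1) * 3**-h]
    leaves = {(0, 0)}
    sums = {(0, 0): 0.0}    # sum of noisy evaluations per cell
    counts = {(0, 0): 0}    # number of evaluations per cell

    def center(h, i):
        return (i + 0.5) * 3.0 ** (-h)

    def b_value(node):
        # optimistic upper confidence bound; unsampled cells are maximal
        if counts[node] == 0:
            return float('inf')
        mean = sums[node] / counts[node]
        return mean + math.sqrt(math.log(n * k / delta) / (2 * counts[node]))

    t = 0
    while t < n:
        v_max = -float('inf')
        sampled_this_sweep = False
        for h in range(h_max + 1):              # sweep depths shallow to deep
            depth_leaves = [nd for nd in leaves if nd[0] == h]
            if not depth_leaves:
                continue
            best = max(depth_leaves, key=b_value)
            if b_value(best) < v_max:
                continue                        # a shallower cell dominates
            if counts[best] < k:
                # sample the optimistic cell's center once more
                sums[best] += f(center(*best))
                counts[best] += 1
                t += 1
                sampled_this_sweep = True
                if t >= n:
                    break
            elif h < h_max:
                # fully sampled: expand into three children
                v_max = b_value(best)
                h0, i0 = best
                leaves.remove(best)
                for j in range(3):
                    child = (h0 + 1, 3 * i0 + j)
                    leaves.add(child)
                    sums[child] = 0.0
                    counts[child] = 0
        if not sampled_this_sweep and t < n:
            break                               # nothing left within h_max
    # recommend the evaluated cell with the best empirical mean
    evaluated = [nd for nd in counts if counts[nd] > 0]
    best = max(evaluated, key=lambda nd: sums[nd] / counts[nd])
    return center(*best)
```

On a deterministic objective the noise averages are exact and the procedure behaves like plain SOO, which makes the sketch easy to sanity-check before adding noise.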