Black-box optimization of noisy functions with unknown smoothness

arXiv stat.ML / 5/5/2026


Key Points

  • The paper addresses black-box optimization of noisy functions of any dimension when the local smoothness around a global optimum is unknown.
  • It introduces an adaptive algorithm, POO (Parallel Optimistic Optimization), that operates without prior knowledge of the smoothness; a minimal sketch of the parallel scheme follows this list.
  • POO comes with a finite-time guarantee showing it performs almost as well as the best known algorithms that are given the smoothness.
  • Concretely, after n noisy evaluations, POO's error is at most a factor of √(ln n) larger than that of the best known smoothness-aware methods.
  • The method applies to a broader class of functions than prior work, including functions that are difficult to optimize in a precisely defined sense.
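
To make the parallel scheme concrete, here is a minimal sketch of the idea: run several copies of a smoothness-parameterized base optimizer, each with a different guess of the smoothness parameter rho, split the evaluation budget among them, and recommend the answer of the empirically best copy. The ToyBaseLearner class, the geometric grid of rho values, and the round-robin budget split below are illustrative assumptions; the paper's actual base algorithm is a hierarchical optimistic optimizer, not this toy local search.

```python
import math
import random

# Hypothetical stand-in for the paper's base optimizer (an HOO-style
# hierarchical search): a local random search whose radius shrinks
# geometrically at rate rho, the guessed smoothness parameter.
class ToyBaseLearner:
    def __init__(self, rho, domain=(0.0, 1.0)):
        self.rho = rho
        self.lo, self.hi = domain
        self.best_x = (self.lo + self.hi) / 2.0
        self.best_reward = -math.inf
        self.pulls = 0
        self.total_reward = 0.0

    def propose(self):
        # Search radius after t pulls: (hi - lo) * rho**t.
        radius = (self.hi - self.lo) * self.rho ** self.pulls
        x = self.best_x + random.uniform(-radius, radius)
        return min(max(x, self.lo), self.hi)

    def update(self, x, reward):
        self.pulls += 1
        self.total_reward += reward
        if reward > self.best_reward:
            self.best_reward, self.best_x = reward, x


def poo(noisy_f, n, num_learners=8):
    """Parallel scheme: several base learners, each with its own
    smoothness guess rho, share the budget of n noisy evaluations."""
    # Geometric grid of smoothness guesses spanning (0, 1) -- an
    # illustrative choice, not the paper's exact grid.
    rhos = [0.5 ** (2.0 ** i / num_learners) for i in range(num_learners)]
    learners = [ToyBaseLearner(rho) for rho in rhos]
    for t in range(n):
        learner = learners[t % num_learners]  # round-robin budget split
        x = learner.propose()
        learner.update(x, noisy_f(x))
    # Recommend the point of the copy with the best average reward.
    best = max(learners, key=lambda l: l.total_reward / max(l.pulls, 1))
    return best.best_x


# Example: a peak at x = 0.3, observed through Gaussian noise.
noisy_f = lambda x: -abs(x - 0.3) ** 0.5 + random.gauss(0.0, 0.1)
print(poo(noisy_f, n=2000))
```

The key design point the sketch illustrates is that no single smoothness guess needs to be right: as long as one of the rho values on the grid is close enough, the copy running with it does nearly as well as an algorithm told the true smoothness, at the cost of splitting the budget.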

Abstract

We study the problem of black-box optimization of a function f of any dimension, given function evaluations perturbed by noise. The function is assumed to be locally smooth around one of its global optima, but this smoothness is unknown. Our contribution is an adaptive optimization algorithm, POO or parallel optimistic optimization, that is able to deal with this setting. POO performs almost as well as the best known algorithms requiring the knowledge of the smoothness. Furthermore, POO works for a larger class of functions than what was previously considered, especially for functions that are difficult to optimize, in a very precise sense. We provide a finite-time analysis of POO's performance, which shows that its error after n evaluations is at most a factor of √(ln n) away from the error of the best known optimization algorithms using the knowledge of the smoothness.
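
Stated as a formula (our notation, not the paper's), the finite-time guarantee in the abstract reads:

$$
r_n(\mathrm{POO}) \;\le\; C \,\sqrt{\ln n}\; \cdot\; r_n^{\star},
$$

where $r_n(\mathrm{POO})$ is POO's optimization error after $n$ noisy evaluations, $r_n^{\star}$ is the error of the best known algorithm that is given the smoothness, and $C$ is a constant.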