Black-box optimization of noisy functions with unknown smoothness
arXiv stat.ML / 5/5/2026
Key Points
- The paper addresses black-box optimization of noisy, potentially high-dimensional functions when the local smoothness near a global optimum is unknown.
- It introduces an adaptive algorithm, POO (Parallel Optimistic Optimization), designed to operate effectively without prior smoothness knowledge.
- The results show POO achieves performance close to algorithms that assume smoothness is known, with a quantified finite-time guarantee.
- The authors provide a finite-time error bound: after n noisy evaluations, POO's simple regret is at most a factor of √(ln n) worse than that of the best known algorithms that require the smoothness to be known in advance.
- The method is claimed to apply to a broader class of functions than prior work, including certain "hard-to-optimize" cases that the theory characterizes precisely.
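The core idea behind POO can be sketched as follows: run several instances of an optimistic hierarchical optimizer (the paper uses HOO) in parallel, each with a different guess of the smoothness parameters, split the evaluation budget among them, and return the best candidate found. The sketch below is a simplified toy version, not the paper's algorithm: the HOO-like base learner works on a binary partition of [0, 1], the smoothness grid `rhos`, the noise level, and the recommendation rule (best observed noisy value) are all illustrative assumptions.

```python
import math
import random

class Node:
    """Cell [lo, hi] of a binary partition of [0, 1] at a given depth."""
    def __init__(self, lo, hi, depth):
        self.lo, self.hi, self.depth = lo, hi, depth
        self.count, self.total = 0, 0.0
        self.children = None  # split lazily into two halves

class HOO:
    """Toy HOO-like learner: descend the tree by optimistic B-values
    mean + confidence + nu * rho**depth, sample the leaf's midpoint."""
    def __init__(self, nu, rho):
        self.nu, self.rho = nu, rho
        self.root = Node(0.0, 1.0, 0)
        self.n = 0  # evaluations made by this instance

    def _b_value(self, node):
        if node.count == 0:
            return math.inf  # unvisited cells are maximally optimistic
        mean = node.total / node.count
        conf = math.sqrt(2.0 * math.log(max(self.n, 2)) / node.count)
        return mean + conf + self.nu * self.rho ** node.depth

    def step(self, f, rng, noise):
        path, node = [], self.root
        while node.children is not None:
            path.append(node)
            node = max(node.children, key=self._b_value)
        path.append(node)
        x = (node.lo + node.hi) / 2.0
        y = f(x) + rng.gauss(0.0, noise)  # noisy evaluation
        mid = (node.lo + node.hi) / 2.0
        node.children = (Node(node.lo, mid, node.depth + 1),
                         Node(mid, node.hi, node.depth + 1))
        self.n += 1
        for nd in path:  # back up the observed reward along the path
            nd.count += 1
            nd.total += y
        return x, y

def poo(f, budget, rhos=(0.25, 0.5, 0.75, 0.9), nu=1.0, noise=0.01, seed=0):
    """POO-style outer loop: one base learner per smoothness guess,
    round-robin budget split, return the best candidate seen."""
    rng = random.Random(seed)
    learners = [HOO(nu, rho) for rho in rhos]
    best_x, best_y = 0.5, -math.inf
    for t in range(budget):
        x, y = learners[t % len(learners)].step(f, rng, noise)
        if y > best_y:
            best_x, best_y = x, y
    return best_x
```

Note that the paper spaces the smoothness guesses geometrically and uses a more careful recommendation rule; the point of the sketch is only the "run many smoothness hypotheses in parallel, pay a small extra factor" structure behind the √(ln n) overhead.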