Nonmyopic Global Optimisation via Approximate Dynamic Programming
arXiv stat.ML / 3/30/2026
Key Points
- The paper addresses global optimization of expensive, gradient-free black-box functions using surrogate-based sequential query strategies, focusing on methods that remain efficient in high-dimensional settings.
- It notes that while Bayesian optimization with Gaussian-process surrogates is effective, GP-based computation becomes prohibitive as dimensionality grows, motivating lighter deterministic surrogates such as IDW and RBF.
- A key contribution is the development of nonmyopic (lookahead) acquisition functions for deterministic surrogate models by using approximate dynamic programming concepts like rollout and multi-step scenario optimization.
- The proposed approach selects a sequence of query points over a horizon by predicting surrogate-model evolution, explicitly improving the exploration–exploitation trade-off beyond greedy (myopic) criteria.
- Experiments on synthetic benchmarks, hyperparameter-tuning tasks, a constrained problem, and a data-driven predictive control application show faster and more robust convergence than conventional myopic strategies.
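The rollout idea in the key points above can be illustrated with a small sketch. This is not the paper's implementation; it is a minimal, hypothetical one-dimensional example assuming a Gaussian-RBF surrogate, an IDW-style exploration term (in the spirit of Bemporad's GLIS), and a one-step "fantasy" rollout in which the surrogate's own prediction stands in for the unknown function value at the candidate query:

```python
import math

def rbf_fit(X, y, eps=1.0):
    # Interpolate y over centers X with a Gaussian RBF kernel.
    # Small-n Gauss-Jordan elimination keeps the sketch dependency-free.
    n = len(X)
    A = [[math.exp(-(eps * (X[i] - X[j])) ** 2) for j in range(n)] + [y[i]]
         for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))  # partial pivoting
        A[c], A[p] = A[p], A[c]
        for r in range(n):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][n] / A[i][i] for i in range(n)]

def rbf_predict(X, w, x, eps=1.0):
    return sum(wi * math.exp(-(eps * (x - xi)) ** 2) for wi, xi in zip(w, X))

def idw_exploration(X, x):
    # IDW-style term: zero at sampled points, grows with distance from them,
    # rewarding exploration of unsampled regions.
    s = sum(1.0 / ((x - xi) ** 2 + 1e-12) for xi in X)
    return (2.0 / math.pi) * math.atan(1.0 / s)

def acquisition(X, w, x, beta=1.0, eps=1.0):
    # Greedy (myopic) criterion: surrogate prediction minus exploration bonus.
    return rbf_predict(X, w, x, eps) - beta * idw_exploration(X, x)

def myopic_next(X, y, grid, beta=1.0, eps=1.0):
    w = rbf_fit(X, y, eps)
    return min(grid, key=lambda x: acquisition(X, w, x, beta, eps))

def rollout_next(X, y, grid, beta=1.0, eps=1.0):
    # One-step rollout: score each candidate by its immediate acquisition plus
    # the best acquisition achievable AFTER a fantasy update, where the
    # surrogate's prediction at the candidate is treated as the observed value.
    w = rbf_fit(X, y, eps)
    best, best_score = None, float("inf")
    for x in grid:
        if any(abs(x - xi) < 1e-9 for xi in X):
            continue
        a0 = acquisition(X, w, x, beta, eps)
        X1, y1 = X + [x], y + [rbf_predict(X, w, x, eps)]   # fantasy sample
        w1 = rbf_fit(X1, y1, eps)
        a1 = min(acquisition(X1, w1, x2, beta, eps) for x2 in grid
                 if all(abs(x2 - xj) > 1e-9 for xj in X1))
        if a0 + a1 < best_score:
            best, best_score = x, a0 + a1
    return best

if __name__ == "__main__":
    f = lambda x: math.sin(3 * x) + 0.1 * x        # stand-in "expensive" black box
    X = [-2.0, -0.5, 1.0, 2.5]
    y = [f(xi) for xi in X]
    grid = [-2 + 5 * i / 50 for i in range(51)]
    print("myopic pick:", myopic_next(X, y, grid))
    print("rollout pick:", rollout_next(X, y, grid))
```

The rollout score differs from the myopic one because the fantasy update collapses the exploration bonus near the candidate, so the lookahead favors queries that leave good options for the next step, which is exactly the exploration-exploitation improvement the paper attributes to nonmyopic acquisition. A full rollout scheme would extend this to longer horizons and scenario optimization over multiple fantasized outcomes.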