Black-Box Optimization From Small Offline Datasets via Meta Learning with Synthetic Tasks
arXiv cs.LG / 4/15/2026
Key Points
- The paper addresses offline black-box optimization when datasets are small or low-quality, a common situation in scientific design tasks such as molecule and materials discovery.
- It identifies a core limitation of prior methods: the surrogate model must capture the optimization bias (i.e., rank candidates correctly), which is hard to learn from limited data.
- The proposed OptBias method uses meta-learning: it generates synthetic optimization tasks from a Gaussian process prior and uses them to learn a reusable optimization-bias representation.
- OptBias then fine-tunes the surrogate model on the limited real dataset of the target task, improving performance on both continuous and discrete benchmarks (see the sketch after this list).
- Experimental results show OptBias consistently outperforms state-of-the-art offline optimization baselines specifically in small-data regimes, suggesting practical robustness for realistic offline settings.
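The two-stage pipeline the key points describe can be illustrated end to end. The following is a minimal sketch, not the authors' implementation: the MLP surrogate, the pairwise ranking loss (one plausible way to instantiate "learning to rank candidates"), the RBF kernel and its lengthscale, and all training hyperparameters are illustrative assumptions, and the "real" dataset at the end is a synthetic stand-in.

```python
# Minimal sketch of a GP-synthetic-task meta-pretraining + fine-tuning pipeline.
# NOT the paper's code: architecture, loss, kernel, and hyperparameters are all
# illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

def sample_gp_task(n_points=64, dim=4, lengthscale=0.5, rng=None):
    """One synthetic task: a function drawn from a GP prior (RBF kernel),
    evaluated at random inputs."""
    if rng is None:
        rng = np.random.default_rng()
    X = rng.uniform(-1.0, 1.0, size=(n_points, dim))
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq_dists / (2 * lengthscale**2)) + 1e-6 * np.eye(n_points)
    y = rng.multivariate_normal(np.zeros(n_points), K)
    return (torch.tensor(X, dtype=torch.float32),
            torch.tensor(y, dtype=torch.float32))

class Surrogate(nn.Module):
    """Plain MLP surrogate that scores candidate designs."""
    def __init__(self, dim=4, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))
    def forward(self, x):
        return self.net(x).squeeze(-1)

def pairwise_rank_loss(pred, y):
    """Penalize candidate pairs the surrogate orders differently from the
    true values, so it learns to rank rather than fit exact values."""
    dp = pred[:, None] - pred[None, :]        # predicted score gaps
    dy = torch.sign(y[:, None] - y[None, :])  # true ordering (+1 / -1 / 0)
    return F.softplus(-dy * dp).mean()

rng = np.random.default_rng(0)
model = Surrogate()

# Stage 1: meta-pretrain the ranking behavior on cheap synthetic GP tasks.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    X, y = sample_gp_task(rng=rng)
    loss = pairwise_rank_loss(model(X), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: fine-tune on the small offline dataset of the target task
# (here a 32-point synthetic stand-in), with a smaller learning rate.
X_real, y_real = sample_gp_task(n_points=32, rng=rng)
ft = torch.optim.Adam(model.parameters(), lr=1e-4)
for _ in range(200):
    loss = pairwise_rank_loss(model(X_real), y_real)
    ft.zero_grad(); loss.backward(); ft.step()
```

The design point the key points imply is that the ranking objective is what transfers: synthetic GP tasks can be sampled in unlimited quantity, so the surrogate can learn how to order candidates before it ever touches the scarce real data.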