We Still Don't Understand High-Dimensional Bayesian Optimization

arXiv stat.ML / 4/10/2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The paper argues that common high-dimensional Bayesian optimization (BO) approaches—designed around structural assumptions like locality, sparsity, and smoothness—can be outperformed by Bayesian linear regression.
  • It introduces a geometric transformation that prevents undesirable boundary-seeking behavior, enabling Gaussian processes with linear kernels to match state-of-the-art performance across search spaces ranging from 60 to 6,000 dimensions.
  • The authors highlight practical benefits of the linear-kernel/linear-regression approach, including closed-form sampling and computation that scales linearly with the number of observations.
  • Experiments in molecular optimization show that the method remains effective with very large datasets (over 20,000 observations), reinforcing its scalability.
  • Overall, the findings suggest researchers should reconsider prevailing intuitions and design principles for BO in high-dimensional regimes.

Abstract

Existing high-dimensional Bayesian optimization (BO) methods aim to overcome the curse of dimensionality by carefully encoding structural assumptions, from locality to sparsity to smoothness, into the optimization procedure. Surprisingly, we demonstrate that these approaches are outperformed by arguably the simplest method imaginable: Bayesian linear regression. After applying a geometric transformation to avoid boundary-seeking behavior, Gaussian processes with linear kernels match state-of-the-art performance on tasks with 60- to 6,000-dimensional search spaces. Linear models offer numerous advantages over their non-parametric counterparts: they afford closed-form sampling and their computation scales linearly with data, a fact we exploit on molecular optimization tasks with >20,000 observations. Coupled with empirical analyses, our results suggest the need to depart from past intuitions about BO methods in high dimensions.
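To make the abstract's two practical claims concrete — closed-form posterior sampling and computation that scales linearly in the number of observations — here is a minimal, illustrative sketch of a Bayesian-linear-regression surrogate used with Thompson sampling over a candidate set. This is not the paper's exact method (in particular, the geometric transformation it applies to the search space is omitted, and the prior/noise parameters `alpha` and `sigma2` are assumed for illustration).

```python
import numpy as np

def blr_posterior(X, y, alpha=1.0, sigma2=0.01):
    """Posterior over weights for Bayesian linear regression.

    Prior: w ~ N(0, alpha^{-1} I); likelihood: y ~ N(Xw, sigma2 I).
    Forming X^T X costs O(n d^2): linear in the number of
    observations n, unlike the O(n^3) of a generic GP.
    """
    d = X.shape[1]
    precision = X.T @ X / sigma2 + alpha * np.eye(d)  # posterior precision
    cov = np.linalg.inv(precision)                    # posterior covariance
    mean = cov @ (X.T @ y) / sigma2                   # posterior mean
    return mean, cov

def thompson_sample_candidate(mean, cov, candidates, rng):
    """Closed-form sampling: draw one weight vector from the Gaussian
    posterior and pick the candidate that maximizes its score."""
    w = rng.multivariate_normal(mean, cov)
    return candidates[np.argmax(candidates @ w)]

# Toy usage on a 5-dimensional problem with a known linear objective.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(50)

mean, cov = blr_posterior(X, y)
cands = rng.standard_normal((200, 5))
best = thompson_sample_candidate(mean, cov, cands, rng)
```

Because the posterior is an explicit Gaussian over the weights, drawing a sample and scoring candidates is a single matrix multiply — no approximate acquisition optimization is needed for this sketch.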