A Unified Framework for Entropy Search and Expected Improvement in Bayesian Optimization

Nuojin Cheng, Leonard Papenmeier, Stephen Becker, Luigi Nardi

International Conference on Machine Learning 2025 · Oral

This talk introduces **Variational Entropy Search (VES)**, a unified framework that bridges the conceptual and practical gap between two of the most prominent acquisition functions in Bayesian Optimization (BO): **Expected Improvement (EI)** and **Max-Value Entropy Search (MES)**. Presented by Nuojin Cheng and Leonard Papenmeier, with co-authors Stephen Becker and Luigi Nardi, the work reinterprets EI not merely as an intuitive heuristic focused on immediate gains, but as a specific variational approximation of MES, an information-theoretic approach that aims to reduce uncertainty about the function's global maximum.
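For context, EI under a Gaussian process posterior has the well-known closed form $\mathrm{EI}(x) = (\mu(x) - f^*)\,\Phi(z) + \sigma(x)\,\varphi(z)$ with $z = (\mu(x) - f^*)/\sigma(x)$, where $f^*$ is the incumbent best value. The sketch below illustrates this standard formula only; the function name and toy values are illustrative and not taken from the talk:

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for a Gaussian posterior N(mu, sigma^2)
    at a candidate point, given the incumbent best value f_best
    (noiseless maximization setting)."""
    if sigma <= 0.0:
        return max(mu - f_best, 0.0)  # degenerate (zero-variance) posterior
    z = (mu - f_best) / sigma
    Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))    # standard normal CDF
    phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)  # standard normal PDF
    return (mu - f_best) * Phi + sigma * phi

# Toy illustration: posterior mean 1.0, std 1.0, incumbent 0.0
print(round(expected_improvement(1.0, 1.0, 0.0), 4))  # → 1.0833
```

The two terms trade off exploitation (the mean exceeding the incumbent, weighted by $\Phi$) and exploration (the posterior spread, weighted by $\varphi$); the VES perspective recovers exactly this quantity as one member of a family of variational bounds on the MES objective.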

AI review

VES is a competent theoretical contribution to Bayesian optimization that establishes a variational bridge between EI and MES, with the core result being that EI emerges as a degenerate case of a broader variational entropy search framework under an exponential distributional assumption. The theorem is stated precisely and the derivation appears sound within its noiseless setting. VES-Gamma, the practical variant, shows real empirical gains in short-length-scale regimes. However, the noiseless restriction substantially limits the result's generality, the variational inference machinery…