Optimal Lower Bounds for Online Multicalibration

arXiv stat.ML / 4/27/2026


Key Points

  • The paper derives tight, information-theoretic lower bounds for online multicalibration and shows a clear separation from marginal calibration.
  • In the most general setting where group functions can depend on both the context and the learner’s predictions, it proves an Ω(T^(2/3)) lower bound on the expected multicalibration error using only three disjoint binary groups.
  • The established Ω(T^(2/3)) lower bound matches the upper bound of Noarov et al. (2025) for multicalibration up to logarithmic factors, while exceeding the O(T^(2/3−ε)) upper bound for marginal calibration due to Dagan et al. (2025), proving the two problems are quantitatively different.
  • For the more difficult setting where group functions may depend on the context but not on the learner’s predictions, the authors prove an Ω̃(T^(2/3)) lower bound via a group family of size O(log^3 T) constructed from an orthonormal basis, again matching upper bounds up to logarithmic factors.
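For reference, the Ω(T^(2/3)) rates above concern a multicalibration error of roughly the following form; the notation below is an illustrative sketch, not the paper's exact definition (normalization and conditioning conventions may differ):

```latex
% Hedged sketch of online multicalibration error over T rounds.
% \mathcal{G}: family of binary group functions g(x_t) \in \{0,1\};
% p_t: learner's prediction at round t; y_t: realized outcome.
\[
  \mathrm{MCalErr}(T) \;=\;
  \max_{g \in \mathcal{G}} \; \max_{v}
  \left| \sum_{t \le T :\; p_t = v} g(x_t)\,\bigl(y_t - p_t\bigr) \right|
\]
```

Marginal calibration is the special case with a single group g ≡ 1, which is why an Ω(T^(2/3)) lower bound for multicalibration that exceeds the O(T^(2/3−ε)) marginal-calibration upper bound yields a genuine separation.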

Abstract

We prove tight lower bounds for online multicalibration, establishing an information-theoretic separation from marginal calibration. In the general setting where group functions can depend on both context and the learner's predictions, we prove an \Omega(T^{2/3}) lower bound on expected multicalibration error using just three disjoint binary groups. This matches the upper bounds of Noarov et al. (2025) up to logarithmic factors and exceeds the O(T^{2/3-\varepsilon}) upper bound for marginal calibration (Dagan et al., 2025), thereby separating the two problems. We then turn to lower bounds for the more difficult case of group functions that may depend on context but not on the learner's predictions. In this case, we establish an \widetilde{\Omega}(T^{2/3}) lower bound for online multicalibration via an O(\log^3 T)-sized group family constructed from an orthonormal basis, again matching upper bounds up to logarithmic factors.