The Optimal Sample Complexity of Multiclass and List Learning

arXiv stat.ML / 4/28/2026


Key Points

  • The paper addresses a long-standing gap in understanding multiclass sample complexity, identifying the DS dimension as the correct complexity parameter (analogous to VC dimension in binary classification).
  • It extends prior algebraic work by Hanneke et al. (2026) to show that the maximum hypergraph density of multiclass hypothesis classes is upper-bounded by their DS dimension.
  • This result settles a longstanding conjecture by Daniely and Shalev-Shwartz (2014) related to bounds on multiclass learning difficulty.
  • As a consequence, the authors derive the optimal dependence of sample complexity on the DS dimension for both multiclass classification and list learning.
  • Overall, the work closes the gap of roughly \sqrt{\text{DS}} that previously separated the upper and lower bounds on sample complexity.
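
The gap described above can be sketched in asymptotic notation. Note that the exact bound forms are not given in this summary; writing $d$ for the DS dimension, $\varepsilon$ for the target error, and $m(\varepsilon)$ for the sample complexity is a standard convention in this literature, and the expressions below are a hedged illustration of a $\sqrt{d}$-sized gap being closed, not a statement of the paper's precise theorems:

```latex
% Illustrative only: prior bounds differing by a factor of sqrt(d),
% and the matching bound obtained once the gap is closed.
\Omega\!\left(\frac{d}{\varepsilon}\right)
\;\le\; m(\varepsilon) \;\le\;
\tilde{O}\!\left(\frac{d^{3/2}}{\varepsilon}\right)
\qquad\Longrightarrow\qquad
m(\varepsilon) \,=\, \tilde{\Theta}\!\left(\frac{d}{\varepsilon}\right)
```

Here $\tilde{O}$ and $\tilde{\Theta}$ hide logarithmic factors, mirroring how the VC dimension plays this role in binary classification.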

Abstract

While the optimal sample complexity of binary classification in terms of the VC dimension is well-established, determining the optimal sample complexity of multiclass classification has remained open. The appropriate complexity parameter for multiclass classification is the DS dimension, and despite significant efforts, a gap of \sqrt{\text{DS}} has persisted between the upper and lower bounds on sample complexity. Recent work by Hanneke et al. (2026) gives a novel algebraic characterization of multiclass hypothesis classes in terms of their DS dimension. Building on this, we show that the maximum hypergraph density of any multiclass hypothesis class is upper-bounded by its DS dimension. This proves a longstanding conjecture of Daniely and Shalev-Shwartz (2014). As a consequence, we determine the optimal dependence of the sample complexity on the DS dimension for multiclass as well as list learning.