Mitigating Premature Discretization with Progressive Quantization for Robust Vector Tokenization

arXiv cs.LG / 3/25/2026


Key Points

  • The paper identifies a key weakness in existing vector quantization (VQ) approaches for multimodal tokenization: “Premature Discretization,” where discrete quantization is applied before the encoder has learned the data manifold.
  • It introduces Progressive Quantization (ProVQ), treating quantization hardness as a training curriculum that gradually anneals from continuous latents to discrete tokens.
  • Experiments show ProVQ improves reconstruction and generative performance on ImageNet-1K and ImageNet-100, indicating benefits for image generative modeling.
  • The method also transfers to complex biological sequence modeling, setting a new state of the art for protein structure tokenization on the StrutTokenBench leaderboard.
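The summary does not specify ProVQ's exact annealing mechanics, but the core idea of treating quantization hardness as a curriculum can be sketched as follows. This is a hypothetical, minimal interpretation (function names, the blending rule, and the cosine schedule are all assumptions, not the paper's implementation): the encoder output is linearly blended with its nearest codebook entry, with the blend weight annealed from 0 (fully continuous) to 1 (fully discrete) over training.

```python
import numpy as np

def nearest_code(z, codebook):
    # Standard hard VQ step: assign each latent vector to its
    # nearest codebook entry under squared Euclidean distance.
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return codebook[dists.argmin(axis=1)]

def progressive_quantize(z, codebook, alpha):
    # Blend continuous latents with their quantized versions.
    # alpha = 0.0 -> fully continuous; alpha = 1.0 -> ordinary hard VQ.
    return (1.0 - alpha) * z + alpha * nearest_code(z, codebook)

def anneal_alpha(step, total_steps):
    # Hypothetical cosine schedule ramping quantization hardness 0 -> 1.
    return 0.5 * (1.0 - np.cos(np.pi * min(step / total_steps, 1.0)))
```

Under this reading, early training exposes the encoder to near-continuous latents so it can shape the data manifold first, and quantization only hardens once the manifold is established, which is what "resolving Premature Discretization" would mean in practice.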

Abstract

Vector Quantization (VQ) has become the cornerstone of tokenization for many multimodal Large Language Models and diffusion-based synthesis. However, existing VQ paradigms suffer from a fundamental conflict: they enforce discretization before the encoder has captured the underlying data manifold. We term this phenomenon Premature Discretization. To resolve this, we propose Progressive Quantization (ProVQ), which incorporates the dynamics of quantization hardness as a fundamental yet previously overlooked axis in VQ training. By treating quantization as a curriculum that smoothly anneals from a continuous latent space to a discrete one, ProVQ effectively guides the codebook toward a well-expanded manifold. Extensive experiments demonstrate the broad effectiveness of ProVQ across diverse modalities. We report improved reconstruction and generative performance on the ImageNet-1K and ImageNet-100 benchmarks, highlighting ProVQ's gains for generative modeling. Furthermore, ProVQ proves highly effective for modeling complex biological sequences, establishing a new performance ceiling for protein structure tokenization on the StrutTokenBench leaderboard.