Slithering Through Gaps: Capturing Discrete Isolated Modes via Logistic Bridging

arXiv cs.LG / 4/14/2026


Key Points

  • The paper identifies a core sampling problem in high-dimensional discrete distributions: inherent discontinuities create disconnected or rugged energy landscapes that cause gradient-based samplers to get stuck in local modes.
  • It proposes HiSS (Hyperbolic Secant-squared Gibbs-Sampling), a new sampling algorithm family that uses a Metropolis-within-Gibbs scheme to improve mixing across distant, isolated modes.
  • HiSS employs a logistic convolution kernel to couple the discrete variable with a continuous auxiliary variable so the auxiliary can represent the target distribution while enabling smoother mode transitions.
  • The authors provide convergence guarantees and report empirical results showing HiSS outperforms many existing approaches across Ising models, binary neural networks, and combinatorial optimization tasks.

Abstract

High-dimensional and complex discrete distributions often exhibit multimodal behavior due to inherent discontinuities, posing significant challenges for sampling. Gradient-based discrete samplers, while effective, frequently become trapped in local modes when confronted with rugged or disconnected energy landscapes. This limits their ability to achieve adequate mixing and convergence in high-dimensional multimodal discrete spaces. To address these challenges, we propose Hyperbolic Secant-squared Gibbs-Sampling (HiSS), a novel family of sampling algorithms that integrates a Metropolis-within-Gibbs framework to enhance mixing efficiency. HiSS leverages a logistic convolution kernel to couple the discrete sampling variable with the continuous auxiliary variable in a joint distribution. This design allows the auxiliary variable to encapsulate the true target distribution while facilitating easy transitions between distant and disconnected modes. We provide theoretical guarantees of convergence and demonstrate empirically that HiSS outperforms many popular alternatives on a wide variety of tasks, including Ising models, binary neural networks, and combinatorial optimization.
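To make the abstract's mechanism concrete, here is a minimal sketch of a logistic-kernel Metropolis-within-Gibbs sampler on a toy 1-D Ising chain. This is an illustrative assumption based only on the summary above, not the paper's actual HiSS implementation: the function names, the toy target `ising_logp`, and all parameter values (`s`, `J`, chain length) are hypothetical. The logistic density is a scaled sech², which is where the method's "hyperbolic secant-squared" name plausibly comes from; the sampler alternates an exact Gibbs draw of a continuous auxiliary `y_i` around the discrete state with a Metropolis flip of the discrete coordinate scored under the smoothed joint.

```python
import math
import random

def logistic_pdf(u, s=1.0):
    # Logistic density: (1 / (4s)) * sech^2(u / (2s)) -- the
    # "hyperbolic secant-squared" shape referenced in the name.
    c = math.cosh(u / (2.0 * s))
    return 1.0 / (4.0 * s * c * c)

def ising_logp(x, J=0.4):
    # Unnormalized log-probability of a toy 1-D Ising chain
    # (a hypothetical stand-in for the paper's discrete targets).
    return J * sum(x[i] * x[i + 1] for i in range(len(x) - 1))

def smoothed_gibbs_sweep(x, s=1.0, J=0.4):
    """One Metropolis-within-Gibbs sweep over the joint
    p(x, y) ~ p(x) * prod_i k(y_i - x_i), with k the logistic kernel."""
    for i in range(len(x)):
        # Gibbs step: refresh the auxiliary y_i ~ x_i + Logistic(0, s)
        # via inverse-CDF sampling (this conditional is exact).
        u = random.random()
        y_i = x[i] + s * math.log(u / (1.0 - u))
        # Metropolis step: propose flipping the discrete coordinate x_i;
        # the smooth kernel couples the proposal to the auxiliary variable.
        x_prop = list(x)
        x_prop[i] = -x[i]
        log_ratio = (ising_logp(x_prop, J) - ising_logp(x, J)
                     + math.log(logistic_pdf(y_i - x_prop[i], s))
                     - math.log(logistic_pdf(y_i - x[i], s)))
        if math.log(random.random() + 1e-300) < log_ratio:
            x = x_prop
    return x

random.seed(0)
x = [random.choice([-1, 1]) for _ in range(8)]
for _ in range(200):
    x = smoothed_gibbs_sweep(x)
print(x)
```

Note the design choice this sketch illustrates: the discrete state is never moved by gradients directly; instead the continuous auxiliary, whose marginal is a logistic-smoothed version of the target, provides a landscape without hard discontinuities, which is what the paper credits for transitions between isolated modes.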