How to Approximate Inference with Subtractive Mixture Models

arXiv cs.LG / 4/21/2026


Key Points

  • Subtractive Mixture Models (SMMs) extend classical mixture models by allowing negative coefficients, offering potentially greater expressiveness for approximate inference.
  • The paper tackles a key obstacle: SMMs lack latent-variable semantics, which prevents directly reusing the sampling methods commonly used for classical mixture models in variational inference (VI) and importance sampling (IS).
  • It proposes multiple expectation estimators for IS and learning schemes for VI that enable practical use of SMMs despite the missing semantics.
  • The authors empirically evaluate these methods for distribution approximation and identify additional challenges in estimation stability and learning efficiency.
  • They also discuss strategies to mitigate those challenges and provide code via the referenced GitHub repository.
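To make the core idea concrete, here is a minimal sketch of how one might estimate an expectation under a subtractive mixture. The specific mixture (a wide Gaussian minus a narrow one, with weights chosen so the signed combination stays nonnegative) and the choice of proposal are illustrative assumptions, not the paper's actual estimators: since a negative coefficient cannot serve as a component-selection probability, ancestral sampling fails, but self-normalized importance sampling from a tractable positive proposal still works.

```python
import numpy as np

rng = np.random.default_rng(0)
SQRT_2PI = np.sqrt(2.0 * np.pi)

def gauss_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * SQRT_2PI)

# Hypothetical 1-D subtractive mixture: a wide Gaussian minus a narrow one.
# The signed weights are picked so the density stays nonnegative everywhere;
# it is left unnormalized (the weights sum to 1.5), which the self-normalized
# estimator below handles automatically.
weights = np.array([2.0, -0.5])   # note the negative coefficient
means = np.array([0.0, 0.0])
stds = np.array([1.0, 0.3])

def smm_density(x):
    comps = np.stack([gauss_pdf(x, m, s) for m, s in zip(means, stds)])
    return weights @ comps

# Negative weights break the usual "pick a component, then sample from it"
# recipe, so we instead draw from a positive proposal q (here: the wide
# component) and reweight by the density ratio.
def sn_is_expectation(g, n=200_000):
    x = rng.normal(0.0, 1.0, size=n)                 # draws from q = N(0, 1)
    w = smm_density(x) / gauss_pdf(x, 0.0, 1.0)      # importance weights
    return float((w * g(x)).sum() / w.sum())         # self-normalized estimate

est = sn_is_expectation(lambda x: x ** 2)  # approximates E[X^2] under the SMM
```

For this toy mixture the exact second moment is (2·1 − 0.5·0.09)/1.5 ≈ 1.303, so the estimate should land close to that value. The sketch also hints at the stability issue the paper raises: with more aggressive negative weights, the importance weights can vary wildly or change sign, inflating estimator variance.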

Abstract

Classical mixture models (MMs) are widely used tractable proposals for approximate inference settings such as variational inference (VI) and importance sampling (IS). Recently, mixture models with negative coefficients, called subtractive mixture models (SMMs), have been proposed as a potentially more expressive alternative. However, how to effectively use SMMs for VI and IS is still an open question, as they do not provide latent-variable semantics and therefore cannot reuse the sampling schemes designed for classical MMs. In this work, we study how to circumvent this issue by designing several expectation estimators for IS and learning schemes for VI with SMMs, and we empirically evaluate them for distribution approximation. Finally, we discuss the additional challenges in estimation stability and learning efficiency that they carry and propose ways to overcome them. Code is available at: https://github.com/april-tools/delta-vi.
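One hedged reading of why IS remains viable without latent-variable semantics (an interpretation of the setup, not a formula from the paper): for an unnormalized signed-mixture density that is nonnegative overall, expectations can still be written as a ratio of two ordinary expectations under any tractable positive proposal $q$,

$$
\mathbb{E}_{p}\!\left[g(X)\right]
= \frac{\mathbb{E}_{q}\!\left[\dfrac{\tilde p(X)}{q(X)}\, g(X)\right]}
       {\mathbb{E}_{q}\!\left[\dfrac{\tilde p(X)}{q(X)}\right]},
\qquad
\tilde p(x) = \sum_i w_i f_i(x), \quad w_i \in \mathbb{R},
$$

where each $f_i$ is a tractable component density. Both numerator and denominator are estimable by Monte Carlo from $q$, with no need to sample components, but the signed weights mean the ratio $\tilde p/q$ can take small or highly variable values, which is one plausible source of the estimation-stability issues the abstract mentions.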