AI Navigate

GGMPs: Generalized Gaussian Mixture Processes

arXiv cs.LG / 3/12/2026

📰 News · Models & Research

Key Points

  • The paper introduces the Generalized Gaussian Mixture Process (GGMP), a Gaussian process-based method for multimodal conditional density estimation where outputs can be complex distributions rather than single scalars.
  • GGMP combines local Gaussian mixture fitting, cross-input component alignment, and per-component heteroscedastic GP training to produce a closed-form Gaussian mixture predictive density (see the sketch after this list).
  • The approach is designed to be tractable, compatible with standard GP solvers, and scalable, avoiding the exponential latent-assignment complexity of naive multimodal GP formulations.
  • Empirically, GGMPs improve distributional approximation on both synthetic and real-world datasets exhibiting pronounced non-Gaussianity and multimodality.
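
The paper's full pipeline is not reproduced here, but the three steps named above can be made concrete. Below is a minimal sketch, assuming each training input comes with a batch of output samples; the function name fit_ggmp_sketch, the sort-by-mean alignment heuristic, the log-std GPs for heteroscedasticity, and the constant mixture weights are all illustrative stand-ins, not the paper's actual construction (in particular, the paper's cross-input alignment step is presumably more sophisticated than sorting components by mean).

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_ggmp_sketch(X_inputs, Y_samples, K=2):
    """Illustrative GGMP-style fit: per-input GMMs, a naive alignment
    pass, and per-component GPs over the mixture parameters.

    X_inputs:  (n,) array of scalar input locations.
    Y_samples: length-n list of 1-D arrays, the output samples
               observed at each input.
    """
    weights, means, stds = [], [], []
    for y in Y_samples:
        # Step 1: local Gaussian mixture fit at this input.
        gmm = GaussianMixture(n_components=K).fit(y.reshape(-1, 1))
        # Step 2: cross-input alignment. Sorting components by mean is
        # a crude placeholder for the paper's alignment step.
        order = np.argsort(gmm.means_.ravel())
        weights.append(gmm.weights_[order])
        means.append(gmm.means_.ravel()[order])
        stds.append(np.sqrt(gmm.covariances_.ravel()[order]))
    weights, means, stds = (np.asarray(a) for a in (weights, means, stds))

    # Step 3: per-component GP training. A GP on the log-std is a
    # simple stand-in for heteroscedastic noise modelling.
    X = np.asarray(X_inputs).reshape(-1, 1)
    kernel = RBF() + WhiteKernel()
    mean_gps = [GaussianProcessRegressor(kernel=kernel).fit(X, means[:, k])
                for k in range(K)]
    std_gps = [GaussianProcessRegressor(kernel=kernel).fit(X, np.log(stds[:, k]))
               for k in range(K)]
    w_hat = weights.mean(axis=0)  # constant mixture weights, for simplicity
    return mean_gps, std_gps, w_hat
```

The division of labor mirrors the bullet above: the GMM step captures local multimodality, the alignment step gives each component a consistent identity across inputs, and the per-component GPs let the mixture parameters vary smoothly with the input.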

Abstract

Conditional density estimation is complicated by multimodality, heteroscedasticity, and strong non-Gaussianity. Gaussian processes (GPs) provide a principled nonparametric framework with calibrated uncertainty, but standard GP regression is limited by its unimodal Gaussian predictive form. We introduce the Generalized Gaussian Mixture Process (GGMP), a GP-based method for multimodal conditional density estimation in settings where each input may be associated with a complex output distribution rather than a single scalar response. GGMP combines local Gaussian mixture fitting, cross-input component alignment, and per-component heteroscedastic GP training to produce a closed-form Gaussian mixture predictive density. The method is tractable, compatible with standard GP solvers and scalable approximations, and avoids the exponentially large latent-assignment structure of naive multimodal GP formulations. Empirically, GGMPs improve distributional approximation on synthetic and real-world datasets with pronounced non-Gaussianity and multimodality.
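
Given per-component GPs, the closed-form Gaussian mixture predictive density mentioned in the abstract can be assembled by summing component Gaussians at a test input. A hedged sketch, continuing the hypothetical fit_ggmp_sketch helpers above (predict_density, y_grid, and the variance-addition step are illustrative assumptions, not the paper's exact formula):

```python
import numpy as np
from scipy.stats import norm

def predict_density(mean_gps, std_gps, w_hat, x_star, y_grid):
    """Evaluate the Gaussian mixture predictive density p(y | x*)
    on a grid of candidate outputs y_grid."""
    x = np.array([[x_star]])
    density = np.zeros_like(y_grid, dtype=float)
    for k, (m_gp, s_gp) in enumerate(zip(mean_gps, std_gps)):
        mu, mu_sd = m_gp.predict(x, return_std=True)  # GP posterior on the mean
        comp_sd = np.exp(s_gp.predict(x))             # heteroscedastic component scale
        total_sd = np.sqrt(mu_sd**2 + comp_sd**2)     # marginalize the GP mean
        density += w_hat[k] * norm.pdf(y_grid, loc=mu[0], scale=total_sd[0])
    return density
```

For example, predict_density(mean_gps, std_gps, w_hat, 0.5, np.linspace(-3, 3, 400)) returns the mixture density on a grid. Marginalizing the Gaussian GP posterior over each component's mean simply inflates that component's variance (the total_sd line), which is why the predictive density remains a Gaussian mixture in closed form.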