GGMPs: Generalized Gaussian Mixture Processes
arXiv cs.LG · 3/12/2026
📰 News · Models & Research
Key Points
- The paper introduces the Generalized Gaussian Mixture Process (GGMP), a Gaussian-process-based method for conditional density estimation whose predictive output is a full, possibly multimodal distribution rather than a single scalar.
- GGMP combines local Gaussian mixture fitting, cross-input component alignment, and per-component heteroscedastic GP training to produce a closed-form Gaussian mixture predictive density (see the sketch after this list).
- The approach is designed to be tractable, compatible with standard GP solvers, and scalable, avoiding the exponential latent-assignment complexity of naive multimodal GP formulations.
- Empirically, GGMPs improve distributional approximation on both synthetic and real-world datasets exhibiting pronounced non-Gaussianity and multimodality.

