Gromov-Wasserstein Methods for Multi-View Relational Embedding and Clustering

arXiv cs.LG / April 28, 2026


Key Points

  • The paper addresses the challenge of learning shared low-dimensional representations from multi-view relational data when each view has different underlying geometry.
  • It introduces Bary-GWMDS, a Gromov-Wasserstein method that operates directly on distance matrices and learns a consensus embedding preserving the relational structure common across views.
  • The approach uses intrinsic distances to better tolerate nonlinear distortions between views, improving the geometric faithfulness of the resulting embedding.
  • The authors also propose Mean-GWMDS-C, a clustering-focused variant that averages distance matrices and learns reduced-support representations via a consensus Gromov-Wasserstein transport.
  • Experiments on both synthetic and real-world datasets indicate the method produces stable, geometrically meaningful embeddings and supports clustering objectives.
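At the core of both methods is the Gromov-Wasserstein coupling, which compares two spaces through their distance matrices alone, without requiring them to share a coordinate system. The sketch below is the standard entropic GW solver (repeated Sinkhorn iterations on a linearized square-loss cost, as in Peyré et al.'s line of work), not the paper's Bary-GWMDS algorithm itself; the function names `sinkhorn` and `entropic_gw` are illustrative, and distance matrices are assumed to be rescaled to a modest range for numerical stability:

```python
import numpy as np

def sinkhorn(cost, p, q, eps=0.05, n_iter=200):
    """Entropic OT via Sinkhorn iterations; returns a coupling whose
    marginals approach p (rows) and q (columns)."""
    K = np.exp(-(cost - cost.min()) / eps)  # shift cost to avoid underflow
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)
        u = p / (K @ v)
    return u[:, None] * K * v[None, :]

def entropic_gw(C1, C2, p, q, eps=0.05, n_outer=30):
    """Entropic Gromov-Wasserstein coupling between two distance
    matrices under the square loss, via the usual projected scheme:
    linearize the GW objective at the current coupling, then re-solve
    an entropic OT problem with that cost."""
    T = np.outer(p, q)          # independent coupling as the starting point
    const1 = (C1 ** 2) @ p      # constant terms of the square-loss expansion
    const2 = (C2 ** 2) @ q
    for _ in range(n_outer):
        # Linearized cost: sum_{k,l} (C1[i,k] - C2[j,l])^2 * T[k,l]
        cost = const1[:, None] + const2[None, :] - 2.0 * C1 @ T @ C2.T
        T = sinkhorn(cost, p, q, eps=eps)
    return T
```

The returned coupling `T` matches points across the two views by comparing pairwise-distance patterns, which is what lets GW-based methods tolerate views living in different ambient spaces.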

Abstract

Learning low-dimensional representations from multi-view relational data is challenging when underlying geometries differ across views. We propose Bary-GWMDS, a Gromov-Wasserstein-based method that operates directly on distance matrices to learn a consensus embedding preserving shared relational structure. By leveraging intrinsic distances, the approach naturally handles nonlinear distortions across views. We also introduce Mean-GWMDS-C, a clustering-oriented formulation that averages distance matrices and learns reduced-support representations via a consensus Gromov-Wasserstein transport. Experiments on synthetic and real-world datasets show that the proposed framework yields stable and geometrically meaningful embeddings.
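As a rough illustration of the mean-distance-matrix idea behind Mean-GWMDS-C, one can average per-view distance matrices and embed the consensus matrix with classical MDS. The sketch below assumes plain Euclidean distances per view and omits the consensus Gromov-Wasserstein transport and reduced-support learning that the paper adds; `mean_view_embedding` is an illustrative name, not an API from the paper:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Embed a distance matrix via classical MDS: double-center the
    squared distances into a Gram matrix, then take the top eigenpairs."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # Gram matrix estimate
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]          # keep the `dim` largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def mean_view_embedding(views, dim=2):
    """Average per-view Euclidean distance matrices and embed the
    resulting consensus matrix with classical MDS."""
    Ds = [np.linalg.norm(X[:, None] - X[None, :], axis=-1) for X in views]
    return classical_mds(np.mean(Ds, axis=0), dim=dim)
```

When the views are rigid transformations of one another, their distance matrices coincide and the consensus embedding recovers the shared geometry up to rotation; the paper's contribution is making this robust when views differ by nonlinear distortions, which simple averaging does not handle.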