Efficient Generative Modeling with Unitary Matrix Product States Using Riemannian Optimization

arXiv cs.LG / 3/13/2026

Key Points

  • The paper studies matrix product states (MPS) for generative modeling and shows that unitary MPS improve unsupervised learning by reducing ambiguity in parameter updates while maintaining efficiency.
  • It introduces a Riemannian optimization framework that handles probabilistic modeling with manifold constraints and derives a space-decoupling algorithm for efficient training.
  • Experiments on Bars-and-Stripes and EMNIST demonstrate fast adaptation to data structure, stable updates, and strong performance while preserving the expressive power of MPS.
  • The work presents tensor-network based generative modeling as a promising approach for high-dimensional distribution learning with physical interpretability.
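To make the MPS-based generative model concrete, the sketch below builds a toy "Born machine": the probability of a bitstring is the squared amplitude obtained by contracting one tensor per site. The function names (`random_isometric_mps`, `born_probability`), the bond dimensions, and the left-canonical (isometric) construction are illustrative assumptions, not the paper's exact setup; they are chosen so that, as with unitary MPS, the distribution is automatically normalized.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

def random_isometric_mps(bond_dims, d=2):
    """Left-canonical MPS: each site tensor, viewed as a (d*chi_l, chi_r)
    matrix, is an isometry (orthonormal columns), so the state has norm 1."""
    tensors = []
    for chi_l, chi_r in zip(bond_dims[:-1], bond_dims[1:]):
        q, _ = np.linalg.qr(rng.standard_normal((d * chi_l, chi_r)))
        tensors.append(q.reshape(d, chi_l, chi_r))  # index order: (phys, left, right)
    return tensors

def born_probability(tensors, bits):
    """Born rule: p(x) = |psi(x)|^2, with psi(x) a left-to-right contraction."""
    v = np.ones((1,))
    for A, b in zip(tensors, bits):
        v = v @ A[b]          # contract the shared bond index
    return float(v[0]) ** 2   # boundary bond dimensions are 1

tensors = random_isometric_mps([1, 2, 2, 2, 1])
total = sum(born_probability(tensors, x) for x in product([0, 1], repeat=4))
# thanks to the isometric tensors, the 16 bitstring probabilities sum to 1
```

Because the tensors satisfy the canonical (isometry) constraint, no explicit partition function is needed; this is one of the practical benefits of constraining MPS parameters to a manifold of unitary/isometric matrices.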

Abstract

Tensor networks, originally developed for characterizing complex quantum many-body systems, have recently emerged as a powerful framework for capturing high-dimensional probability distributions with strong physical interpretability. This paper systematically studies matrix product states (MPS) for generative modeling and shows that unitary MPS, a tensor-network architecture that is both simple and expressive, offers clear benefits for unsupervised learning by reducing ambiguity in parameter updates and improving efficiency. To overcome the inefficiency of standard gradient-based MPS training, we develop a Riemannian optimization approach that casts probabilistic modeling as an optimization problem with manifold constraints, and further derive an efficient space-decoupling algorithm. Experiments on the Bars-and-Stripes and EMNIST datasets demonstrate fast adaptation to data structure, stable updates, and strong performance while maintaining the efficiency and expressive power of MPS.
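The abstract's key algorithmic idea, optimization under a manifold constraint, can be illustrated with a generic Riemannian gradient step on the Stiefel manifold of matrices with orthonormal columns (the constraint set behind unitary/isometric MPS tensors). This is a standard projection-plus-QR-retraction step, not the paper's space-decoupling algorithm; the function name `stiefel_step` and the step size are illustrative assumptions.

```python
import numpy as np

def stiefel_step(W, G, lr):
    """One generic Riemannian gradient step on the Stiefel manifold
    {W : W^T W = I}. G is the Euclidean gradient of the loss at W.
    (Illustration only; the paper's space-decoupling update is not shown.)"""
    # Project G onto the tangent space at W: xi = G - W * sym(W^T G)
    sym = 0.5 * (W.T @ G + G.T @ W)
    xi = G - W @ sym
    # Take a step in the tangent direction, then retract back onto the
    # manifold with a QR decomposition (columns stay orthonormal).
    Q, _ = np.linalg.qr(W - lr * xi)
    return Q

rng = np.random.default_rng(1)
W0, _ = np.linalg.qr(rng.standard_normal((6, 3)))  # feasible starting point
W1 = stiefel_step(W0, rng.standard_normal((6, 3)), lr=0.1)
# W1.T @ W1 is the 3x3 identity: the constraint is preserved exactly
```

Updating on the manifold, rather than taking unconstrained gradient steps and re-normalizing afterwards, is what keeps every iterate a valid unitary/isometric tensor and makes the updates stable.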