SpinGQE: A Generative Quantum Eigensolver for Spin Hamiltonians

arXiv cs.CL · March 26, 2026


Key Points

  • The paper introduces SpinGQE, a generative Quantum Eigensolver that extends the GQE framework to spin Hamiltonians to address key VQE limitations like barren plateaus and limited ansatz expressivity.
  • SpinGQE treats quantum circuit design as a generative modeling problem, using a transformer-based decoder to learn distributions over circuits that produce low-energy states.
  • Training uses a weighted mean-squared error loss aligning model logits with circuit energies computed for each gate subsequence, enabling guidance from energy evaluations during sequence generation.
  • On the four-qubit Heisenberg model, the method is reported to converge near ground states, and hyperparameter searches suggest smaller transformer models, longer gate sequences, and well-chosen operator pools improve convergence reliability.
  • The authors argue generative approaches can explore complex energy landscapes without relying on problem-specific symmetries or structure and provide an open-source implementation.
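The training signal described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (in NumPy) of a weighted mean-squared-error loss between per-gate logits and the energies of the corresponding gate subsequences; the function name, the shape conventions, and the uniform default weighting are assumptions, not the paper's actual implementation.

```python
import numpy as np

def gqe_weighted_mse_loss(logits, subseq_energies, weights=None):
    """Weighted MSE aligning the model's chosen-gate logits with the
    circuit energy evaluated at each gate subsequence (prefix).

    logits          : one scalar logit per generated gate, length T
    subseq_energies : energy of the circuit truncated after gate t, length T
    weights         : optional per-step weights (uniform if omitted)
    """
    logits = np.asarray(logits, dtype=float)
    energies = np.asarray(subseq_energies, dtype=float)
    if weights is None:
        weights = np.ones_like(energies)  # uniform weighting by default
    return float(np.mean(weights * (logits - energies) ** 2))

# Toy example: a 3-gate sequence whose prefix circuits were measured
# at energies 0.4, -0.3, -1.1 (arbitrary illustrative numbers).
loss = gqe_weighted_mse_loss([0.5, -0.2, -1.0], [0.4, -0.3, -1.1])
```

Minimizing such a loss pushes the model's logits toward the measured energies, so that sampling low-logit (low-energy) gate sequences becomes a proxy for sampling low-energy circuits.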

Abstract

The ground state search problem is central to quantum computing, with applications spanning quantum chemistry, condensed matter physics, and optimization. The Variational Quantum Eigensolver (VQE) has shown promise for small systems but faces significant limitations. These include barren plateaus, restricted ansatz expressivity, and reliance on domain-specific structure. We present SpinGQE, an extension of the Generative Quantum Eigensolver (GQE) framework to spin Hamiltonians. Our approach reframes circuit design as a generative modeling task. We employ a transformer-based decoder to learn distributions over quantum circuits that produce low-energy states. Training is guided by a weighted mean-squared error loss between model logits and circuit energies evaluated at each gate subsequence. We validate our method on the four-qubit Heisenberg model, demonstrating successful convergence to near-ground states. Through systematic hyperparameter exploration, we identify optimal configurations: smaller model architectures (12 layers, 8 attention heads), longer sequence lengths (12 gates), and carefully chosen operator pools yield the most reliable convergence. Our results show that generative approaches can effectively navigate complex energy landscapes without relying on problem-specific symmetries or structure. This provides a scalable alternative to traditional variational methods for general quantum systems. An open-source implementation is available at https://github.com/Mindbeam-AI/SpinGQE.
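For context on the benchmark, the four-qubit Heisenberg model is small enough to diagonalize exactly, which is how "near-ground states" can be verified. The sketch below builds one common variant of the Hamiltonian with NumPy; the periodic boundary, the coupling J = 1, and the Pauli-operator convention H = J Σ (XᵢXⱼ + YᵢYⱼ + ZᵢZⱼ) are assumptions, and the paper's exact couplings may differ.

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def two_site_term(op, i, j, n=4):
    """Tensor product placing `op` on qubits i and j, identity elsewhere."""
    mats = [op if k in (i, j) else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def heisenberg_ring(n=4, J=1.0):
    """H = J * sum over ring bonds of (XX + YY + ZZ), Pauli convention."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n):
        j = (i + 1) % n  # periodic boundary: bond (n-1, 0) closes the ring
        for P in (X, Y, Z):
            H += J * two_site_term(P, i, j, n)
    return H

H = heisenberg_ring()
E0 = np.linalg.eigvalsh(H).min()  # exact ground-state energy by diagonalization
```

A generative eigensolver's output circuits can then be scored against `E0` to quantify how close the sampled states come to the true ground state.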
