Federated Learning with Quantum Enhanced LSTM for Applications in High Energy Physics

arXiv cs.LG / 4/20/2026


Key Points

  • The paper proposes a federated learning framework for high-energy physics (HEP) that uses a hybrid quantum-classical LSTM (QLSTM) to train across distributed nodes while preserving robustness and accuracy.
  • The QLSTM model is designed to leverage quantum models to capture complex relationships in the feature space while using an LSTM component to learn correlations across data points.
  • To address the limited compute and high cost of today's noisy intermediate-scale quantum (NISQ) devices, the approach shifts quantum learning workloads to local servers via federated learning rather than relying on stand-alone quantum runs.
  • Experiments on a 5M-row Supersymmetry (SUSY) classification task show the method outperforms several existing variational quantum circuit (VQC)-based QML approaches and is roughly comparable to classical deep-learning benchmarks (within about ±1%).
  • The authors report that the framework uses fewer than 300 parameters and can reach comparable performance with only about 20K data points, claiming around a 100× improvement over the compared baselines in data/resource efficiency.
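To make the hybrid design concrete, here is a minimal sketch of one QLSTM step in which the four LSTM gate pre-activations come from small variational quantum circuits instead of classical linear layers. The single-qubit circuit, the parameter layout, and the input/hidden mixing are all illustrative assumptions, not the paper's actual architecture; the circuit is simulated classically with NumPy.

```python
import numpy as np

def vqc(x, theta):
    """Toy 1-qubit variational circuit, simulated classically: encode the
    input as an RY rotation, apply a trainable RY, and return <Z>.
    (Illustrative stand-in for the paper's VQC layers, not their circuit.)"""
    state = np.array([1.0, 0.0])                  # start in |0>
    for angle in (x, theta):                      # data encoding, then trainable rotation
        c, s = np.cos(angle / 2), np.sin(angle / 2)
        ry = np.array([[c, -s], [s, c]])          # RY(angle) gate
        state = ry @ state
    return state[0] ** 2 - state[1] ** 2          # Pauli-Z expectation value

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_step(x, h, c, params):
    """One hybrid QLSTM step: each gate's pre-activation is a VQC output.
    `params` holds one trainable angle per gate (hypothetical layout)."""
    z = x + h                                     # crude input/hidden mixing
    f = sigmoid(vqc(z, params[0]))                # forget gate
    i = sigmoid(vqc(z, params[1]))                # input gate
    g = np.tanh(vqc(z, params[2]))                # candidate cell value
    o = sigmoid(vqc(z, params[3]))                # output gate
    c = f * c + i * g                             # classical LSTM cell update
    h = o * np.tanh(c)                            # classical LSTM hidden update
    return h, c

h, c = 0.0, 0.0
params = [0.3, -0.5, 1.2, 0.7]                    # four trainable gate angles
for x in [0.1, 0.4, -0.2]:                        # tiny input sequence
    h, c = qlstm_step(x, h, c, params)
```

The quantum part captures nonlinear structure in each input, while the classical LSTM recurrence carries correlations across the sequence, mirroring the division of labor the key points describe.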

Abstract

Learning with large-scale datasets and information-critical applications, such as in High Energy Physics (HEP), demands highly complex, large-scale models that are both robust and accurate. To meet these learning requirements, we envision using a federated learning framework with a quantum-enhanced model. Specifically, we design a hybrid quantum-classical long short-term memory (QLSTM) model for local training at distributed nodes. It combines the representative power of quantum models in capturing complex relationships within the feature space with an LSTM-based model that learns the necessary correlations across data points. Given the computing limitations and high cost of current stand-alone noisy intermediate-scale quantum (NISQ) devices, we propose a federated learning setup in which the learning load can be distributed to local servers according to design and data availability. We demonstrate the benefits of this design on a classification task over the Supersymmetry (SUSY) dataset, which has 5M rows. Our experiments indicate that the performance of this design is not only better than that of some existing work using variational quantum circuit (VQC)-based quantum machine learning (QML) techniques, but is also comparable (Δ ≈ ±1%) to that of classical deep-learning benchmarks. An important observation from this study is that the designed framework has fewer than 300 parameters and needs only 20K data points to achieve comparable performance, a roughly 100× improvement over the compared baseline models. This shows an improved learning capability of the proposed framework with minimal data and resource requirements, owing to the joint model with an LSTM-based architecture and a quantum-enhanced VQC.
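The federated setup the abstract describes can be sketched with plain federated averaging (FedAvg): each node trains a local model on its own shard, and a server averages the parameters each round. The logistic-regression local objective, the synthetic data, and all hyperparameters below are stand-ins for illustration; the paper's clients would train the QLSTM instead.

```python
import numpy as np

def local_train(w, X, y, lr=0.1, steps=5):
    """One client's local update: a few gradient steps of logistic regression.
    (Hypothetical stand-in for the paper's local QLSTM training.)"""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))          # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)       # gradient step on log-loss
    return w

def fed_avg(w, clients, rounds=10):
    """Federated averaging: each round, every client trains locally on its
    own data shard, then the server averages the returned parameters."""
    for _ in range(rounds):
        w = np.mean([local_train(w, X, y) for X, y in clients], axis=0)
    return w

# Build three synthetic client shards with a shared underlying rule.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):                                # three local nodes
    X = rng.normal(size=(200, 2))
    y = (X @ true_w + rng.normal(scale=0.1, size=200) > 0).astype(float)
    clients.append((X, y))

w = fed_avg(np.zeros(2), clients)
acc = np.mean([((X @ w > 0) == y).mean() for X, y in clients])
```

The point of the design is that raw data never leaves a node; only model parameters travel to the server, which matches the abstract's motivation of offloading learning to local servers rather than a stand-alone NISQ device.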