Discrete Prototypical Memories for Federated Time Series Foundation Models

arXiv cs.LG / 4/7/2026


Key Points

  • The paper proposes FeDPM, a federated learning framework that turns time-series modeling into a discrete latent-space problem using prototypical memories.
  • It targets two key gaps in existing federated time-series foundation models: semantic mismatch between text-centric LLM latent spaces and time-series data, and the tendency of parameter-sharing FL to force heterogeneous domains into an overly continuous shared space.
  • FeDPM learns domain-local prototypical memory priors, then aligns cross-domain memories to obtain a unified discrete representation across regimes.
  • It adds a domain-specific memory update mechanism to balance shared knowledge with personalized, domain-specific prototypical information.
  • The authors report extensive experiments validating the efficiency and effectiveness of FeDPM, and they release the code publicly for replication.
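The core idea behind the "discrete latent-space" framing in the points above is that each time-series segment is snapped to its nearest prototype in a learned memory bank, much like vector quantization. The sketch below is an illustrative stand-in for that lookup, assuming a NumPy encoding matrix `z` and a prototype bank `memory` (both names are hypothetical, not from the paper):

```python
import numpy as np

def quantize(z, memory):
    """Map each encoded segment in z (n, d) to its nearest prototype
    in memory (K, d); returns discrete codes and quantized latents."""
    # Pairwise squared distances between encodings and prototypes.
    d2 = ((z[:, None, :] - memory[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)      # discrete code per segment
    return idx, memory[idx]      # quantized (prototype) latents

rng = np.random.default_rng(0)
memory = rng.normal(size=(8, 4))  # K=8 prototypes of dimension d=4
z = rng.normal(size=(5, 4))       # 5 encoded time-series segments
codes, zq = quantize(z, memory)
```

Representing segments by prototype indices rather than free continuous vectors is what lets recurring regimes share a single discrete code across domains.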

Abstract

Leveraging Large Language Models (LLMs) as federated learning (FL)-based time series foundation models offers a promising way to transfer the generalization capabilities of LLMs to time-series data while keeping private data local. However, the semantic misalignment between time-series data and the text-centric latent space of existing LLMs often degrades performance. Meanwhile, the parameter-sharing mechanism in existing FL methods models heterogeneous cross-domain time-series data in a unified continuous latent space, which contradicts the fact that time-series semantics frequently manifest as discrete and recurring regimes. To address these limitations, we propose FeDPM, a federated framework for time-series foundation models based on discrete prototypical memories. Specifically, we learn local prototypical memory priors for intra-domain time-series data. We then align cross-domain memories to promote a unified discrete latent space, and we introduce a domain-specific memory update mechanism to balance shared and personalized prototypical knowledge. Extensive experiments demonstrate the efficiency and effectiveness of FeDPM. The code is publicly available at https://anonymous.4open.science/r/FedUnit-64D1.
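The abstract's "balance shared and personalized prototypical knowledge" step can be pictured as a two-sided exchange: the server aggregates client memory banks into a shared one, and each client then blends that shared bank back into its own. The sketch below is a minimal, hypothetical version of that loop using FedAvg-style averaging and linear interpolation; the paper's actual alignment and update rules are not specified here, and all names (`aggregate_memories`, `update_local_memory`, `alpha`) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hypothetical client memory banks (K=4 prototypes, d=3 each).
client_mems = [rng.normal(size=(4, 3)) for _ in range(2)]

def aggregate_memories(mems):
    """Server side: average client memories into a shared prototype bank,
    a FedAvg-style stand-in for the paper's cross-domain alignment."""
    return np.mean(np.stack(mems), axis=0)

def update_local_memory(local_mem, global_mem, alpha=0.5):
    """Client side: blend the shared prototypes back into the local bank.
    alpha=1 keeps fully personalized prototypes; alpha=0 adopts the
    shared ones outright. Hypothetical rule, not the paper's update."""
    return alpha * local_mem + (1.0 - alpha) * global_mem

global_mem = aggregate_memories(client_mems)
updated = update_local_memory(client_mems[0], global_mem, alpha=0.7)
```

The mixing coefficient is one simple way a client could trade off the unified discrete space against domain-specific regimes without sharing raw data.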