Anchored Variational Inference for Personalized Sequential Latent-State Models

arXiv stat.ML / April 28, 2026

💬 Opinion · Models & Research

Key Points

  • The paper studies sequential latent-variable models that include subject-specific random effects, highlighting that while local latent inference is tractable, integrating over random effects is computationally expensive.
  • It proposes an anchored variational inference approach that approximates the local latent posterior by evaluating it at a single representative “anchor point” for each subject’s random effect to reduce computation.
  • The authors show that, under appropriate conditions, the anchor point given by the posterior mean of the random effect is nearly optimal, and that the anchored variational EM (AVEM) algorithm approximately retains the local monotonicity properties of standard variational inference.
  • They apply the framework to mixed hidden Markov models and mixed-effects state-space models, derive AVEM algorithms for these cases, and report simulation results showing accurate estimation with substantial computational savings.
  • The work also introduces a partially anchored variant that anchors only those components of the subject-specific latent effect whose posteriors are sufficiently concentrated.
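The core idea in the key points above can be illustrated numerically. The sketch below uses a deliberately simple conjugate Gaussian toy model (my own illustrative stand-in, not a model from the paper): the subject-specific effect `b` has a Gaussian posterior that concentrates as the sequence length grows, so a single evaluation of a local quantity at the anchor point (the posterior mean) closely matches a Monte Carlo average over the full posterior, at a fraction of the cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy subject: y_t = b + eps_t, with prior b ~ N(0, 1) and noise eps_t ~ N(0, 1).
# (Illustrative stand-in; the paper's models have richer local latent dynamics.)
T = 200
b_true = rng.normal()
y = b_true + rng.normal(size=T)

# Conjugate Gaussian posterior of b: precision = 1 + T, mean = sum(y) / (1 + T).
post_prec = 1.0 + T
post_mean = y.sum() / post_prec
post_sd = post_prec ** -0.5  # shrinks like 1/sqrt(T): the posterior concentrates


def local_quantity(b):
    # A stand-in for a local inference quantity given the random effect b,
    # e.g. an average conditional log-density term.
    return -0.5 * np.mean((y - b) ** 2)


# Full approach: Monte Carlo average over the posterior of b (many evaluations).
samples = post_mean + post_sd * rng.normal(size=5000)
mc_value = np.mean([local_quantity(b) for b in samples])

# Anchored approach: a single evaluation at the anchor point (posterior mean).
anchored_value = local_quantity(post_mean)

print(mc_value, anchored_value)
```

Because the posterior standard deviation is of order 1/sqrt(T), the gap between the two values is of order 1/T here, which is the sense in which anchoring becomes increasingly accurate for long sequences.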

Abstract

Sequential latent-variable models with subject-specific random effects provide a flexible framework for modeling temporally structured data with both local latent dynamics and stable between-subject heterogeneity. In such models, conditional inference for the local latent process is often tractable, but integrating over subject-specific random effects can be computationally demanding. We propose an anchored variational inference framework for efficient approximate inference in this setting. The central idea is to replace the full conditional posterior of the local latent process with its evaluation at a representative value of the subject-specific latent effect, called the anchor point, thereby preserving tractable local inference while substantially reducing computational cost. This approximation is especially appealing in sequential settings, where the posterior distribution of the random effect becomes increasingly concentrated as the sequence length grows. Under suitable conditions, we show that the posterior mean is a nearly optimal anchor point and that the resulting anchored variational EM (AVEM) algorithm approximately preserves the local monotonicity behavior of standard variational inference. We instantiate the framework in two representative classes of sequential latent-variable models, namely mixed hidden Markov models and mixed-effects state-space models, derive the corresponding AVEM algorithms, and use simulation studies to indicate that the resulting methods achieve accurate estimation with substantial computational gains. We also discuss a partially anchored variant of the framework, in which only the components of the subject-specific latent effect whose posteriors are well concentrated are anchored.
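As a rough illustration of how anchoring interacts with tractable local inference in a mixed hidden Markov model, the sketch below (my own construction, not the paper's derivation) runs a standard forward recursion for a two-state Gaussian HMM whose emission means are shifted by a subject-specific effect `b`. The anchored E-step needs only one forward pass at the anchor value `b_hat` (here a hypothetical placeholder for an estimated posterior mean), instead of averaging forward passes over the posterior of `b`.

```python
import numpy as np


def forward_loglik(y, b, pi, A, means, sd=1.0):
    """Forward recursion (log scale) for a Gaussian-emission HMM whose
    emission means are shifted by a subject-specific effect b."""
    # Emission log-densities with the random effect fixed at the anchor b.
    log_em = (-0.5 * ((y[:, None] - (means[None, :] + b)) / sd) ** 2
              - 0.5 * np.log(2 * np.pi * sd ** 2))
    log_alpha = np.log(pi) + log_em[0]
    for t in range(1, len(y)):
        # log-sum-exp over previous states, then transition and emit.
        m = log_alpha.max()
        log_alpha = np.log(np.exp(log_alpha - m) @ A) + m + log_em[t]
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())


# Hypothetical parameters; b_hat stands in for an estimated anchor point.
rng = np.random.default_rng(1)
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
means = np.array([-1.0, 1.0])
b_hat = 0.3

y = rng.normal(size=100)

# Anchored evaluation: one forward pass, no integration over b.
ll_anchored = forward_loglik(y, b_hat, pi, A, means)
print(ll_anchored)
```

The same single-pass structure carries over to the smoothing recursions of the E-step, which is where the computational savings reported in the simulations would come from.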