SRMU: Relevance-Gated Updates for Streaming Hyperdimensional Memories

arXiv cs.AI / 4/17/2026


Key Points

  • The paper introduces SRMU (Sequential Relevance Memory Unit), a new update rule for vector symbolic architecture (VSA)-based sequential associative memories in streaming, non-stationary environments.
  • SRMU mitigates the problem of stale information that arises from naive additive updates by using temporal decay plus a relevance-gating mechanism to filter redundant, conflicting, and outdated data before storage.
  • Unlike methods that focus mainly on cleanup, SRMU is designed to be domain- and cleanup-agnostic, emphasizing better control over memory formation.
  • Experiments on streaming state-tracking benchmarks that isolate non-uniform sampling and non-stationary temporal dynamics show improved memory similarity (+12.6%) and a sharply reduced cumulative memory magnitude (−53.5%).

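To make the failure mode concrete, here is a minimal, self-contained sketch of the naive additive update the paper critiques. It uses standard VSA conventions (bipolar hypervectors, elementwise-multiply binding, additive bundling); the dimensionality and the stale/fresh scenario are illustrative choices, not taken from the paper. Repeatedly summing in a stale key–value pair makes it dominate recall even after the true value changes:

```python
import random

D = 2048  # hypervector dimensionality (illustrative choice)
random.seed(0)

def rand_hv():
    """Random bipolar hypervector; independent draws are quasi-orthogonal."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Elementwise multiply: associates a key with a value (self-inverse)."""
    return [x * y for x, y in zip(a, b)]

def cosine(a, b):
    """Normalized dot product; near 0 for unrelated hypervectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

key, old_val, new_val = rand_hv(), rand_hv(), rand_hv()

# Naive additive memory: every observation is simply summed in.
memory = [0.0] * D
for _ in range(10):  # ten repeats of the (now stale) pair
    memory = [m + x for m, x in zip(memory, bind(key, old_val))]
# The underlying system changes: one observation of the new pair.
memory = [m + x for m, x in zip(memory, bind(key, new_val))]

# Unbinding with the key recovers a mix dominated by the stale value.
recalled = bind(memory, key)
print(cosine(recalled, old_val) > cosine(recalled, new_val))  # → True
```

Because binding with a bipolar key is self-inverse, `recalled` is roughly `10 * old_val + new_val`, so the stale value wins recall by a wide margin; this is exactly the persistence problem SRMU targets.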
Abstract

Sequential associative memories (SAMs) are difficult to build and maintain in real-world streaming environments, where observations arrive incrementally over time, are sampled in an imbalanced way, and exhibit non-stationary temporal dynamics. Vector Symbolic Architectures (VSAs) provide a biologically inspired framework for building SAMs: entities and attributes are encoded as quasi-orthogonal hyperdimensional vectors and processed with well-defined algebraic operations. Despite this rich framework, most VSA systems rely on simple additive updates, in which repeated observations reinforce existing information even when no new information is introduced. In non-stationary environments, this leads to the persistence of stale information after the underlying system changes. In this work, we introduce the Sequential Relevance Memory Unit (SRMU), a domain- and cleanup-agnostic update rule for VSA-based SAMs. The SRMU combines temporal decay with a relevance-gating mechanism. Unlike prior approaches that rely solely on cleanup, the SRMU regulates memory formation by filtering redundant, conflicting, and stale information before storage. We evaluate the SRMU on streaming state-tracking tasks that isolate non-uniform sampling and non-stationary temporal dynamics. Our results show that the SRMU increases memory similarity by 12.6% and reduces cumulative memory magnitude by 53.5%, producing more stable memory growth and stronger alignment with the ground-truth state.
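The abstract describes the SRMU only at a high level (temporal decay plus a gate that blocks redundant or stale content before storage). The sketch below is one plausible instantiation of that idea, not the paper's actual rule: the decay factor, the similarity-based relevance score, and the hard threshold `tau` are all assumed names and values chosen for illustration.

```python
import random

D = 2048
random.seed(1)

def rand_hv():
    """Random bipolar hypervector (quasi-orthogonal to other draws)."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Elementwise multiply: self-inverse key-value binding."""
    return [x * y for x, y in zip(a, b)]

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def srmu_update(memory, key, value, decay=0.9, tau=0.5):
    """Decay-plus-relevance-gating update (illustrative form only).

    The incoming bound pair is stored only when it is not already well
    represented in the decayed memory; decay lets outdated content fade.
    """
    item = bind(key, value)
    memory = [decay * m for m in memory]              # temporal decay
    relevance = 1.0 - max(cosine(memory, item), 0.0)  # low if redundant
    gate = 1.0 if relevance > tau else 0.0            # hard relevance gate
    return [m + gate * x for m, x in zip(memory, item)]

key, old_val, new_val = rand_hv(), rand_hv(), rand_hv()
memory = [0.0] * D
for _ in range(10):  # repeats after the first store are gated out as redundant
    memory = srmu_update(memory, key, old_val)
memory = srmu_update(memory, key, new_val)  # regime change: novel pair is stored

recalled = bind(memory, key)
print(cosine(recalled, new_val) > cosine(recalled, old_val))  # → True
```

Under this toy rule, the nine redundant repeats are rejected while decay shrinks the old trace, so after the regime change the fresh value dominates recall and the cumulative memory magnitude stays bounded, mirroring the two improvements the paper reports.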