Emergence Transformer: Dynamical Temporal Attention Matters

arXiv cs.AI · April 23, 2026


Key Points

  • The paper proposes an “Emergence Transformer” that introduces dynamical temporal attention (DTA) with time-varying query, key, and value matrices to study and control emergence in temporal systems.
  • It shows that neighbor-based DTA consistently promotes oscillatory coherence, while self-DTA enhances coherence most strongly at an optimal attention weight, owing to a non-monotonic dependence on network structure.
  • The authors demonstrate that DTA can reshape social coherence in practical settings, offering strategies to either increase agreement or maintain plurality.
  • Applying DTA to Hopfield neural networks enables emergent continual learning while avoiding catastrophic forgetting.
  • Overall, the work frames DTA as a general mechanism for modulating emergent phenomena in networked dynamical systems via temporal attention alone.
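
The continual-learning claim above targets the classical failure mode of Hopfield networks: storing patterns can degrade previously stored memories. The paper's DTA-based remedy is not reproduced here, but a minimal sketch of the baseline setting it addresses — a standard Hebbian Hopfield network storing and recalling bipolar patterns — is shown below. All names (`store`, `recall`) and parameters are illustrative, not from the paper.

```python
import numpy as np

def store(patterns):
    """Hebbian weight matrix for a Hopfield network; patterns: (P, N) of +/-1."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, x, steps=20):
    """Synchronous sign updates toward a stored attractor."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1  # break ties deterministically
    return x

rng = np.random.default_rng(1)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))  # well below capacity (~0.138 N)
W = store(patterns)

noisy = patterns[0].copy()
flip = rng.choice(N, 10, replace=False)
noisy[flip] *= -1                    # corrupt 10% of the bits
rec = recall(W, noisy)
overlap = (rec @ patterns[0]) / N    # 1.0 means perfect recall
```

Far below capacity, recall from a corrupted cue succeeds; catastrophic forgetting appears once sequentially stored patterns overload the Hebbian weights, which is the regime the paper's DTA mechanism is reported to fix.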

Abstract

The Transformer, a breakthrough architecture in artificial intelligence, owes its success to the attention mechanism, which exploits long-range interactions in sequential data and enables emergent coherence between large language models (LLMs) and data distributions. However, temporal attention, that is, different forms of long-range interaction in temporal sequences, has rarely been explored in the emergent phenomena of complex systems, including oscillatory coherence in quantum, biophysical, or climate systems. Here, by designing dynamical temporal attention (DTA) with time-varying query, key, and value matrices, we propose an Emergence Transformer. This architecture allows each component to interact with its own or its neighbors' past states through dynamical attention kernels, thereby enabling the promotion and/or suppression of emergent coherence among components. Interestingly, we uncover that neighbor-DTA consistently promotes oscillatory coherence, whereas self-DTA exhibits an optimal attention weight for coherence enhancement, owing to its non-monotonic dependence on network structure. Practically, we demonstrate how DTA reshapes social coherence, suggesting strategies to either enhance agreement or preserve plurality. We further apply DTA to the paradigmatic Hopfield neural network, achieving emergent continual learning without catastrophic forgetting. Together, these results lay a foundation, and provide an immediate paradigm, for modulating emergent phenomena in networked dynamics using temporal attention alone.
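
The abstract's core idea — each component attending over its own past states through time-varying query/key/value matrices — can be sketched with a toy self-DTA term coupled into Kuramoto-style phase dynamics. The paper's exact equations are not given in this summary, so everything below (the rotating attention matrices, the coupling strengths `eps` and `K`, the (cos, sin) state embedding) is an assumed illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dta_term(history, W_q, W_k, W_v):
    """Attention-weighted readout of a window of a node's past states.
    history: (tau, d) window, last row = most recent state;
    W_q, W_k, W_v: (d, d) attention matrices for the current time step.
    Returns a (d,) summary vector the dynamics can couple to."""
    d = history.shape[1]
    q = W_q @ history[-1]                 # query from the current state
    keys = history @ W_k.T                # one key per past step
    vals = history @ W_v.T                # one value per past step
    w = softmax(keys @ q / np.sqrt(d))    # attention over the past
    return w @ vals

def simulate(N=16, T=400, tau=30, dt=0.05, eps=0.4, K=1.0):
    """Kuramoto phases with an added self-DTA memory term (toy model)."""
    omega = rng.normal(0.0, 0.2, N)                 # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)
    state = np.stack([np.cos(theta), np.sin(theta)], 1)
    hist = np.tile(state[:, None, :], (1, tau, 1))  # (N, tau, 2) history buffer
    for t in range(T):
        # time-varying Q/K matrices: a slowly rotating kernel (assumed choice)
        c, s = np.cos(0.01 * t), np.sin(0.01 * t)
        W = np.array([[c, -s], [s, c]])
        mean_field = np.angle(np.exp(1j * theta).mean())
        dtheta = omega + K * np.sin(mean_field - theta)
        for i in range(N):
            m = dta_term(hist[i], W, W, np.eye(2))
            phi = np.arctan2(m[1], m[0])            # phase encoded by the readout
            dtheta[i] += eps * np.sin(phi - theta[i])
        theta = (theta + dt * dtheta) % (2 * np.pi)
        hist = np.roll(hist, -1, axis=1)
        hist[:, -1] = np.stack([np.cos(theta), np.sin(theta)], 1)
    return abs(np.exp(1j * theta).mean())           # order parameter in [0, 1]

r = simulate()
```

Swapping `hist[i]` for a neighbor's history window would give the neighbor-DTA variant; the paper's reported contrast is that the neighbor form promotes coherence monotonically while the self form peaks at an optimal attention weight.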