From Topic to Transition Structure: Unsupervised Concept Discovery at Corpus Scale via Predictive Associative Memory
arXiv cs.AI / 3/20/2026
Key Points
- The authors extend Predictive Associative Memory (PAM) to extract transition-structure concepts from 373 million co-occurrence pairs across 9,766 Project Gutenberg texts.
- The model, a 29.4M-parameter contrastive network, maps passages into an association space where clustering reveals function, register, and literary tradition rather than mere topical similarity.
- Clustering across six granularities (k=50 to 2,000) yields a multi-resolution concept map with broad modes and precise registers, such as "direct confrontation" or "courtroom cross-examination."
- Unseen novels can be assigned to existing clusters without retraining, whereas clusters built on raw embeddings tend to saturate, indicating stronger generalization in the association space.
- The work contrasts association-space clustering with embedding-based topic clustering and extends PAM from episodic recall to higher-level concept formation under compression.
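The multi-granularity clustering and unseen-text assignment described in the key points can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the vectors here are random stand-ins for the association-space embeddings produced by the contrastive network, and k-means with nearest-centroid assignment is one plausible reading of the "k=50 to 2,000" granularities (the paper's exact clustering algorithm is not given here).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for association-space vectors: in the paper these come from a
# 29.4M-parameter contrastive network; here we use random data for shape.
corpus_vecs = rng.normal(size=(500, 32))

# Multi-resolution concept map: cluster the same vectors at several
# granularities (the paper uses six, from k=50 up to k=2,000).
granularities = [5, 20, 50]  # small values for a runnable toy example
concept_maps = {
    k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(corpus_vecs)
    for k in granularities
}

# Assign unseen texts to existing clusters without retraining:
# nearest-centroid lookup via KMeans.predict on the fitted models.
unseen_vecs = rng.normal(size=(10, 32))
assignments = {k: km.predict(unseen_vecs) for k, km in concept_maps.items()}
```

Each granularity gives a different-resolution view of the same space: coarse k values recover broad modes, fine k values recover precise registers, and the `predict` step is what lets new novels slot into the existing map.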