CobwebTM: Probabilistic Concept Formation for Lifelong and Hierarchical Topic Modeling

arXiv cs.CL / 4/17/2026


Key Points

  • CobwebTM is a new low-parameter, lifelong hierarchical topic modeling approach designed to work with streaming text without assuming a fixed number of topics.
  • It adapts the Cobweb algorithm to continuous document embeddings to build semantic topic hierarchies online through incremental probabilistic concept formation.
  • The method targets common weaknesses of neural topic models—such as heavy tuning needs and catastrophic forgetting—while addressing limitations of classical probabilistic models for evolving data.
  • Experiments on multiple datasets show strong topic coherence, stable topics over time, and high-quality hierarchical structures.
  • The results suggest that combining incremental symbolic concept formation with pretrained representations can be an efficient and practical strategy for adaptive topic modeling.

Abstract

Topic modeling seeks to uncover latent semantic structure in text corpora with minimal supervision. Neural approaches achieve strong performance but require extensive tuning and struggle with lifelong learning due to catastrophic forgetting and fixed capacity, while classical probabilistic models lack flexibility and adaptability to streaming data. We introduce CobwebTM, a low-parameter lifelong hierarchical topic model based on incremental probabilistic concept formation. By adapting the Cobweb algorithm to continuous document embeddings, CobwebTM constructs semantic hierarchies online, enabling unsupervised topic discovery, dynamic topic creation, and hierarchical organization without predefining the number of topics. Across diverse datasets, CobwebTM achieves strong topic coherence, stable topics over time, and high-quality hierarchies, demonstrating that incremental symbolic concept formation combined with pretrained representations is an efficient approach to topic modeling.
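To make the idea of online hierarchy construction concrete, the sketch below grows a tree over continuous document embeddings one vector at a time, with each node keeping running statistics and spawning children as new regions of embedding space appear. This is an illustrative simplification, not the paper's CobwebTM implementation: real Cobweb routes instances by a probabilistic category-utility score, whereas this sketch substitutes a simple distance threshold, and every name and parameter here (`CobwebSketch`, `tau`, `min_tau`) is hypothetical.

```python
# Hypothetical sketch of Cobweb-style incremental hierarchy construction
# over continuous embeddings. NOT the paper's algorithm: routing uses a
# distance threshold as a stand-in for Cobweb's category-utility scoring.
import numpy as np


class Node:
    """A concept node: running count and mean of the vectors it covers."""

    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.children = []

    def update(self, x):
        # Incremental (online) mean update.
        self.n += 1
        self.mean += (x - self.mean) / self.n


class CobwebSketch:
    """Online tree over embeddings; deeper levels use tighter radii,
    giving a coarse-to-fine topic hierarchy without a fixed topic count."""

    def __init__(self, dim, tau=5.0, min_tau=0.5):
        self.dim = dim
        self.tau = tau          # radius for top-level concepts (assumed)
        self.min_tau = min_tau  # stop refining below this radius
        self.root = Node(dim)

    def insert(self, x):
        node, tau = self.root, self.tau
        while True:
            node.update(x)
            if not node.children:
                # Start child concepts once a node has seen >= 2 points.
                if node.n >= 2 and tau > self.min_tau:
                    child = Node(self.dim)
                    child.update(x)
                    node.children.append(child)
                return
            # Route to the nearest child, or open a new concept if none
            # is close enough (dynamic topic creation).
            dists = [np.linalg.norm(x - c.mean) for c in node.children]
            i = int(np.argmin(dists))
            if dists[i] > tau:
                child = Node(self.dim)
                child.update(x)
                node.children.append(child)
                return
            node = node.children[i]
            tau *= 0.5  # tighter clusters deeper in the hierarchy
```

Feeding the sketch two well-separated streams of vectors yields two top-level concepts under the root, each refined into subconcepts as more points arrive, which mirrors the paper's claim of topic discovery without predefining the number of topics.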