Chameleons do not Forget: Prompt-Based Online Continual Learning for Next Activity Prediction
arXiv cs.LG / 4/2/2026
Key Points
- The paper targets predictive process monitoring (PPM), specifically next activity prediction, where dynamic environments and concept drift can cause catastrophic forgetting in conventional static training setups.
- It introduces CNAPwP (Continual Next Activity Prediction with Prompts), adapting the DualPrompt continual learning approach to maintain accuracy and adaptability while mitigating forgetting.
- The authors introduce datasets featuring recurring concept drifts and propose a task-specific forgetting metric that quantifies the accuracy gap between the initial and later occurrences of the same task concept.
- Experiments on three synthetic and two real-world datasets with recurrent drift setups show that CNAPwP achieves state-of-the-art or competitive performance against five baseline methods.
- An open-source implementation plus datasets and results are released publicly, supporting reuse and further evaluation by the research community.
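The forgetting metric described above can be illustrated with a small sketch. This is a hypothetical reconstruction, not the paper's implementation: it assumes evaluation records are given as (concept, accuracy) pairs in occurrence order, and averages, per recurring concept, the accuracy drop from its first occurrence to each later one.

```python
from collections import defaultdict

def recurrence_forgetting(records):
    """Average accuracy gap between a task concept's initial occurrence
    and its later recurrences, averaged across recurring concepts.

    records: list of (concept_id, accuracy) pairs, in the order the
    concept occurrences were evaluated. Hypothetical interface.
    """
    first_acc = {}                 # accuracy at each concept's first occurrence
    gaps = defaultdict(list)       # accuracy drops at later occurrences
    for concept, acc in records:
        if concept not in first_acc:
            first_acc[concept] = acc
        else:
            gaps[concept].append(first_acc[concept] - acc)
    if not gaps:
        return 0.0
    per_concept = {c: sum(g) / len(g) for c, g in gaps.items()}
    return sum(per_concept.values()) / len(per_concept)

# Example: concept A recurs twice, concept B once
records = [("A", 0.90), ("B", 0.85), ("A", 0.80), ("B", 0.83), ("A", 0.86)]
print(recurrence_forgetting(records))  # ~0.045: moderate forgetting
```

A positive value indicates forgetting (later occurrences score lower than the first); a value near zero or negative suggests the model retained or improved on recurring concepts.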