Promoting Simple Agents: Ensemble Methods for Event-Log Prediction
arXiv cs.LG / 4/24/2026
Key Points
- The paper compares simple automata-based n-gram models with neural sequence models (LSTM and Transformers) for next-activity prediction in streaming event logs.
- Experiments on both synthetic patterns and five process-mining datasets find that well-configured n-grams can reach accuracy comparable to neural models while using far fewer computational resources.
- It reports that windowed neural architectures can produce unstable performance, whereas n-grams deliver more consistent accuracy across runs.
- Classical ensemble approaches (e.g., voting across many n-grams) improve n-gram accuracy but increase memory use and inference latency due to parallel agent execution.
- The authors introduce a “promotion” ensemble algorithm that dynamically selects between two active models during inference, achieving accuracy similar to or better than that of non-windowed neural models at reduced computational cost.
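To make the ideas above concrete, here is a minimal sketch of next-activity prediction with an n-gram model, plus one plausible reading of a "promotion"-style selector that routes predictions to whichever of two models has the better rolling accuracy. This is an illustration under stated assumptions, not the paper's actual algorithm: the class names, the rolling-accuracy window, and the selection rule are all hypothetical.

```python
from collections import defaultdict, Counter, deque

class NGramPredictor:
    """Predicts the next activity from the last n-1 activities (illustrative sketch)."""
    def __init__(self, n=3):
        self.n = n
        self.counts = defaultdict(Counter)   # context tuple -> next-activity counts
        self.history = deque(maxlen=n - 1)   # sliding context over the stream

    def predict(self):
        ctx = tuple(self.history)
        if self.counts[ctx]:
            return self.counts[ctx].most_common(1)[0][0]
        return None  # unseen context: no prediction yet

    def update(self, activity):
        ctx = tuple(self.history)
        self.counts[ctx][activity] += 1
        self.history.append(activity)

class PromotionEnsemble:
    """Keeps several models but answers with only one: whichever has the
    highest rolling accuracy over the last `window` events. A hypothetical
    stand-in for the paper's promotion idea, not its published algorithm."""
    def __init__(self, models, window=50):
        self.models = models
        self.hits = [deque(maxlen=window) for _ in models]

    def predict(self):
        best = max(range(len(self.models)),
                   key=lambda i: sum(self.hits[i]) / max(len(self.hits[i]), 1))
        return self.models[best].predict()

    def update(self, activity):
        # Score every model against the observed event, then let each learn it.
        for model, hits in zip(self.models, self.hits):
            hits.append(1 if model.predict() == activity else 0)
            model.update(activity)

# Streaming usage on a toy periodic event log.
log = ["a", "b", "c", "a", "b", "c", "a", "b"]
ens = PromotionEnsemble([NGramPredictor(2), NGramPredictor(3)])
for event in log:
    ens.update(event)
pred = ens.predict()  # the currently best-scoring model answers
```

Unlike a voting ensemble, only one model is queried at prediction time, which is how a selection scheme like this could cut inference latency relative to running all agents in parallel.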