CogFormer: Learn All Your Models Once
arXiv stat.ML · March 24, 2026
Key Points
- The paper introduces CogFormer, a meta-amortized, transformer-based framework designed to make simulation-based inference (SBI) reusable across many structurally similar cognitive models without retraining for each variant.
- It targets a key limitation of traditional SBI amortization: when users change parameterizations, generative functions, priors, or design variables, they typically must retrain the inference network, which erodes the amortization benefit.
- CogFormer is reported to remain valid across combinatorially many model configurations, accommodating changes in data types, parameters, design matrices, and sample sizes while preserving estimation quality (see the sketch after this list).
- The authors present quantitative results across multiple decision-making model families, covering binary, multi-alternative, and continuous response settings, and claim a minimal “amortization offset.”
- Overall, the work positions CogFormer as a potential accelerator (“engine”) for iterating modeling assumptions and scaling cognitive modeling workflows more efficiently.
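To make the reuse idea concrete, here is a minimal sketch, assuming a PyTorch-style setup, of how one transformer can amortize posterior inference across model variants and sample sizes. The class name `CogFormerSketch`, the configuration-token design, and the diagonal-Gaussian output head are illustrative assumptions, not the authors' actual architecture.

```python
# Hypothetical sketch of a meta-amortized SBI network. All names and design
# details here are assumptions for illustration; the paper's architecture
# may differ substantially.
import torch
import torch.nn as nn

class CogFormerSketch(nn.Module):
    """Amortized posterior network conditioned on a model-configuration
    embedding, so one set of weights can serve many structurally similar
    cognitive models without retraining."""

    def __init__(self, obs_dim=2, config_dim=8, d_model=64, n_params=4):
        super().__init__()
        # Embed each observation (e.g., a choice/response-time pair).
        self.obs_embed = nn.Linear(obs_dim, d_model)
        # Embed the model configuration (active parameters, likelihood
        # variant, design variables) as an extra token.
        self.cfg_embed = nn.Linear(config_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Head outputs a diagonal-Gaussian posterior over model parameters
        # (a simplification; richer density estimators are common in SBI).
        self.head = nn.Linear(d_model, 2 * n_params)

    def forward(self, data, config):
        # data:   (batch, n_obs, obs_dim); n_obs may vary per call, since
        #         the transformer handles variable-length token sequences.
        # config: (batch, config_dim)
        tokens = self.obs_embed(data)
        cfg_token = self.cfg_embed(config).unsqueeze(1)
        h = self.encoder(torch.cat([cfg_token, tokens], dim=1))
        # Summarize the data set via the configuration token's final state.
        mean, log_std = self.head(h[:, 0]).chunk(2, dim=-1)
        return mean, log_std

# Usage: two data sets with different sample sizes share one network.
net = CogFormerSketch()
small = torch.randn(1, 50, 2)    # 50 trials
large = torch.randn(1, 500, 2)   # 500 trials
cfg = torch.randn(1, 8)          # encodes which model variant applies
print(net(small, cfg)[0].shape)  # torch.Size([1, 4])
print(net(large, cfg)[0].shape)  # same weights, different sample size
```

Prepending the configuration as a token is one plausible way to let the same weights condition on which model variant generated the data, which is the essence of the meta-amortization the key points describe.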