CogFormer: Learn All Your Models Once

arXiv stat.ML / March 24, 2026

Key Points

  • The paper introduces CogFormer, a meta-amortized, transformer-based framework designed to make simulation-based inference (SBI) reusable across many structurally similar cognitive models without retraining for each variant (one way to formalize this is sketched after this list).
  • It targets a key limitation of traditional SBI amortization: when users change parameterizations, generative functions, priors, or design variables, they typically must retrain, reducing the amortization benefit.
  • CogFormer is reported to remain valid across combinatorially many model configurations, enabling changes in data types, parameters, design matrices, and sample sizes while preserving estimation capability.
  • The authors present quantitative results across multiple decision-making model families, covering binary, multi-alternative, and continuous response settings, and claim a minimal “amortization offset.”
  • Overall, the work positions CogFormer as a potential accelerator (“engine”) for iterating modeling assumptions and scaling cognitive modeling workflows more efficiently.
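To make "meta-amortized" concrete, one plausible formalization (assumed notation; the paper's exact objective may differ) extends the standard neural posterior estimation loss, which trains a network $q_\phi(\theta \mid x)$ against a single model's prior and simulator, by additionally sampling a model configuration $m$ from a space of structurally similar variants:

$$
\phi^{*} \;=\; \arg\min_{\phi}\; \mathbb{E}_{m \sim p(m)}\, \mathbb{E}_{\theta \sim p(\theta \mid m)}\, \mathbb{E}_{x \sim p(x \mid \theta, m)}\!\left[-\log q_{\phi}(\theta \mid x, m)\right],
$$

where $m$ encodes the configuration choices listed above (data type, free parameters, design matrix, sample size). Because configurations are sampled during training, a single network can stay valid across the combinatorial model space rather than being retrained per variant.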

Abstract

Simulation-based inference (SBI) with neural networks has accelerated and transformed cognitive modeling workflows. SBI enables modelers to fit complex models that were previously difficult or impossible to estimate, while also allowing rapid estimation across large numbers of datasets. However, the utility of SBI for iterating over varying modeling assumptions remains limited: changing parameterizations, generative functions, priors, and design variables all necessitate model retraining and hence diminish the benefits of amortization. To address these issues, we pilot a meta-amortized framework for cognitive modeling which we nickname the CogFormer. Our framework trains a transformer-based architecture that remains valid across a combinatorial number of structurally similar models, allowing for changing data types, parameters, design matrices, and sample sizes. We present promising quantitative results across families of decision-making models for binary, multi-alternative, and continuous responses. Our evaluation suggests that CogFormer can accurately estimate parameters across model families with a minimal amortization offset, making it a potentially powerful engine that catalyzes cognitive modeling workflows.
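As an illustration of how such a configuration-conditioned training loop could be set up, here is a minimal PyTorch sketch. Everything in it (`sample_configuration`, the toy simulator, `PosteriorNet`, the Gaussian posterior head) is a hypothetical stand-in, not the authors' code or architecture; it only shows the structural idea of resampling the model configuration inside the training loop.

```python
import torch
import torch.nn as nn

def sample_configuration():
    """Sample a model configuration: number of trials and which of the
    candidate parameters are free. Purely illustrative."""
    n_trials = int(torch.randint(50, 500, (1,)))
    free_mask = torch.randint(0, 2, (4,)).float()  # 4 candidate parameters
    return n_trials, free_mask

def simulate(theta, n_trials):
    """Toy stand-in simulator; a real application would use a decision-making
    model generating choices and/or response times."""
    return theta[:, :1] + 0.1 * torch.randn(theta.shape[0], n_trials)

class PosteriorNet(nn.Module):
    """Transformer encoder over trial tokens, conditioned on the model
    configuration, outputting a diagonal-Gaussian posterior approximation."""
    def __init__(self, n_params=4, d=64):
        super().__init__()
        self.embed = nn.Linear(1, d)
        layer = nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d + n_params, 2 * n_params)  # means and log-stds

    def forward(self, x, free_mask):
        tokens = self.embed(x.unsqueeze(-1))       # (batch, trials, d)
        pooled = self.encoder(tokens).mean(dim=1)  # pool over trials
        cond = torch.cat([pooled, free_mask.expand(x.shape[0], -1)], dim=-1)
        return self.head(cond).chunk(2, dim=-1)    # mean, log_std

net = PosteriorNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(1000):
    n_trials, free_mask = sample_configuration()  # resample the model itself
    theta = torch.randn(32, 4) * free_mask        # prior draw; fixed params at 0
    x = simulate(theta, n_trials)
    mean, log_std = net(x, free_mask)
    # Gaussian negative log-likelihood, scored only on the free parameters.
    nll = 0.5 * ((theta - mean) / log_std.exp()) ** 2 + log_std
    loss = (nll * free_mask).sum(dim=-1).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Two design choices in the sketch mirror the abstract: the configuration is drawn fresh each step, so amortization extends across model variants rather than one fixed model; and because the encoder uses no positional encoding and mean-pools over trials, the network handles variable sample sizes and treats trials as exchangeable.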