NeuroGame Transformer: Gibbs-Inspired Attention Driven by Game Theory and Statistical Physics
arXiv cs.AI / 3/20/2026
Key Points
- The NeuroGame Transformer introduces a dual perspective on attention by treating tokens as players in a cooperative game and as interacting spins in a Gibbs-based physical system.
- It uses Shapley values for global attribution and Banzhaf indices for local influence, combined via a learnable gate to form an external magnetic field that modulates attention (see the first sketch after this list).
- Pairwise interactions are captured by an Ising-like energy, with attention weights emerging as marginal probabilities under a Gibbs distribution and computed efficiently via mean-field fixed-point equations (second sketch below).
- To scale to long sequences, the method employs importance-weighted Monte Carlo estimators with Gibbs-distributed weights, and it comes with theoretical convergence guarantees and a fairness-sensitivity trade-off controlled by an interpolation parameter (third sketch below).
- Experimental results on SNLI and MNLI-matched show strong performance, surpassing ALBERT-Base and remaining highly competitive with RoBERTa-Base, with code released on GitHub.
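To make the game-theoretic field concrete, here is a minimal sketch of how Shapley and Banzhaf attributions could be estimated and blended by a learnable gate. The permutation and random-coalition Monte Carlo estimators shown are standard approximations, not necessarily the paper's; `value_fn`, `GatedField`, and all other names are hypothetical.

```python
# Hedged sketch: gated Shapley/Banzhaf external field.
# value_fn maps a list of token indices (a coalition) to a scalar payoff.
import torch
import torch.nn as nn

def monte_carlo_shapley(value_fn, n_tokens, n_samples=64):
    """Permutation-sampling Shapley estimate: average each token's marginal
    contribution over random orderings (a standard approximation)."""
    phi = torch.zeros(n_tokens)
    for _ in range(n_samples):
        perm = torch.randperm(n_tokens).tolist()
        coalition, prev = [], value_fn([])
        for i in perm:
            coalition.append(i)
            cur = value_fn(coalition)
            phi[i] += cur - prev
            prev = cur
    return phi / n_samples

def monte_carlo_banzhaf(value_fn, n_tokens, n_samples=64):
    """Banzhaf estimate: each token's marginal contribution to a uniformly
    random coalition of the other tokens."""
    beta = torch.zeros(n_tokens)
    for _ in range(n_samples):
        include = torch.rand(n_tokens) < 0.5
        for i in range(n_tokens):
            rest = [j for j in range(n_tokens) if include[j] and j != i]
            beta[i] += value_fn(rest + [i]) - value_fn(rest)
    return beta / n_samples

class GatedField(nn.Module):
    """Learnable gate g in (0, 1) blending global (Shapley) and local
    (Banzhaf) attributions into a per-token external field h."""
    def __init__(self):
        super().__init__()
        self.gate_logit = nn.Parameter(torch.zeros(1))

    def forward(self, phi, beta):
        g = torch.sigmoid(self.gate_logit)
        return g * phi + (1.0 - g) * beta  # h_i = g*phi_i + (1-g)*beta_i

# toy usage: coalition value = coalition size (any scalar game works here)
h = GatedField()(monte_carlo_shapley(lambda c: float(len(c)), 6),
                 monte_carlo_banzhaf(lambda c: float(len(c)), 6))
```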
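For the Ising-style attention, a plausible reading of the key point is the textbook mean-field self-consistency m_i = tanh(beta * (sum_j J_ij m_j + h_i)); the sketch below iterates that fixed point and converts magnetizations to marginal probabilities. The mapping of those marginals onto per-head attention weights is an assumption, since the paper's exact parameterization of the couplings J is not given here.

```python
# Hedged sketch: mean-field marginals of an Ising-like Gibbs distribution.
import torch

def mean_field_attention(J, h, beta=1.0, n_iters=25):
    """Damped mean-field iteration for the energy
        E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i,  s_i in {-1, +1},
    solving m_i = tanh(beta * (sum_j J_ij m_j + h_i)). Returns normalized
    marginals P(s_i = +1) used here as attention-like weights."""
    m = torch.zeros_like(h)                 # magnetizations, unbiased start
    for _ in range(n_iters):
        m_new = torch.tanh(beta * (J @ m + h))
        m = 0.5 * m + 0.5 * m_new           # damping stabilizes the fixed point
    p = (1.0 + m) / 2.0                     # spin-to-probability map
    return p / p.sum()                      # normalize like an attention row

# toy usage: symmetric couplings, external field from the gated attributions
J = torch.randn(8, 8) * 0.1
J = (J + J.T) / 2
weights = mean_field_attention(J, torch.randn(8))
```

The damping factor is a common stabilizer for mean-field fixed points; in practice one would also stop early once successive magnetizations agree within a tolerance.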
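The long-sequence estimator can be illustrated with generic self-normalized importance sampling under Gibbs weights. The paper's actual proposal distribution and variance-reduction details are not specified in the summary, so the uniform proposal and all names below are assumptions.

```python
# Hedged sketch: importance-weighted Monte Carlo with Gibbs weights.
import torch

def gibbs_importance_estimate(energy_fn, observable, n_tokens,
                              beta=1.0, n_samples=512):
    """Self-normalized importance sampling: draw spin configurations from a
    uniform proposal over {-1,+1}^n, weight each by its unnormalized Gibbs
    factor exp(-beta * E(s)), and average the observable. Self-normalization
    gives a consistent estimator as n_samples grows, at the cost of bias."""
    s = torch.where(torch.rand(n_samples, n_tokens) < 0.5, -1.0, 1.0)
    log_w = -beta * energy_fn(s)              # unnormalized log Gibbs weights
    w = torch.softmax(log_w, dim=0)           # normalize weights to sum to 1
    return (w * observable(s)).sum()

# toy usage: mean magnetization under a random symmetric Ising energy
J = torch.randn(8, 8) * 0.1
J = (J + J.T) / 2
energy = lambda s: -torch.einsum('bi,ij,bj->b', s, J, s)
estimate = gibbs_importance_estimate(energy, lambda s: s.mean(dim=1), 8)
```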