Watts-per-Intelligence Part II: Algorithmic Catalysis
arXiv cs.AI / 4/25/2026
💬 Opinion · Ideas & Deep Analysis · Models & Research
Key Points
- The paper proposes a thermodynamic framework for “algorithmic catalysis” within the watts-per-intelligence model, focusing on reusable computational structures that reduce the number of irreversible operations required for a task class.
- It shows that class-specific speed-ups are bounded by the algorithmic mutual information between the computational substrate and the class descriptor, tying achievable performance to stored information content.
- The authors argue that storing and using this class information carries a minimum thermodynamic cost, grounded in Landauer’s principle, which assigns an energy cost of at least k_B T ln 2 per bit erased.
- Combining these results, the paper derives a “coupling theorem”: a lower bound on how long a catalyst must be used (its deployment horizon) before the energy invested in storing it pays off.
- An example using an affine SAT problem class illustrates the framework and connects learned systems to information–thermodynamic constraints on intelligent computation.
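The paper's exact theorem statement isn't reproduced in this summary, but the coupling bound plausibly reduces to a break-even calculation: the Landauer cost of storing I bits of class information (k_B T ln 2 joules per bit) must be amortized over enough catalyst uses. A minimal sketch under that assumption, where the function name, the `joules_saved_per_use` parameter, and the break-even form are illustrative choices, not taken from the paper:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def min_deployment_horizon(mutual_info_bits: float,
                           joules_saved_per_use: float,
                           temperature: float = 300.0) -> int:
    """Illustrative break-even horizon: number of catalyst uses needed
    before the per-use energy saving amortizes the Landauer cost of
    storing the class information (k_B * T * ln 2 per bit).
    All quantities and the bound's form are assumptions for illustration.
    """
    storage_cost = mutual_info_bits * K_B * temperature * math.log(2)
    return math.ceil(storage_cost / joules_saved_per_use)
```

For example, storing 10^12 bits at 300 K costs about 2.87 nJ under Landauer's bound, so a catalyst saving 1 pJ per use breaks even only after `min_deployment_horizon(1e12, 1e-12)` → 2871 uses; doubling the stored information doubles the required horizon.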