AI-pilled Arm CEO teases mystery products that will turn it into a money machine

The Register / 3/25/2026


Key Points

  • Arm's CEO signaled an AI-centric direction, teasing "mystery products" that could turn the company into a "money machine."
  • The article emphasizes Arm's move to break free of the constraints of its IP licensing model, rethinking its traditional revenue model and contract structures.
  • It lays out a scenario in which Arm's next revenue opportunities and value proposition expand in step with growing AI demand.
  • However, specific product details, timing, and technical specifics have not been disclosed, leaving the market waiting for follow-up announcements.


Breaking free of its IP licensing shackles

Tue 24 Mar 2026 // 23:21 UTC

Arm CEO Rene Haas took an ice-cold sip of the AI Kool-Aid during a keynote speech at the company's annual conference on Tuesday, teasing future products that he thinks will pump the British chip designer's total addressable market (TAM) to $1 trillion by the end of the decade.

What are those products? That's a question for tomorrow. Tuesday's event was all about Arm's newly announced AGI CPU products, which will free the company from the shackles of its IP licensing model by enabling it to sell directly to end customers.

Haas has high hopes for agentic AI to accelerate the British chip designer's datacenter business. He predicts that, by the end of the decade, its datacenter silicon will catapult Arm's datacenter TAM to more than $100 billion.

During his Tuesday keynote at the Arm Everywhere conference, the CEO said the company currently competes for a datacenter market worth about $3 billion a year in royalties.

"When we look at what's going on with agentic AI, the growth of CPUs; the benefit that power-efficient CPUs bring to the data center; we think this represents about $100 billion TAM for us in the future,” he said.

These figures are predicated in large part on the belief that agentic frameworks, like OpenClaw, will quadruple the demand for CPU cores.

While models powering tools like OpenClaw will continue to run on specialized accelerators, the agentic systems built atop them don't.

These agents run on CPU cores and need additional CPU compute and memory resources to execute the code generated by the models to automate tasks.

Because these agent interactions aren't necessarily tied to a single user's request – one agent may call other agents to complete a task – the volume of traffic these workloads generate is expected to rise significantly.
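The fan-out effect can be illustrated with a minimal sketch. This is a hypothetical illustration only – the `Agent` class and its `handle` method are made up for this example and are not drawn from OpenClaw or any real agentic framework – but it shows how one user request can multiply into several CPU-side agent invocations as agents delegate to other agents.

```python
# Hypothetical sketch: one user request fanning out into multiple
# agent-to-agent calls. Agent/handle are illustrative names, not a
# real framework's API.

class Agent:
    def __init__(self, name, delegates=()):
        self.name = name
        self.delegates = delegates  # other agents this one may call

    def handle(self, request):
        # Each invocation represents CPU-bound work (e.g. executing
        # model-generated code), possibly followed by further delegation.
        calls = [f"{self.name}:{request}"]
        for agent in self.delegates:
            calls.extend(agent.handle(request))
        return calls

# A "planner" delegating to three workers, one of which delegates again:
# a single request becomes five separate agent invocations.
search = Agent("search")
coder = Agent("coder", delegates=(Agent("tester"),))
writer = Agent("writer")
planner = Agent("planner", delegates=(search, coder, writer))

tasks = planner.handle("req-1")
print(len(tasks))  # 5 agent invocations from one user request
```

Each of those invocations needs its own CPU time and memory, which is the mechanism behind the claim that agentic workloads multiply demand for CPU cores rather than for accelerators.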

Arm already has a role to play here: its instruction set architecture is used in CPUs like Amazon's Graviton.

To further reduce the barrier to entry to adopting its IP, Arm introduced compute subsystems in 2023 – essentially shake-n-bake processor blueprints containing all the ingredients necessary to create custom chips. Customers like Microsoft could tweak the recipe and send it off to their preferred fab to cook.

Yet few organizations have the expertise or resources possessed by Microsoft or other hyperscalers. Arm on Tuesday therefore unveiled its first datacenter silicon to bear the Arm brand. The company worked with Meta on the AGI CPU, and both built it to run agentic systems.

We took a closer look at the 136-core part earlier on Tuesday, but suffice to say Arm is going to need to ship a lot of them if it expects to be more than a minnow in a $100 billion pond.

To Haas' credit, at launch Arm's AGI CPU has already secured big-name customers like Meta, OpenAI, SAP, Cloudflare, and SK Telecom, all of whom intend to deploy the chip when it arrives later this year.

However, few AI shops stick to one silicon supplier. As we reported earlier this year, Meta is also deploying large numbers of Nvidia's Grace CPUs to power its agentic systems, with plans to expand that footprint to include the GPU giant's new Vera CPUs as well. The social networking giant is also buying custom chips from Broadcom. That said, Arm still makes money on every chip that includes its licensed designs, so it wins either way.

Then there are the blue and red elephants in the room, Intel and AMD, which benefit from more than two decades of continuity around their x86-64 architecture.

The CPU market has never been more competitive. However, Arm's EVP of Cloud AI Mohamed Awad argues that the company's AGI CPU is better suited to agentic tasks thanks to a streamlined core that forgoes extraneous functionality and doesn't rely on simultaneous multithreading, which he argues allows for more deterministic scaling.

Whether that design is actually an advantage is up for debate. For Vera, Nvidia opted for simultaneous multithreading (SMT) while Intel has already announced plans to bring hyperthreading back with its Coral Rapids Xeons after briefly abandoning the tech in its upcoming Diamond Rapids parts.

Meanwhile, AMD's latest Epyc processors, due out later this year, will offer up to 256 cores. Even with SMT turned off, that's still nearly twice the core count of Arm's new chip.

To stay competitive, Arm plans to release new chips as early as next year, with a third-gen AGI CPU already under development. ®
