Anthropic reveals $30bn run rate and plans to use 3.5GW of new Google AI chips

The Register / 4/7/2026


Key Points

  • Anthropic disclosed it is operating at an estimated $30 billion annualized revenue run rate while scaling its compute footprint for AI workloads.
  • The company said it plans to deploy 3.5GW of new Google AI chips, signaling a major expansion in data-center demand and hardware partnerships.
  • The report frames Broadcom as benefiting from the silicon build-out that supports these AI chip deployments.
  • Despite the scale-up signals, Broadcom also views Anthropic as a continuing business risk, indicating ongoing uncertainty even with strong growth.
  • Overall, the announcement highlights how large AI model providers are locking in substantial next-generation chip capacity to sustain inference and training growth.


Broadcom's building the silicon and is chuffed about that, but also notes Anthropic remains a risk

Tue 7 Apr 2026 // 01:09 UTC

Broadcom has announced that Google has asked it to build next-generation AI and datacenter networking chips, and that Anthropic plans to consume 3.5GW worth of the accelerators it delivers to the ads and search giant.

News of the two deals emerged today in a Broadcom regulatory filing.

One is a “Long Term Agreement for Broadcom to develop and supply custom Tensor Processing Units (“TPUs”) for Google’s future generations of TPUs.” Google and Broadcom have collaborated to produce custom TPUs. Broadcom CEO Hock Tan recently shared his opinion that hyperscalers don’t have the skill to create custom accelerators and predicted Broadcom’s chip business will therefore win over $100 billion of revenue from AI chips in 2027 alone.

Working on next-gen TPUs for Google will presumably help to make that prediction a reality.

So will the second part of Broadcom’s announcement: a “Supply Assurance Agreement for Broadcom to supply networking and other components to be used in Google’s next-generation AI racks through up to 2031.”

Broadcom’s filing also revealed one user of Google’s next-gen TPU will be Anthropic, which starting in 2027, “will access through Broadcom approximately 3.5 gigawatts as part of the multiple gigawatts of next generation TPU-based AI compute capacity committed by Anthropic.”

The filing includes the following notable statement:

The consumption of such expanded AI compute capacity by Anthropic is dependent on Anthropic’s continued commercial success. In connection with this deployment, the parties are in discussions with certain operational and financial partners.

That sounds an awful lot like Broadcom putting on the record that the financial arrangements needed to deploy 3.5GW worth of custom TPUs for Anthropic carry enough risk to warrant disclosure in a regulatory filing.

In its announcement about the deal, Anthropic seemingly tries to reassure markets about its financial affairs by revealing that “Our run-rate revenue has now surpassed $30 billion—up from approximately $9 billion at the end of 2025.”

“When we announced our Series G fundraising in February, we shared that over 500 business customers were each spending over $1 million on an annualized basis,” Anthropic wrote. “Today that number exceeds 1,000, doubling in less than two months.”

Yet Broadcom still worries about the AI upstart.

Google’s take on the announcements points out that in addition to renting TPUs, Anthropic is a big Google Cloud customer.

Anthropic pointed out that it also uses AWS’s Trainium AI chips, plus Nvidia kit, so it can “match workloads to the chips best suited for them.” ®
