Google explains why its all-in-one AI stack embraces competitors

The Register / 4/24/2026

💬 Opinion | Developer Stack & Infrastructure | Ideas & Deep Analysis | Industry & Market Moves

Key Points

  • Google says its “all-in-one” AI stack is designed to be differentiated while still remaining open to competing tools and ecosystems.
  • The company’s rationale focuses on letting customers choose best-in-class components rather than locking them into a single vendor’s entire workflow.
  • By adopting an open approach, Google aims to reduce friction for enterprises integrating AI into existing infrastructure.
  • The positioning suggests Google wants to compete on performance and integration quality while remaining interoperable with alternatives.

'Differentiated, but open'

Thu 23 Apr 2026 // 17:13 UTC

Google Cloud Next Google Cloud’s Andi Gutmans said that the company holds a structural advantage over its largest rivals in the race to win value from AI agents in the enterprise, arguing that no competitor currently combines cloud computing infrastructure, frontier AI models, and a data platform under one roof.

“We’re really the only provider that has the AI infrastructure, the model and the data platform,” he said in response to a question from The Register during a briefing with reporters on the sidelines of Google Cloud Next.

Gutmans, who runs Google Cloud's data business, including its analytics, transactional databases, storage and business intelligence products, said the integrated stack is critical to achieving value from AI.

"If you think about AWS and Azure, they've got the infrastructure, they don't have the model," Gutmans said. "You look at the data providers, they have the data platform, but they've got to get the infrastructure and model from others. The AI model providers just do the AI model."

Gutmans said that as enterprises shift from AI tools that respond to human queries toward agents that act autonomously on behalf of employees, the significance of those gaps becomes more pronounced. He said that transition puts pressure on the underlying data platform in ways earlier architectures were not designed to handle, and that the economics of running agents at scale reward providers that control more of the stack.

“If you ask ‘How is this agentic data cloud really different, because everyone is saying the same thing?’ The answer is we are uniquely positioned to integrate these things very tightly, which is now more important than ever as you go from human scale to agent scale, because you're going to have to bend the price-performance curve or it's going to be too expensive.”

Gutmans said Google spent the past year and a half rethinking its data platform for the shift to agent scale. He said roughly 90 percent of enterprise data remains unstructured and has historically gone unused, and that the Knowledge Catalog announced at the show is designed to make that data available to agents without requiring armies of data engineers to prepare it manually.

The moment that made the change possible was not a product decision but a model one. He said that, when Gemini 2.5 arrived, there was a tipping point in reasoning capability that forced Google to re-engineer every agent in its data portfolio.

“We’ve completely re-engineered every single one of our agents in the last year. So even the conversation analytics agent, the data science agent, the data engineering agent — we've had to be less prescriptive with the models. That’s where the Knowledge Catalog and the MCPs help, because they’re so much better at reasoning around them. That is the big tipping point," he said. "If you ask a customer how conversation analytics was last year versus now, they'll tell you they couldn't use it last year. It worked for simple stuff."
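Gutmans did not walk through implementation details, but the division of labour he describes — the model does the reasoning while a catalog answers questions about what data exists — can be sketched with the open Model Context Protocol Python SDK. The server, the tool, and the in-memory lookup below are hypothetical illustrations for this article, not Google's Knowledge Catalog API.

```python
# Illustrative sketch only: a minimal MCP server exposing a hypothetical
# data-catalog lookup tool an agent could call, instead of relying on a
# hand-built ontology. Uses the open-source MCP Python SDK (pip install mcp).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("catalog-demo")

# Stand-in for a real metadata store; in practice this would query a
# governed catalog service rather than an in-memory dict.
FAKE_CATALOG = {
    "support_calls": {
        "description": "Transcribed customer support calls (unstructured text)",
        "location": "gs://example-bucket/support-calls/",
        "owner": "cx-analytics",
    },
}

@mcp.tool()
def describe_dataset(name: str) -> dict:
    """Return catalog metadata for a dataset so an agent can reason about it."""
    return FAKE_CATALOG.get(name, {"error": f"no catalog entry for {name!r}"})

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to an MCP-capable agent
```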

He said the company has roughly 80 data-related announcements at the conference this week, and that nearly every agent product in his portfolio has been rebuilt in the past year.

"The models have gone so far," he said. "It's night and day."

He said approaches that required months of manual ontology-building are no longer necessary.

"A year ago, people would be like, 'Let me get Palantir and get 20 people and work for six months and build an ontology.' That's not how you would approach it anymore," he said. "If you really want to activate your whole data estate you can’t do it with people."

The Register asked Gutmans how Google navigates a market in which it simultaneously competes with, and partners with, many of the same software providers.

Google makes its own TPU AI accelerators, but partners with Nvidia on chips. It has a data analytics platform in BigQuery, but also works with Databricks, Snowflake, and Informatica. GCP users can create, deploy, and govern AI agents to carry out tasks across their digital estates, but the platform can also host those same capabilities from partners such as Salesforce and ServiceNow.

“Our view, and I don’t think it’s different than any other hyperscaler, is we want to build the best platform,” he said.

Gutmans said that the integrated stack is a real and durable competitive advantage, particularly as security, governance and cost efficiency become harder to manage across fragmented systems. He said the same principle applies to the cross-cloud lakehouse Google announced this week, which he said allows customers to query data sitting in Amazon Web Services or Microsoft Azure with low latency.
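Google did not share query examples, but from the customer's side the pitch is that data living in another cloud is addressed like any other table once it has been made visible to the platform. A rough sketch using the google-cloud-bigquery client follows; the project, dataset, and table names are invented placeholders, and the cross-cloud configuration itself is assumed to have been set up by an administrator.

```python
# Illustrative sketch: querying a table whose underlying data lives outside
# Google Cloud, assuming it has already been made addressable to BigQuery
# (for example via a cross-cloud or external dataset). All resource names
# below are placeholders, not real projects or tables.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

sql = """
    SELECT region, COUNT(*) AS ticket_count
    FROM `example-analytics-project.aws_sales_data.support_tickets`
    GROUP BY region
    ORDER BY ticket_count DESC
"""

# The query runs like any other BigQuery job; the cross-cloud plumbing is
# handled by the platform rather than by application code.
for row in client.query(sql).result():
    print(row.region, row.ticket_count)
```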

"Differentiated, but open," is how he described Google's approach. ®
