Google Cloud’s OpenAI Deal ‘A Win For TPU Chips,’ Microsoft Partner Says

‘Make no mistake, this is also a win for Google Cloud against Microsoft Azure,’ says a top executive at one Microsoft partner. ‘[OpenAI] has continued, all year long, to loosen its dependency on Microsoft.’

The AI wars are continuing as Microsoft-backed OpenAI has signed a deal with Google Cloud to provide the ChatGPT owner with more cloud computing infrastructure and services.

“This is a win for [Google’s] TPU chips,” said one top executive from a solution provider that partners with Microsoft on a global basis who declined to be identified. “And make no mistake, this is also a win for Google Cloud against Microsoft Azure. … [OpenAI] has continued, all year long, to loosen its dependency on Microsoft.”

Until January 2025, OpenAI was using Microsoft as its exclusive data center provider. However, OpenAI CEO Sam Altman this year has said a lack of compute capacity had delayed several products.

For example, Altman said this week that the release of OpenAI’s first open AI model in years would be delayed. “We are going to take a little more time with our open-weight model, i.e., expect it later this summer but not June,” he wrote this week on social media platform X.

[Related: Broadcom Dumps Registered VMware Resellers; ‘Raising The Bar Across The Program,’ Says Channel Chief]

“OpenAI needs more capacity for their gigantic AI demands. For them to go with an actual real competitor like Google, it just shows how much they need their custom chips for AI—things like ChatGPT need more capacity and power to innovate and iterate on,” said the executive. “Not even Microsoft can handle that demand alone, it seems.”

In March, OpenAI signed a deal worth billions of dollars with fast-growing data center provider CoreWeave to provide more cloud computing capabilities. In addition, OpenAI this year signed partnerships with Oracle and SoftBank on its $500 billion Stargate infrastructure project.

Google TPU Chip Innovation

Google has invested millions in creating its own custom AI chips, including its in-house tensor processing units (TPUs), which have helped Google win over AI startups such as Anthropic.

Google Cloud TPUs are custom-designed AI accelerators optimized for training and inference of AI models. They’re ideal for use cases such as agents, code generation, media content generation, synthetic speech, recommendation engines and personalization models.

Google has its own DeepMind AI lab that competes directly against OpenAI around developing large language models (LLMs) for AI customers.

“Hats off to Google. They made the right R&D moves to win over a customer with lots of money to spend,” said the executive.

In April, Google unveiled its seventh-generation TPU, dubbed Ironwood, specifically designed for AI inference.

OpenAI’s New Google Cloud Deal

OpenAI’s new deal with Google Cloud had been under discussion for months and was finalized in May, according to a Reuters report.

Microsoft and OpenAI are also negotiating to revise the terms of their multibillion-dollar investment, including how much stake Microsoft will hold in OpenAI in the future, according to Reuters.

“[OpenAI] will be spending a ton of money on Google. They have $10 billion in sales. There’s money to be made here by Google,” the solution provider executive said. “We’re a big Azure shop, but I can see Google making some headwinds with its chips and cloud business overall because of the money they’re going to make on OpenAI. ... But we’re still big believers that Microsoft will be the leader in the AI era.”