AMD’s OpenAI Deal A ‘Major Validation Moment’ For Chip Designer: Analyst

While a leading Wall Street analyst says AMD’s new Instinct GPU deal with OpenAI should throw out ‘any lingering fears around’ the chip designer, a channel partner questions whether the company has much to gain right now from expanding Instinct into the channel.

A leading Wall Street analyst on Monday called AMD’s new multibillion-dollar Instinct GPU deal with OpenAI a “major validation moment” and said the agreement should throw out “any lingering fears around” the chip designer.

“With a 10 [percent] stake in AMD this quickly brings [AMD CEO] Lisa Su and AMD right into the core of the AI chip spending cycle and is a huge vote of confidence from OpenAI and [OpenAI CEO Sam] Altman,” wrote Daniel Ives, managing director and senior equity research analyst at Wedbush Securities, in a post to X.

[Related: Analysis: After Big Nvidia Win, Will Intel Ever Escape Its Rival’s Shadow?]

AMD announced Monday morning that it landed a “multi-year, multi-generation” partnership with OpenAI that will see the ChatGPT maker deploy 6 gigawatts of Instinct GPUs. The company also said that it has issued OpenAI a warrant for up to 160 million shares of AMD common stock that will vest upon the completion of certain milestones.

Lisa Su, AMD’s CEO, said in a morning webcast that the deal—which will begin with a one-gigawatt deployment of Instinct MI450 GPUs in the second half of 2026—will generate “tens of billions of dollars” in the coming years and could spur additional revenue of more than $100 billion from other customers during that period.

The news sent AMD’s stock price up by more than 28 percent Monday.

While AMD debuted its Instinct GPUs all the way back in 2017, the company has spent the past few years significantly ramping up investments in research and development for its Instinct portfolio to take on Nvidia. The rival’s dominance of the AI computing space was cemented in 2022 after OpenAI debuted ChatGPT, which at the time relied on Nvidia GPUs.

Despite the major investments it has made in Instinct and related products, AMD’s data center AI revenue has lagged far behind Nvidia’s: the company made more than $5 billion in Instinct sales last year, compared with the $102.2 billion its rival made from data center compute products during roughly the same period.

However, Su said on the Monday webcast that the OpenAI deal represents a “major inflection point for us,” with each gigawatt of the planned six-gigawatt Instinct deployment amounting to “significant double-digit billions of revenue” for the chip designer.

While this will allow AMD to achieve its goal of tens of billions of dollars in data center AI revenue by 2027, the agreement will also have a “compounding effect” that could result in the company making significantly more money from other customers, according to the CEO.

“We also believe that with the massive scale of this deployment and the strong benefits to the overall AMD AI ecosystem, this partnership will enable additional revenue from existing and new customers deploying at scale and has the potential to generate well over $100 billion in revenue over the next few years,” she said.

Su expects this because the OpenAI deal is a “clear validation” of the company’s data center AI road map, which AMD shifted last year to an annual cadence of new GPU releases after previously launching new products roughly every two years.

An AMD executive told CRN earlier this year that the company is not ready yet to make Instinct a broader channel play as part of its expanded partner program.

That’s because AMD is focused on “high-touch” engagements with its biggest customers—including OpenAI, Microsoft and xAI—to ensure they have an optimal experience, according to Kevin Lensing, who runs Americas and hyperscaler sales for AMD.

The chip designer is also refining the software stack, including the recently launched ROCm Enterprise AI, to ensure channel partners can make repeatable sales and integration motions with Instinct-based systems, Lensing added.

“The challenge with doing a channel enablement on Instinct is we can’t enable a model where we have to go one-to-many if we can’t touch them all and deliver a great experience,” he said back in June.

While Lensing couldn’t commit at the time to a timeline for when Instinct GPUs would become a broad channel play, he said AMD’s new partner program is set up to support the product line and expansions into other product categories in the future.

“With this new overall structure that we’ve rolled out, it lends itself to extensibility, to other product lines and to new ways to incentivize on top of the base. That’s the whole concept,” he said.

Alexey Stolyar, CTO of AMD systems integration partner International Computing Concepts, told CRN on Monday that there could be benefits to AMD making Instinct a channel-ready product sooner, but he questioned whether the chip designer has the resources to do so and whether channel revenue would be enough to justify the investment for now.

Based in Northbrook, Ill., International Computing Concepts has credited its strong partnership with Nvidia as a major driver of the 376.5 percent revenue growth the systems integrator experienced over the past two years. That growth landed the partner in the No. 1 spot in its debut on CRN’s Fast Growth 150 list this year.

Stolyar said his company has pushed AMD to go faster on making Instinct viable for the channel, but he understands why the chip designer needs to focus on large customers like OpenAI to develop its largest sources of revenue.

“The channel will probably drive innovation, like different use cases, different tools and so on, because there’s such a wide variety of things [happening in the space]. And so the question becomes: By driving this innovation, do they get a leg up or not? And can they afford to do it now or not?” he said.