Analysis: How Nvidia Showed Its True Power In 2023

CRN explains how Nvidia seized on this year’s AI boom to become omnipresent among the channel’s largest server and cloud vendors and earn more data center revenue than Intel.

If you were a partner attending conferences hosted this year by the channel’s largest data center and cloud infrastructure providers, you would have noticed that all but one put 2023’s biggest name in AI computing front and center on the big stage.

No, not OpenAI CEO Sam Altman. We’re talking about the leader of the company whose GPUs made the snappy responses of OpenAI’s ChatGPT possible: Nvidia CEO Jensen Huang.

[Related: Nvidia CEO Explains How AI Chips Could Save Future Data Centers Lots Of Money]

Huang was at Amazon Web Services re:Invent 2023. He was at Microsoft Ignite 2023. He was at Google Cloud Next 2023. He was at Dell Technologies World 2023. He was at Lenovo Tech World 2023. He was at VMware Explore 2023.

Huang wasn’t at Hewlett Packard Enterprise’s Discover 2023 events, but the server vendor nevertheless announced a “significantly” expanded partnership with the GPU designer at its Barcelona event, with Huang’s top enterprise computing lieutenant on stage.

Nvidia’s omnipresence among the channel’s largest server and cloud service providers is just one example of the company’s growing influence in the IT industry, which increasingly believes AI will become an important facet of many applications and services.

Another way to demonstrate Nvidia’s power is to compare its data center revenue against that of the industry’s largest and longest-running provider of general-purpose server CPUs: Intel.

While Intel’s data center revenue shrank to $3.7 billion in the first quarter of this year, Nvidia’s data center revenue grew to $4.3 billion in roughly the same period, surpassing its rival.

Nvidia’s data center revenue then grew 141 percent to $10.3 billion the next quarter and another 41 percent to $14.5 billion in the third quarter, more than double and more than triple, respectively, what the company made in the same periods a year earlier. Meanwhile, Intel’s rose slightly to $4 billion before slipping back to $3.8 billion over the past two quarters.

This put Nvidia’s data center revenue for the first three quarters of its 2024 fiscal year, which ran from Jan. 29 to Oct. 29 of this year, at $29.1 billion. That’s more than 2.5 times the $11.5 billion in sales Intel earned for data center products between Jan. 1 and Sept. 30.
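For readers who want to sanity-check the comparison, here is a minimal back-of-the-envelope calculation using only the rounded quarterly figures cited above; the exact growth rates Nvidia reported differ slightly because they are based on unrounded revenue.

```python
# Rough check of the data center revenue figures cited above, in billions of
# dollars, rounded to the nearest $0.1 billion as reported in this article.

nvidia_q = [4.3, 10.3, 14.5]   # Nvidia data center revenue, FY2024 Q1-Q3
intel_q = [3.7, 4.0, 3.8]      # Intel data center revenue, CY2023 Q1-Q3

# Sequential growth: roughly 140% and 41% with rounded inputs
q2_growth = (nvidia_q[1] / nvidia_q[0] - 1) * 100
q3_growth = (nvidia_q[2] / nvidia_q[1] - 1) * 100

nvidia_total = sum(nvidia_q)            # $29.1 billion
intel_total = sum(intel_q)              # $11.5 billion
ratio = nvidia_total / intel_total      # about 2.5x

print(f"Nvidia sequential growth: {q2_growth:.0f}%, {q3_growth:.0f}%")
print(f"Three-quarter totals: Nvidia ${nvidia_total:.1f}B vs. "
      f"Intel ${intel_total:.1f}B ({ratio:.1f}x)")
```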

Nvidia has said it expects total revenue in the fourth quarter, which includes other segments like gaming and visualization, to reach $20 billion, plus or minus 2 percent, with “strong sequential growth” expected for data center segment sales.

Intel, in the meantime, expects fourth-quarter revenue to reach between $14.6 billion and $15.6 billion, driven in part by “moderate sequential growth” within its Data Center and AI Group.

Barring any unexpectedly large deviations, these forecasts mean Nvidia will almost certainly finish its current fiscal year with far more full-year data center revenue than Intel.

What Nvidia’s skyrocketing data center revenue shows is how lucrative it’s been for the company to sell powerful and expensive GPUs for demanding AI and data science workloads as the IT industry seeks to turn the hype around generative AI applications like ChatGPT into long-lasting business opportunities that transform the way we work and play.

But Nvidia hasn’t succeeded on the strength of its GPUs alone. These processors are backed by a rapidly expanding software suite, which includes everything from software development kits to orchestration platforms, and they’re increasingly used in systems and motherboards designed by the company in concert with other parts it makes, like CPUs and data processing units.

This is Nvidia’s “full-stack computing” strategy at work, providing the most critical components needed to run AI workloads and others in the realm of what the company calls accelerated computing. And many of the components within Nvidia’s stack have been adopted by the channel’s largest data center and cloud players, AWS and Dell included.

The company’s runaway success has allowed its stock price to skyrocket roughly 240 percent to about $488 per share as of Friday, up from $143.15 at the beginning of the year.

But while Nvidia has dominated the fast-growing AI computing space, the company is facing greater competition than it ever has. And it’s coming while the chip designer builds up GPU capacity to serve businesses that have faced several months of shortages due to high demand.

The companies targeting Nvidia’s data center business include its largest cloud computing allies: AWS, Microsoft Azure and Google Cloud, all of which are developing their own accelerator chips and complementary technologies to compete against Nvidia-powered instances.

There are also scores of startups, backed by hundreds of millions of dollars from venture capital firms, designing novel chip architectures they believe are better suited to accelerate AI workloads than Nvidia’s GPUs, which were originally made for graphics processing.

Then there are Intel and AMD, which have fought fiercely over x86 CPU market share for the past several years and are now seeking to challenge Nvidia’s dominance in the GPU space.

But it’s one thing to build the right combination of hardware and software to serve a market opportunity that will continue to grow quickly for the next several years. (Research firm IDC, for instance, said this week that spending on generative AI solutions alone, not counting other kinds of AI workloads, will grow at an average of 85.9 percent annually to $151.1 billion in 2027.)

What’s equally important is that Nvidia and other players in the AI computing space work hand in hand with channel partners to capitalize on the long-standing relationships solution providers hold with businesses, governments and other organizations to find new opportunities.

And while plenty of partners have already benefited from Nvidia’s meteoric rise in the data center and cloud infrastructure markets, growing competition from Intel, AMD and others will lead to broader opportunities for the channel in the AI computing space.