Nvidia ‘Doubling Down’ On Partners With DGX Cloud Service
Nvidia executives Manuvir Das and Craig Weinstein explain to CRN how the GPU giant will rely on channel partners ‘to a significant extent’ for its new DGX Cloud supercomputing service, designed to help enterprises create and run generative AI applications, and how it will open a range of services opportunities.
Nvidia executives told CRN that the GPU giant is “doubling down” on the channel with its new DGX Cloud supercomputing service, which will open a range of services opportunities that will “move at the speed of light” for partners because of the focus on software.
In an interview, Manuvir Das, Nvidia’s vice president of enterprise computing, said the Santa Clara, Calif.-based company will rely on partners within its Nvidia Partner Network “to a significant extent” to sell DGX Cloud, which gives enterprises quick access to the tools and GPU-powered infrastructure to create and run generative AI applications and other kinds of AI workloads.
Das said Nvidia’s go-to-market plan with partners for DGX Cloud builds on the work the company has done in the channel with its DGX systems, which combine Nvidia’s fastest data center GPUs with the Nvidia AI Enterprise software suite and serve as the underlying infrastructure for the new cloud service.
“This is about Nvidia truly growing up as an enterprise company and saying that we’ve learned in the last few years the value of this ecosystem and now we’re doubling down with this ecosystem,” Das said.
DGX Cloud is hosted by cloud service providers, with initial availability on Oracle Cloud Infrastructure. The service is expected to land at Microsoft Azure in the third quarter, and Nvidia said it will “soon expand to Google Cloud and more.”
Das said DGX Cloud will have multi-cloud and hybrid-cloud capabilities thanks to Base Command, Nvidia’s software that manages and monitors training workloads and allows users to right-size the infrastructure for what their applications require.
At launch, each DGX Cloud instance will include eight of Nvidia’s A100 80GB GPUs, which were introduced in late 2020. The monthly cost for an A100-based instance will start at $36,999, with discounts available for long-term commitments. DGX Cloud instances with Nvidia’s newer H100 GPUs will arrive at some point in the future with a different monthly price.
While Nvidia plans to offer an attractive compensation model for DGX Cloud, Nvidia Americas Channel Chief Craig Weinstein said the cloud service will make services offered by partners “even more valuable” because DGX Cloud shifts the opportunity from hardware to software.
“Instead of our partners spending so much time building data centers, the quality of the work is going to happen a lot faster. So the services opportunity for our partners is probably going to move at the speed of light and will increase sequentially over time,” said Weinstein.
An executive at one of Nvidia’s early DGX Cloud partners, cloud-focused MSP SADA Systems in Los Angeles, said there is no vendor better positioned to seize the generative AI opportunity with enterprises than Nvidia, and he already sees big potential with the new cloud service.
“I’m redirecting business and technical development resources from other areas to the relationship with them as a result of DGX [Cloud]. It’s really material for us,” said Miles Ward, CTO at SADA.
What follows is a transcript of CRN’s interview with Das and Weinstein about the “significant” opportunity for partners with DGX Cloud, how the cloud service helps enterprises build generative AI models with proprietary data, and what kinds of services opportunities it will create for partners.