
Nvidia's Ian Buck: A100 GPU Will 'Future-Proof' Data Centers For AI

Dylan Martin

'By having one infrastructure that can be both used for training at scale as well as inference for scale out at the same time, it not only protects the investment, but it makes it future-proof as things move around,' says Buck, Nvidia's head of accelerated computing, in an interview with CRN.

How is the pandemic changing that demand for AI? I know there's a lot of research that needs to be accelerated right now for very important purposes. For the overall landscape, what are you seeing?

It's funny. I was talking to someone about this earlier in the week, and some of the supercomputing centers are actually having a problem because they're seeing spikes in demand. Apparently researchers aren't going to meetings anymore, or spending time on other things. They're at home submitting jobs to supercomputers to do their work. From a data center perspective, you can see a difference, but it's actually been quite healthy in the sense that people are able to do their work and have been able to submit their jobs, wherever they are. They don't need to be in the data center to do their work. And that continues.

Certainly the interest and demand for AI continues to be important for home assistants, for teleconferencing, for social media. These are all areas where you and I interact every day and even more so now that we're home. And those same services hit the data center, hit the cloud, and, of course, need AI to process the data and make decisions, whether they be recommenders or content filtering and such things. Things have been fine from that standpoint with COVID. There's obviously other impacted businesses and industries. But, overall from a data center standpoint, we've been fine.

Does Nvidia anticipate how a post-COVID-19 "new normal" in society could impact demand for AI?

We'll have to see how that plays out. I'm not going to predict the future of the pandemic per se. The focus right now has been about helping find maybe not a cure but at least a treatment to help de-risk the impacts of COVID. Certainly we're seeing, and you saw with the White House COVID response, which Nvidia has joined, finding drug candidates that can help understand the COVID-19 virus better, understand the different mechanics of how the virus infects a cell and understand which drugs could be used or deployed to help mitigate the effects if not cure it. We simply need to create a treatment that can make COVID less fatal — and we can all feel a little bit better about going out of our homes.

To do that, we just need to understand the dynamics of how the virus works and the literally billions of drug candidates, or substructures of drugs, that could be used to interact with it. It takes a very long time, but Nvidia is no stranger to this kind of simulation. Molecular dynamics simulation was actually one of the first tenets of our GPU computing strategy, [which] was to help the research community with applications like Gromacs and Amber, which were largely pioneered by the academic community and have been GPU accelerated for 15 years now. We've been collaborating with supercomputing sites around the world to help accelerate and apply the technology and further optimize some of the applications. And there's a whole bunch of work going on in that area. So right now, that's the focus. Certainly, you could see a future where we can learn from this and build a smarter early warning system to understand and detect these kinds of epidemics earlier, perhaps deployed in smart city scenarios.
