
Intel, AMD, Ampere and Nvidia Leaders on Oracle’s Next-Gen Cloud Lineup

Next year, Oracle will outfit its public infrastructure cloud with cutting-edge processors from Intel, AMD and Ampere, and its new Nvidia A100-powered GPU platform comes online next week.


Nvidia Enables Cutting-Edge AI

Next week, Oracle Cloud will make generally available its new GPU platform powered by Nvidia A100s.

The accelerated compute instances, developed with feedback from customers increasingly adopting artificial intelligence, aim to differentiate from competitors through their ability to network large clusters of bare-metal servers. Oracle also will add the option to attach up to 25 terabytes of local storage and up to 2 terabytes of memory to those instances, Magouyrk said.

Customers can create clusters of up to 512 GPUs to train large machine learning models, he said.

Nvidia CEO Jensen Huang congratulated Oracle on the launch in a discussion with Magouyrk.

Nvidia’s data center business has been doubling every year “off of really large numbers,” Huang said, thanks to the recognition about a decade ago that “software trained with AI really wants to be accelerated with GPUs.”

While accelerated computing “starts with an amazing processor,” it’s also reliant on software, acceleration libraries and the ML models sitting on top of those tools.

The deep learning software propelling Nvidia’s growth is “the big bang of AI today,” Huang said.

Oracle’s architectural approach lets extremely large numbers of GPUs work in parallel on machine learning models, with Nvidia Mellanox network interfaces rapidly moving data between nodes.
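The parallel pattern behind clusters like this is data-parallel training: each GPU computes gradients on its own slice of the data, then a collective "all-reduce" averages those gradients so every node applies the same update. The sketch below is a toy CPU simulation of that idea in plain Python, not Oracle's or Nvidia's actual software stack; all function names are illustrative.

```python
# Toy data-parallel training step: simulated workers each compute a local
# gradient, an all-reduce averages them, and every worker applies the same
# update. Real clusters run this on GPUs over fast interconnects.

def local_gradient(shard, weight):
    """Gradient of mean squared error for the model y = weight * x on one shard."""
    return sum(2 * (weight * x - y) * x for x, y in shard) / len(shard)

def all_reduce_mean(values):
    """Average a value across all workers (the collective communication step)."""
    return sum(values) / len(values)

def train_step(shards, weight, lr=0.01):
    grads = [local_gradient(s, weight) for s in shards]  # each "GPU" works on its shard
    g = all_reduce_mean(grads)                           # synchronize gradients
    return weight - lr * g                               # identical update on every node

# Data generated from y = 3x, split round-robin across 4 simulated workers.
data = [(x, 3 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]

w = 0.0
for _ in range(200):
    w = train_step(shards, w)
print(round(w, 2))  # → 3.0
```

Because every worker ends each step with the same averaged gradient, the model stays identical across the cluster while the data-crunching is spread over all nodes; the speed of the all-reduce step is why the inter-node network matters so much.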

AI researchers want “the ability to iterate on very large models to make the perfect model,” Huang said.

“We’re going to put this technology in the hands of enterprise customers all over the world. We think this is the next great adventure for us and we’re really excited to do it with Oracle,” Huang said.
