Intel Highlights Data Center Wins Ahead Of AMD EPYC 'Rome' Launch

'While the customer challenges are complex, we really think this strategy of delivering platform-based solutions is what's best going to meet their critical business challenges,' Intel exec Jennifer Huffstetler said of the company's data-centric platform strategy.


With the server processor market about to heat up, Intel is highlighting recent customer and ecosystem wins and reiterating the advantages of the company's data-centric platform strategy with its expanding portfolio of CPUs, memory, accelerators and software.

"Workloads are continuing to diversify, the customer needs are evolving, and that's why our portfolio is as it is today," said Jennifer Huffstetler, vice president and general manager of data center product management and storage at Intel, Santa Clara, Calif.

Intel Monday also inked a new partnership with Lenovo aimed at accelerating artificial intelligence and high-performance computing innovation by bringing together Lenovo’s portfolio, including its TruScale Infrastructure, with Intel technologies.

Huffstetler’s comments come as rival AMD prepares to launch its second-generation EPYC server processors August 7.

Intel’s Xeon processor family is likely to face increased competition from one of its top rivals in the data center market as AMD is projecting a "significantly faster" sales ramp of its EPYC "Rome" processors over the first-generation product.

But Intel is operating on an entirely different level, according to Huffstetler. On top of holding more than 90 percent of the x86 server processor market, the company is chasing a total addressable market of $300 billion, much larger than the $50 billion market it touted a few years ago.

"While the customer challenges are complex, we really think this strategy of delivering platform-based solutions is what's best going to meet their critical business challenges," Huffstetler said.

Solution providers working with Intel have said the company’s data-centric strategy has created a sea change for how partners talk about the chipmaker's products and how they can address customers' pressing data needs.

"We've done a good job of selling Intel over the years, but we didn’t knock on anyone's door to talk about Intel's strategy; it was inherent to what we were doing," Scott Miller, senior director of strategy partnerships at World Wide Technology, recently told CRN. Now, with the company's data-centric strategy, "we're actually selling Intel as a brand," he said.

Customer Wins For Optane DC, DL Boost

Huffstetler pointed to Intel's Optane DC persistent memory as an example. After launching alongside Intel's second-generation Xeon Scalable processors in April, Optane DC now has a pipeline of more than 200 ongoing proofs-of-concept with large enterprises, she said. That includes Google Cloud, Microsoft Azure and T-Systems, which are all using Optane DC to optimize their SAP Hana workloads.

While the company recently announced a partnership with SAP to optimize its applications on Optane DC, Huffstetler said Intel's persistent memory solution is also being used to improve the performance of other enterprise applications such as Redis and Hadoop, as well as VMware vSphere and ESX.

"We're seeing more traction in what we call the density use case, where with more memory footprint now you can add more containers or more [virtual machines] to a system," Huffstetler said.

Another platform-based solution gaining momentum is Intel's Deep Learning Boost, a new feature in Intel's second-generation Xeon Scalable processors that accelerates artificial intelligence workloads. The feature is already supported by several leading AI frameworks, including Caffe, MXNet, ONNX, PyTorch and TensorFlow.

Huffstetler said several software vendors are now shipping AI applications that have been optimized to run on DL Boost, including Digitate, Presagen and Cloudwalk. These vendors are among the more than 200 companies participating in the Intel AI Builders Program, which aims to help them optimize their AI applications on Intel architecture.

"These builder partners are the ones that help us achieve the dozens of deployments today across a range of industries that represent millions of end customers," she said.

Amazon Web Services is using DL Boost to improve AI inference workloads on its C5 instances by a factor of nearly four over instances that don't use the feature, according to the executive. Chinese tech giant Baidu is also getting behind Intel's AI efforts with its early support of DL Boost and a partnership to co-develop Intel's Nervana Neural Network Processor for Training.

Huffstetler said this early momentum signals that AI will eventually become an important component of all enterprise applications.

"It's really not a question of will they include AI. It's a question of when," she said. "So that's why these differentiating technologies like DL Boost that we're integrating into the core Xeon product are going to help future-proof that infrastructure for our customers moving forward, as they can't really predict which application is going to land on that infrastructure."

Xeon Platinum 9200 Series Finds Momentum

Dave Hill, product marketing head for Intel's Data Center Group, said while hardware is one way Intel improves performance, software also plays a major role.

As an example, he said, Intel's Xeon Platinum 8280 processor provides up to a 14X improvement in inference throughput performance while the Xeon Platinum 9282 processor provides up to a 30X boost over the previous-generation Xeon Platinum 8180 thanks to new software optimizations.

"It's just a deep investment in software on top of our hardware that really drives the customer applications to performance levels that are generally a ton more than you would just get out of direct hardware improvements," Hill said.

Hill said the Xeon Platinum 9200 series, the highest-end processors in the second-generation Xeon Scalable lineup, has largely found traction with high-performance computing customers, primarily in government, though there has been some interest from commercial HPC customers as well.

"The majority of it has been in HPC environments where the customers just need the absolute highest performance gains possible," he said.

Other recent customer wins in Intel's Data Center Group include Ericsson, which is using Intel's 10-nanometer "Snow Ridge" wireless access chip for 5G base stations; Nokia, which is using Intel's Quick Assist Technology to boost network performance; and Rakuten, which is using Intel's Xeon Scalable processors and FPGA accelerators to build a "virtualized end-to-end, cloud-native mobile network."