CRN Exclusive: Intel VP Poulin On Partner Opportunities For AI In The Data Center And Optane's Multibillion-Dollar Potential

Poulin On The Record

When it comes to Intel's data center business, the growth opportunities are "almost endless" for partners, said Shannon Poulin, vice president of the sales and marketing group and general manager of the industry sales group at Intel, Santa Clara, Calif. "Think about switching and routing, legacy switching and routing. It's now turning to virtualized switching and routing," Poulin told CRN in an exclusive interview during Intel Partner Connect 2018 in National Harbor, Md.

At the conference, Poulin talked about Intel's highest priorities for its Data Center Group, the multibillion-dollar opportunity for the company's Optane memory module in data centers, how the company plans to take on artificial intelligence chip startups, and whether data center partners will have any role with autonomous vehicles and connected cars.

"If AI is early days, autonomous vehicles is also very early days, and you see the same sort of thing -- a fragmented market of non-volume economics solutions right now. We believe we can be a player to bring volume economics into that market," Poulin said.

What follows is a portion of Poulin's conversation with CRN.

How would you describe Intel's overall data center business right now?

The data center business right now is, quite candidly, phenomenal. You've seen our earnings reports. The strength of the business [can be seen when] you look at everything that's getting connected in the world. Think about the number of IP addresses that are out there. It's a very simple measure. Depending on the size of your family and the size of your house, the number of IP addresses that you have continues to go up. That drives data, that drives compute traffic and, ultimately, somebody has to move it on a network, store it somewhere and ultimately compute with it, which is all part of our data center business. Whether it's our device at the endpoint or somebody else's device at the endpoint, it's going to be increasingly our devices in the network, our devices in the storage area and our devices in the compute side of that world. We're definitely looking at the business right now as a huge growth opportunity.

What are the highest priorities for the data center?

I would say a few different things. One, we've certainly seen cloud technology take off. That's both on- and off-premise -- public cloud, private cloud. Two, transforming networks, whether that's a wireless network [or] 4G -- we're starting to see the way people are looking at radio access networks, central offices and the way that they move data over those networks and, ultimately, a telecom data center. Or even a wired network that exists within a data center, where you're starting to virtualize that network -- much like compute before it was virtualized. Now everything is virtualized in the compute space. The network right now is not virtualized, but everything is getting virtualized now.

What would you say are the growth opportunities for partners in the data center?

They're almost endless. Every place they're leaning in. From a storage standpoint, we're seeing a lot of non-volatile storage get deployed, which causes people to rethink the way that they might virtualize storage. If you think of mainframe storage, you think of big legacy [storage area networks], and the reality is, those are getting converted to virtualized storage, which is increasingly non-volatile memory on individual server nodes that's just getting virtualized across multiple nodes. It's the same story on the compute side. Think about switching and routing, legacy switching and routing. It's now turning to virtualized switching and routing. Our partners are looking at that and saying, 'OK, how can I get in there and sell into those companies, those accounts that are virtualizing their compute, their networking, their storage all together?'

How much of a channel opportunity is artificial intelligence in the data center?

They're taking advantage of it today in the two main areas [of AI]: training and inference. The inference today is happening largely on Xeon-based servers and largely by our channel customers that are in this room and OEMs who supply to increasingly large companies [and then people that supply cloud service providers that offer AI-optimized instances]. So that's how the inference is happening today, and when [a service] recommends a book for you or a movie for you based on either your previous purchase history or somebody else's, that's a form of inference [made possible by] training that they did based on AI.

A lot of that training today is happening on competing products. And we are entering that space in multiple different ways, either with new optimized compute engines that we're putting into that space or with Xeon that we're optimizing for the training space, so that's an area for our partners to either play with a competitor or increasingly play with us. Given the relationships we have, we believe that they're eager to play with us.

Would Intel prefer to have a larger market share on the training side?

Absolutely. It's an area we are investing [in]. What we found is that initially we had a general-purpose CPU that was amazing. It powers almost all the cloud in the world, almost all the inference in the AI space in the world, almost every data center that you can see in the world. What we saw was that an accelerator, or optimized piece of silicon, has been one of the things that people have started out with in the training space. We're now moving into the space in multiple different ways. One of them will be with our general-purpose CPU, which only gets us so far. The next one will be with an optimized piece of silicon, which will get us even further, and that will put us on a much more competitive playing field.

How does Intel plan to stay ahead of the new AI chip startups that are starting to crop up?

When we look at the AI space, it is early, early days, and we look at some of the things that go on and we say, 'Oh, I see somebody's recommending a shirt or a book or whatever.' That is early, very rudimentary AI. One of the things we're doing is we're investing in technology that uses different forms of learning. Our drone business is an interesting example of that technology -- how you fly a drone, how far drones stay away from each other and that sort of thing. We plan to use that technology and bring it into the automotive [space]. You see drones at the Olympics and you think, 'Wow, really cool, they make a pattern, and they don't bang into each other.' It's hard to wrap your head around the fact that there might be vehicles on a roadway that move in unison because they're talking to each other and staying away from each other, and having anti-collision is really important, so applying technology that we're investing in now to future use cases is important for us. We will continue to invest, we will iterate every year, every two years, and we believe that long term we have staying power.

It's also about optimizing our frameworks. Think about Caffe, Theano and some of the things people are doing to bring some of these frameworks together to get developers on board. That's another area where we believe in the ecosystem play that we did around compute. We invest a lot in optimizing Windows and Linux and software. We have thousands of software engineers that optimize software to run best on Intel architecture. We're going to bring that engine to bear in the AI space so that when we iterate, it's not just improving the hardware, it's improving the software and the overall experience. And we're in this for the long game.

Is there a channel opportunity for connected cars and autonomous vehicles on the data center side?

Initially, it's going to be limited. Initially you go in and design a product specifically for a vehicle manufacturer or their ecosystem. You might work with somebody like a Bosch or somebody that supplies into the car ecosystem, like Delphi. Those are almost like car OEMs, if you think about it. They make very fixed car modules, and there are only so many car frames right now that you can make in the world. If you think about volume economics, though, you also have to think about the ability to get one thing that runs in multiple different vehicles. So when we bought Mobileye and we look at compute, we think about, 'How do I get volume economics to play here?' This isn't necessarily just a one-off every time. So what we're thinking is making a platform that we would have around Mobileye or our autonomous efforts that ultimately emerges as the de facto standard. Again, if AI is early days, autonomous vehicles is also very early days, and you see the same sort of thing -- a fragmented market of non-volume economics solutions right now. We believe we can be a player to bring volume economics into that market.

How much of an opportunity is smart city technology for channel partners?

Anything that was analog that converts to digital is a good thing. If I think about a streetlight, it's pretty analog today and we don't have any silicon in the vast majority of the streetlights that are in the world. If I think about parking, it's paint and asphalt, and we don't have a lot of business in paint and asphalt, right? So when I think about those markets, anything we can do as a company to help them get to digital is a good thing. By the way, even if it means the first version adopts a low-cost competitive digital offering, that's great. Because [the way] we think of it, it will generate data that will need to be moved on a network, stored on an array somewhere, computed and analyzed on some level. So for us, that's a good thing. … From a smart city perspective, we're doing some things where we're big in digital surveillance, putting compute at the edge, because a lot of people don't want to move a lot of data. Like, you don't have the bandwidth to move 4K video from tens of thousands of cameras in the city. You just don't have the bandwidth to do that. You need to put compute at the edge, which means it's not just a camera with optics -- it's actually a camera with optics and some level of AI or analysis capability, and then some ability to compress the video that you actually send. If I do facial recognition, I can push some level of facial recognition to the edge, so that I only need to send the relevant pixels back. So those kinds of things -- anything that turns analog to digital -- are good for us and good for our channel as an industry, and then we believe it also creates opportunities for cities to deliver services less expensively than they're doing today.

How big of an opportunity do you think Optane is for the data center from the partner perspective?

We looked at the memory market in general, and it's not just data center, it's data center plus client. When we looked at the opportunity at first, we didn't sell DRAM. So the first thing you have to say is, 'If there's a lot of silicon in the world somewhere, we might want to get in there and compete where it makes sense.' The other thing that we saw was that when we brought our Moore's Law performance leadership and manufacturing leadership to bear, we felt like we could innovate and deliver greater density for a lower cost. So then we entered the market with Optane, and you're seeing it first on a client -- we have a module for the client. You're also seeing it on the solid-state disk, so a non-volatile memory-based drive to replace hard drives. And then you will see it later this year on a module that goes into a data center server. The ability to put an entire set of analytics or an entire database into memory delivers you just massive performance improvement, because then I don't need to go through the interface from the CPU to the hard drive. I can go through a super high-speed interface that connects memory directly to the CPU, and the performance of that is exponentially better than you could get if you had to do something on a hard drive at the edge. It's not just the SSD and HDD comparison -- that is a comparison that exists today in favor of SSDs. This is a completely different discussion. It's non-volatile memory connected directly to the CPU, replacing DRAM, which is a completely different paradigm. And we felt like we could do it cost-effectively. We felt like we could offer a good value proposition, and that's where you're going to see us push Optane in the data center.

Is that something that a lot of partners are going to be doing or will it be more of a specialty?

It will absolutely be broad. It will be a DIMM [dual in-line memory module] that you can go deploy. It is a multibillion-dollar opportunity for us and our partners. That group, if you look at our public disclosures, grew from two point something billion last year to three point something billion this year, and we've made statements around very large double-digit growth, percentage-wise, this year.

In his Intel Partner Connect keynote, Greg Ernst, vice president of Intel's sales and marketing group, said the memory shortage is over. Could you say a little bit more about that?

First, the demand for non-volatile solid-state disks has been tremendous, and there's a lot of people in that market -- us and others -- providing SSDs, and last year there was a constrained environment. We then increased capacity. We put a dedicated fab in place, which is the first time we've had a dedicated media-producing fab since we exited the DRAM business in the mid-'80s. That fab is producing memory for us now, so that alleviates some of the constraint we had last year. We are still expecting something like a 60 percent increase in the number of bits shipped year over year, so it's still a staggering number.