Besides HPC applications with large data sets, are there any other applications or use cases Intel is looking at this year for Xeon and Optane?
I want to drive further with the team into some of the standard baseline virtualization applications that can benefit from larger memory sizes. They may not all need persistence, and the continued utilization and investment in persistence will happen over time. It's an ecosystem change that will take a while to happen, and that's fine. We'll just keep pushing it and driving it. But I really think there's a lot more to be done, and that there's a lot more capability in the hardware people are landing in their data centers than they've turned on. So we're always working with customers, and this might sound counterintuitive, to increase their server utilization.
Now you can say, 'Well, if they only use a server 30 percent of the time and then they buy another one, that's good for you,' which it is. But by and large, we want them to use their capacity more fully and then turn on new use cases, and eventually it all works out that they buy more. If we had the mindset that every time you offer customers more efficiency, they buy less, we would never have invented virtualization. You could have said, 'Oh my gosh, you can load four instances onto one server, end of the world,' [and it turned] out to be one of the greatest accelerants of our business over time. So we don't look at those things with fear. We look at them as opportunity. The more we enable, the more we grow.