Tech Giants Say AI PCs Will Change The Industry As They Hunt For ‘Killer Apps’

While many AI tools and services are easily accessed through desktop applications and powered by cloud infrastructure, major players in the PC ecosystem believe there will be a substantial market in the coming years for PCs that enable AI experiences partially or entirely through the computer’s local processor. CRN dives into the topic for AI PC Week.


There may come a day soon when popular AI applications are processed less in cloud data centers and more on personal computers, and some of the world’s largest tech companies believe this shift will represent a “sea change” in the industry and set off a new wave of demand for PCs.

When pioneering generative AI tools like the ChatGPT chatbot and DALL-E image generator arrived last year, they kicked off a frenzy of hype and spending among businesses, driven by the belief that these cutting-edge AI capabilities could significantly improve productivity, make operations more efficient and enable new products and business models.

[Related: HP CEO: First AI PCs Will Land In Second Half Of 2024, Adoption Will Take ‘Some Time’]

But while many of those tools and services are easily accessed through desktop applications and powered by cloud infrastructure, major players in the PC ecosystem believe there will be a substantial market in the coming years for PCs that enable AI experiences—especially those in the realm of generative AI—partially or entirely through the computer’s local processor.

“We see the AI PC as a sea change moment in tech innovation,” said Intel CEO Pat Gelsinger in his keynote at the company’s Intel Innovation event in September.

Those rallying behind the so-called AI PC movement include some of the biggest names in the computer industry, as CRN reports for AI PC Week in December. They include Microsoft, with its AI-enabled Copilot services already invading Windows and several apps, as well as major PC vendors like HP Inc., Lenovo and Dell Technologies.

Just as important are semiconductor companies like Intel and AMD, which are building the chip architectures needed to process cutting-edge AI applications, as are the dozens, if not hundreds, of independent software vendors (ISVs) building AI-powered apps and features.

“The next step change for PCs is how the silicon uses that performance and efficiency to bring entirely new, more personalized AI experiences right on the device,” said Áine Shivnan, vice president of engineering for Qualcomm, at its Snapdragon Summit in October where the company revealed its plan to take on Intel and AMD with an AI-enabled processor for Windows PCs.

But while these companies believe AI PCs will boost device demand for years, they will need to make a strong case for why individuals and businesses should get computers tuned for AI experiences when customers can already access state-of-the-art capabilities from more powerful systems in the cloud, according to the leader of a major solution provider in Canada.

“There’s a high curiosity about it. But the question really is, for the way we are using AI currently, whether it is through Microsoft Copilot or [ChatGPT owner] OpenAI, it all seems to be cloud-hosted. I want to understand what are the benefits in localizing AI to your own device,” said Harry Zarek, president and CEO of Ontario-based Compugen, No. 62 on CRN’s 2023 Solution Provider 500 list.

“What’s really going to be important is all of the software vendors demonstrating real benefit from having an AI-enabled PC versus the way we’re doing it currently,” he told CRN.

Vendors Show How AI PCs Will Work

As Intel’s leader, Gelsinger sees a deeply personal appeal in AI PCs: a better way of hearing and understanding the world around him, including those who are most dear to him.

“One of my favorite sounds is hearing my granddaughter call me, ‘Papa, Papa Pat.’ And if it were not for my hearing aids—and I have a family, almost every one of them has lost their hearing—I might not be able to hear that in the future,” Gelsinger said on the keynote stage at this year’s Intel Innovation.

To demonstrate how AI PCs could address this issue, Gelsinger pointed to Rewind, an app that turns a computer into what is essentially a personal ChatGPT.

“Rewind is a personalized AI powered by everything you’ve seen, said or heard,” said Dan Siroker, co-founder and CEO of Rewind, on stage with Gelsinger. “The way it works is it captures your screen and your audio, it compresses it, encrypts it, transcribes it, and stores it all locally on your PC. And then best of all, you can ask any question of anything you’ve seen, said or heard.”

While Siroker and Gelsinger demonstrated that the app can use a powerful cloud-based large language model like OpenAI’s GPT-4, it can also tap into a model processed by the PC’s processor—in this case Intel’s upcoming Core Ultra chip set to launch for laptops next week—to perform similar work.

Using Meta’s open-source Llama 2 model running on Core Ultra, Siroker asked Rewind what Gelsinger’s favorite sound is. Since Rewind had been recording during the keynote, it responded three seconds later: “Pat’s favorite sound is his granddaughter’s voice calling him papa.” He then asked Rewind to summarize Gelsinger’s keynote, and it provided a three-paragraph response shortly after.

Later in the keynote, Gelsinger used a Core Ultra-powered laptop to showcase another AI PC use case with personal appeal: a proof-of-concept by Intel and hearing aid maker Starkey Laboratories that allowed Gelsinger to use his hearing aids to focus on audio coming from his laptop or from around him.

But it didn’t stop there. While Gelsinger was on a video call with an Intel employee, the app informed him that a person near him was trying to get his attention, and it prompted him to switch the hearing aids to ambient mode so he could hear the other person.

Once Gelsinger returned to the laptop, he realized the Intel employee was still talking, albeit in French, but the app quickly and automatically provided the executive with a translated summary of what his subordinate said while he was away.

The app was not only recording, detecting and transcribing Gelsinger’s surroundings through multiple audio channels, but also tracking his gaze to determine what he was focusing on. All of it was made possible by the Core Ultra’s neural processing unit.

About a month later, Lenovo made its own case for AI PCs with a demonstration of an AI assistant—powered by a computer’s processor and trained on private, local data—that could create travel plans with more personalized responses as fast as, if not faster than, a cloud-based AI model.

“Very soon on your AI PC, you will be able to build a local knowledge base, run a personal foundation model, perform [augmented reality] computing and use natural interactions with it,” said Lenovo Chairman and CEO Yuanqing Yang at the company’s Tech World 2023 event in October.

Why Vendors Think AI PCs Are The Next Big Thing

When Qualcomm detailed its plans to compete with Intel and AMD with its upcoming Snapdragon X processor for AI PCs, HP CEO Enrique Lores delivered a succinct explanation for what many vendors see as the core benefits of running AI applications on local hardware versus the cloud.

“The ability to run generative AI applications locally will enable more personalized experiences, improve latency, provide better security and privacy protections, and reduce costs. And all of this is good news for the customers we serve because it is going to make them more productive than ever before,” he said.

Privacy is an oft-cited concern for businesses that don’t feel comfortable feeding proprietary information into cloud-based AI applications. And the cloud costs of inference, the process in which an AI model makes a prediction or generates a response, can be exorbitant for large language models and other massive models due to the amount of computing power required.

“A lot of IT organizations either don’t have the existing infrastructure, or they don’t easily have line of sight to the budget required to go build out that infrastructure or pay for that infrastructure,” said Matt Unangst, senior director of commercial client and workstation at AMD, in an interview with CRN.

“Looking at how we move some of those solutions onto the client device and port some of those AI models to run on the client device is, I think, a key trend we’re going to see over the next couple of years,” he added.

But that doesn’t mean the cloud won’t have a role to play in AI PCs. At the same Snapdragon Summit event where Lores spoke in October, Microsoft CEO Satya Nadella stated that AI applications at large will rely on some combination of processing in the cloud and in PCs.

“I think that we are literally going to have lots and lots of applications, which will have local models, will have hybrid models. And that, I think, is the future of AI going forward,” he said.

With Microsoft Copilot acting as its “marquee experience” for AI PCs, Nadella said he sees the AI service becoming the “orchestrator of all your app experiences.”

“For example, I just go there and express my intent, and it either navigates me to an application or it brings the application to the Copilot, so it helps me learn, query, create and completely changes, I think, the user habits,” he said.

The Rise Of AI Chips For PCs

What’s making the rise of AI PCs possible are new processors coming from Intel, AMD, Qualcomm and Apple that come with dedicated engines for handling AI workloads.

Qualcomm and Apple were the first to release processors with these dedicated elements a few years ago. In Qualcomm’s case, this is the neural processing unit inside its Snapdragon chips, and in Apple’s case, this is the neural engine inside its M-series chips for Mac computers.

AMD, on the other hand, earlier this year debuted its first processors with its AI engine, the Ryzen 7040 series for laptops. And Intel plans to launch its first chips with its neural processing unit—the Core Ultra series, previously known as Meteor Lake—next week for laptops.

These dedicated AI engines, which exist alongside the CPU and GPU on the same chip, are designed to offload from the CPU a variety of AI and machine learning workloads, from voice recognition to video analysis and even large language models, at low power, which can improve a laptop’s battery life.

“There will be a significant portion of Microsoft Copilot that can and will take advantage of our Ryzen AI technology,” AMD’s Unangst said. “You can imagine what I’ll call a digital assistant, the ability to auto-summarize meetings or documents that we think is going to be really, really useful for business users.”

But AI PCs won’t rely on a processor’s dedicated engine for every AI workload. In some cases, AI applications may benefit more from the CPU or the GPU.

Intel, for instance, has said that Core Ultra’s NPU is best suited for “sustained AI and AI offload,” while the CPU offers a “fast response ideal for lightweight, single-inference low-latency AI tasks.” The chip’s GPU, meanwhile, comes with parallel computing and throughput capabilities that make it better for “AI infused in the media, 3-D applications and the render pipeline.”
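The division of labor Intel describes can be pictured as a simple dispatch heuristic. The sketch below is illustrative only: the `pick_engine` function and its workload categories are hypothetical, not part of any Intel API, though real frameworks such as OpenVINO do let applications name a target device (such as “CPU,” “GPU” or “NPU”) when compiling a model.

```python
# Hypothetical sketch of the engine-selection guidance described above.
# The workload flags and pick_engine() are illustrative, not a real API.

def pick_engine(workload: dict) -> str:
    """Map a workload profile to the on-chip engine best suited to it."""
    if workload.get("media_or_3d"):
        # Parallel, high-throughput work: media, 3-D and render pipelines.
        return "GPU"
    if workload.get("sustained"):
        # Long-running background AI offloads to the NPU at low power,
        # freeing the CPU and preserving battery life.
        return "NPU"
    # Lightweight, single-inference, low-latency tasks favor the CPU.
    return "CPU"

print(pick_engine({"sustained": True}))    # e.g. background transcription
print(pick_engine({"media_or_3d": True}))  # e.g. live video effects
print(pick_engine({}))                     # e.g. a one-off lightweight query
```

In practice an application would not hand-roll this logic; it would rely on the vendor’s runtime to place each model on the appropriate engine.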

The ability to run AI workloads on the CPU and GPU means that a significant number of computers can support these applications today, thanks to the proliferation of frameworks and tools such as Intel’s OpenVINO, which optimizes AI models for a variety of processor types.

In many cases, these capabilities are coming to PCs through software updates at an increasingly rapid pace. In October, for instance, AMD enabled support for the PyTorch machine learning framework on two of its top Radeon GPUs, while Nvidia released a software library that significantly increased the inference performance of large language models on its RTX GPUs.

Much of this work is setting the stage for what vendors said will become the norm with PCs that infuse AI into many aspects of personal computing.

“If you think about the not-too-distant future, you’re not running one [AI] copilot. You’ll have a copilot doing transcription, a copilot running translation, a copilot creating automated imagery, a copilot filling in the gaps in what you’re talking about with contextual information,” said Dell CTO John Roese in a July meeting with investors.

The Search For ‘Killer Apps’ Is On

While the first wave of computers branded as AI PCs has hit the market this year and more are to follow in the coming months, vendors backing the movement said they are still searching for the wealth of AI experiences that will make these devices essential.

At the Intel Innovation event in September, Gelsinger told ISVs that they have a key role to play in ensuring the success of the AI PC.

“We’ve always gone through this question of what’s the killer app? And my simple answer to that is you. You’re going to be the ones creating these next-generation applications and use cases,” he said.

To incentivize development, Intel has launched what it has called the “industry’s first” AI PC Acceleration program, which provides developers with a bevy of engineering, design and marketing resources. More than 100 ISVs are already in the program, including Adobe, Zoom Video Communications and Cisco Systems.

AMD, for its part, has launched a contest that will award a $10,000 prize to a developer who creates a compelling application for Ryzen-powered AI PCs.

And vendors are offering these incentives on top of tools and frameworks that are designed to accelerate development and enable AI applications on these new types of devices.

While vendors are betting newly available and forthcoming hardware on the promise that a wide swath of compelling AI experiences will soon arrive on PCs, they believe it’s necessary to prepare businesses and consumers for what’s coming next.

“As we’re thinking about PC purchases now, typically we want to think about a three- or four-year PC life cycle,” AMD’s Unangst said.

“Businesses, IT decision-makers, they’re looking at that now going, ‘Hey, should we future-proof our decision around a purchase today to make sure that we can take advantage of some of these capabilities that we may not know every detail [about] today but we’re very confident they’re going to be there as a part of the overall set of software and experiences over the next couple of years?’” he added.