Dell Client Solutions Group President Sam Burd Sees ‘Bright Future’ For PCs
‘I look at the phase that we’re in now, we’re arming accelerators, processors and sensors on the device so it becomes increasingly intelligent. So I think we’re going to have apps that will live in and access the cloud, but putting capability on the device equals a better experience,’ Burd tells CRN during Dell Technologies World 2025.
Sam Burd, president of Dell Technologies’ Client Solutions Group, said PC users need and are demanding more performant devices just as AI work migrates to the endpoint both at home and at work.
“I look at the phase that we’re in now, we’re arming accelerators, processors and sensors on the device so it becomes increasingly intelligent,” he told CRN during Dell Technologies World 2025. “So I think we’re going to have apps that will live in and access the cloud, but putting capability on the device equals a better experience.”
He called out Microsoft’s approach with Copilot Plus that lets users access the cloud but also take advantage of capabilities on the device to improve latency and security and lower costs.
“It’s your data on the device, which is also really appealing,” he said. “[Dell Technologies CEO] Michael [Dell] talked this morning about AI coming to where the data is, which, when you think about the edge, is where all this data is created. Having that kind of performance and that architecture on the edge makes a lot of sense.”
[RELATED: New PCs, Servers To Drive ‘Breakthrough’ Year Unveiled At Dell Technologies World 2025]
At this year’s show, Dell introduced the industry’s first notebook with an enterprise-grade discrete NPU: a Dell Pro Max with two Qualcomm Cloud AI 100 chips inside, together capable of more than 450 tera operations per second (TOPS) and of running 109-billion-parameter models locally on the device.
Dell is also rolling out Dell Pro AI Studio, a framework that gives IT shops governance over which models are approved, where they run and who can use them, along with a security layer designed for AI PCs.
“When I look at what the ISVs are doing, they’re going to take advantage of the horsepower and NPUs that we put in the device,” he told CRN. “So I think there’s a bright future for the PC. I think there’s a bright future for infrastructure, and there’s a bright future for technology in general.”
Burd has worked at Dell since 1999, rising to lead the company’s Client Solutions Group, which last year posted full-year revenue of $48.4 billion and continued to hold the No. 1 market position in several categories, including commercial AI PC sales with 27.7 percent of the market, North American commercial sales, PC workstations, PC monitors and overall PC revenue.
Here is more of what Burd had to say in an interview with CRN.
You just introduced the industry’s first discrete NPU inside a notebook—this is a Qualcomm chip capable of inferencing massive AI models. If this is the beginning of the race, that only seems to lead to more and better outcomes for the end user, is that right?
Yes. And I think we’re seeing AI become really important on the endpoint.
What we’re doing with the PC inference card is really just adding more and more capability at the edge. We’re starting to see Copilot Plus apps from Microsoft. We talked about them a year ago, and they’re now out in the latest releases of the OS. We have ISVs adding intelligent apps.
But I think the really exciting thing to me is businesses being able to run their own apps on the data that’s sitting on the edge on the PC. We’re starting to see that here. Not every company is ready to go deploy that immediately.
But to me, the more important thing is people start thinking about what’s possible and go, ‘I have an AI agenda in my company that we believe is going to make my team, and the company, better. I should be planning that out with the infrastructure person and the edge, and I’m going to run AI workloads across all of that. These PC edge devices are going to be really capable in the future, so I need to build that into how I’m thinking about the equipment I’m buying, how I’m going to architect applications, etc.’
That’s the exciting piece, and the opportunity right now, like you said: it’s fun to be at the early days of something exploding. It’s the same thing [if you] think back to CPU performance in the beginning, when it was leaps and bounds.
When it comes to agentic AI, Dell CTO John Roese has said 80 percent of agentic work will happen outside the data center. What happens when an agent, as it moves through systems to carry out tasks, gets to an endpoint like an older PC or device that can’t support it? Or am I thinking about that all wrong?
I think endpoints will get to where they’re able to support that. I think the agents will run. John probably described a hybrid kind of world rather than a single device.
But, like you said, we might as well take advantage of the hardware that you have on a device. So tomorrow we’re going to show our view of what a world would look like with agents on a device that help you basically just go get work done.
Think about the support experience today, or the process of optimizing a system, whether you’re a home user or in a corporate environment, and how all that stuff works together.
We see a world where agents reside on the device. Agents reside on private clouds in the enterprise. Agents are on kind of infrastructure equipment that sorts that stuff out behind the scenes for the user.
I think that’s exciting. There’s a lot to work out. Like, how are they going to go work with each other? How do they work with other agents in the environment?
But the promise to me is we’re going to see a day where you just expect that in the systems and the products that you buy. I want them to work. I want them to self-resolve. I don’t want to have to be the family IT expert or my own personal expert in setting this stuff up.
We hear the PC refresh is underway. I know you are in a quiet period, but for our readers, are there signs that the refresh has arrived and that it’s time to lean in?
If you look at the last quarter, at [research firm] IDC’s results on growth, the mature geographies had strong growth in the commercial space. I would say we are seeing growth in the PC business. We’re seeing companies refresh.
In talking to companies here, and I talk to them all the time outside of Dell Technologies World, they’re sorting out how to get their PCs moved over. I was just with one company that’s almost done. I was with another one that’s just starting on that whole process.
Mileage varies, but people understand the October date [of Windows 10 end of life]. They’re working to get their users migrated. They’re also looking at it and saying, ‘Wow, this technology is really impressive.’
Battery life matters a lot to customers. If you think about x86 architecture, it has improved by leaps and bounds over the last couple of generations. I would say the Qualcomm parts we launched a year ago were outstanding.
We’ve seen x86 improve a whole lot. So that’s good for end users. And then they’re going, ‘Wow, I can get AI and 40-TOPS kind of devices thrown into the mix with what I go buy now.’ So they’ve kind of put all that together.
We’re seeing growth in the PC business; it has taken us a little while, but we’ve now seen a couple of quarters of growth. It’s pretty robust in mature geographies, which tends to mean the refresh is happening.
If you talk to Microsoft, there’s still a lot of PCs to refresh, though. That means we’re going to go past October with more systems that need to be bought. But I also see IT thinking about, ‘I need to put great devices in people’s hands and more capable devices.’
So this also isn’t like, ‘Hey, buy now and I’m done forever.’
When you get three years down the road, we’re going to have a much better device there that’s going to allow people to do more. And investing in that technology is, thankfully for all of us who work in the space, really part of the equation for companies being successful: how they’re arming their people with the technology to do the job better, keep their customers happier and win.
You mentioned some of the different processors that Dell is using inside its PCs. Do you still see silicon diversity, whether it’s Intel, Qualcomm or AMD, being a part of the lineup going forward?
I think we see there’s a competitive world around silicon, and it’s been good. To me, competition and the desire to go win and do better stuff for customers is really good. So we’re going to ship the best parts for our customers.
We’re always looking at whatever options are out there, and you see choices in our product lineup. I think Intel’s Lunar Lake products are really good. There are good parts from AMD; there are good parts from Qualcomm. And I’m excited with the road map that we have. We’re going to keep looking at things, and we’re going to put the best products out there for our customers.
Dell is No. 1 in commercial PCs and then there are other categories where Dell is not No. 1. Where do you see the channel playing a role in helping Dell bridge that gap?
The channel plays a big role today. If I think about our commercial business, or the whole PC business, the revenue of our business is larger than anyone else’s in the PC space, and a significant amount of that business flows through our channel partners.
So you were at the [Partner] Summit earlier. We work with that team to have them up to speed on the products that we’re offering to engage our customers. We sell stuff direct. We sell stuff through the channel. We meet our customers how they want to engage with us.
The channel has been a part of this business for the 26 years I’ve been at Dell, and it has been a key part of our business from when we were a lot smaller than today to where we are now. And, I think, trailing 12 months, [Dell has] a little bit under a $50 billion PC business. I expect that to be the case in the future.
I see partners that are able to engage customers in different verticals to extend the reach and take the great products we have and turn that into solutions for our customers. So I’m excited about that kind of partnership now and [what it will] look like in the future.
Dell PCs went through a massive lineup change earlier this year, consolidating several lines under a few names: Dell, Dell Pro and Dell Pro Max. Are you all in on the new nomenclature?
I’m definitely all in on the new nomenclature. I will say we thought about it a whole lot because some of these names have been around for pushing 30 years. But our whole goal was to make it easier for our customers today and make it easy for prospective customers to understand our lineup.
And we think we did that with names that basically call things what they are. ‘Pro’ is professional grade for our commercial customers. ‘Max’ is maximum performance for our workstation customers and customers who need huge performance. We put three tiers in place with the same kind of naming across each of those product lines.
It makes it a lot easier for our customers to decode. It makes it a lot easier for our teams to decode. We had to, as you’d expect of anyone that’s been doing something for 27 years, hit the reset button and go, ‘OK, here’s a new way of looking at that.’
But feedback has been very good, whether it’s new people coming to Dell or our current customers going, ‘Make it simple for me.’ I think it’s worked really well.
You have PCs that can run LLMs locally, but they’re not coming installed on the device from what I’ve seen. Is that coming? Are we going to get PCs that have Llama 4 already installed, or is that something where you let the customer decide?
If you think about the apps that people are going to run on their device—so you can think about Microsoft Copilot Plus putting models on the device—ISVs are doing the same thing.
What we are doing, and what we’re going to talk more about with Dell Pro AI Studio, is basically how do we make it easy? What you describe is the part people can do: an AI developer can go download models and run them on the device.
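As a rough illustration of the developer workflow Burd describes, and assuming off-the-shelf open-source tooling rather than Dell Pro AI Studio, a developer might pull a quantized open model from Hugging Face and run it entirely on a laptop with the llama-cpp-python runtime (the model and repository names below are examples):

```python
# Minimal sketch: download an open model and run it locally on the PC.
# Assumes llama-cpp-python and huggingface-hub are installed
# (pip install llama-cpp-python huggingface-hub); model names are examples only.
from llama_cpp import Llama

# Fetch a 4-bit quantized GGUF build from Hugging Face and load it on-device.
llm = Llama.from_pretrained(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",   # example repository
    filename="llama-2-7b-chat.Q4_K_M.gguf",    # quantized to fit in laptop RAM
    n_ctx=2048,                                # context window
)

# Inference runs locally; the prompt and the data never leave the device.
result = llm(
    "Summarize the key points of this support ticket in two sentences: ...",
    max_tokens=128,
)
print(result["choices"][0]["text"])
```

The harder part, as Burd explains next, is managing and updating those models across a fleet of PCs with IT and security in the loop.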
Where we see they have more of a challenge is when they go, ‘OK, I want to run this model in a workflow on the device,’ and then they have to go spend time with IT.
So, they spend time with security and go, ‘Well, how are you going to keep that updated? How are you going to manage that model on the devices?’ All the things we’ve been doing on software and other apps that run on the device, you have to go do that on models.
The AI Studio does that; it basically abstracts the type of device. So you go, ‘I can take the model and what I want to run in a workflow. I can run that across different types of silicon, different types of devices, and I can use the infrastructure we have to keep all that software up to date.’
Apps, they’re going to work. They’re going to have models. Third-party ISVs will put those models on the device. When companies want to do it, their developers can do it. We need to make it easy for them to then deploy the models across their PCs.
And I think, like we talked about in the beginning, that’s the super advanced stage right now. We’re at the point where people are just starting to see, ‘OK, these things are showing up in Copilot Plus. And now I can finally see this is good. I really like this.’
They’re going to be building as we get down the road. A year from now, or two years from now, we’re going to see companies running these AI workloads on the device, and they’re going to need that abstraction to go, ‘OK, how do I go from the model builder trying stuff out to getting that broadly deployed?’ And those are the tools we’re putting in place to make that easier.