HPE’s Uli Seibold On How HPE-Nvidia AI Factory Solutions Are Helping Partners Build Revenue Streams In A ‘Do-Or-Die’ Channel Moment
HPE Vice President of Global Hybrid Solutions Ulrich “Uli” Seibold says the next-generation HPE-Nvidia AI Factory offerings—including the second generation of Private Cloud AI—are helping partners build AI consulting and integration revenue streams in a “do-or-die” moment for the channel.
“My call to action is if you are thinking about investing in AI, do it now,” said Seibold in an interview with CRN. “Don’t wait any longer. It is so massive... It is unbelievable. There are hundreds and thousands of new names and players coming. If you do not do this, you are going to be out of the market.”
Seibold said the AI partner opportunity—which will be front and center at Nvidia’s GPU Technology Conference (GTC) this week in Washington, D.C.—is moving at a much more rapid clip than the flexible hybrid cloud as-a-service opportunity that HPE has been driving for nearly a decade with HPE GreenLake.
“If I think about our Flex GreenLake business, that was an evolution,” he said. “This is happening at four or five times the speed. This is massive. I have not seen anything like this in the last 40 years.”
HPE’s second-generation Private Cloud AI platform—co-developed with Nvidia—is now available in a small, upgradeable form factor for the HPE ProLiant DL380a Gen12 server with Nvidia’s new RTX Pro 6000 Blackwell server GPUs. HPE said the new offering delivers three times better performance and is now upgradeable.
“With the new technology with the Gen12 Server with the Nvidia RTX Pro 6000 Blackwell Server Edition GPUs with the integration with Private Cloud AI, we also have the opportunity with the new technology to provide an upgrade path,” said Seibold. “In the future customers can start small and then go to the next technology. This is a natural development we can do now. With the chipset and bus structure before, it was not possible. This is an opportunity for partners and for GreenLake as well. So customers start typically in the enterprise small. They want to pay for what they use. They can go into this as a service or in a traditional model.”
Among the new HPE offerings is an agentic AI smart city solution built on Private Cloud AI that was co-developed by Nvidia and $16 billion solution provider powerhouse SHI.
Seibold said he sees SHI as a “role model” for how to succeed in the fast-moving AI market.
“What they provided on their home turf is the managed service, the integration service, they bring the ecosystem together, doing part of the consulting service,” he said. “I would say they are a role model. What we are doing in all our enablement workshops is talking with our partners and distributors, leveraging them to build an ecosystem. You cannot be fast enough to do all of it by yourself. You need to go into a partnership.”
HPE also unveiled a new air-gapped, secure-by-design AI offering for the HPE Alletra X10000, providing a private and secure sovereign offering that is gaining appeal in some countries, especially in Europe.
In addition, HPE unveiled a version of the HPE data fabric that now provides agentic AI-powered governance. “That gives us more automated, tool-based operations where the agents manage the data flow,” said Seibold. “This is a huge step forward.”
Seibold said HPE—which unveiled its first AI pilot program in May 2024—so far has trained 400 partners on how to build an AI practice through intensive workshops, with about 80 to 100 of them actively selling AI solutions.
“We have an internal limitation in terms of the resources to help partners deploy and develop Private Cloud AI,” he said. “The limitation is more internal than external. We could do much more than we are doing. Internally we are unfortunately resource-constrained, but we’re working on that as you can imagine.”
Seibold said HPE is “investing heavily jointly” with partners on AI solutions to build “revenue and margin streams” for the future. “This is consulting,” he said. “It is higher than resell or getting agent fees. That is nice but the value of the partner for the future is doing the [AI] integration, the operation of it, having ISVs on their platform and doing all the changes on the platform.”
What is the biggest money-making opportunity with the new HPE Nvidia announcements including Private Cloud AI Gen 2?
It is the consulting and the integration [services]. That for me is the big thing. We are in a fantastic time. We are building the future revenue streams with and for partners.
All the announcements you heard about today relate to the AI Factory in one way or another.
Data is coming from everywhere. From the customer’s data center. From the edge. From the hyperscalers. From all different SaaS vendors. One of the big money-makers for partners is to integrate all that data, whether it is on HPE or non-HPE infrastructure.
Then you need to do all the integration of the workflows of the data. The data fabric is also a super key element for our partners.
The important thing in our AI Factory is the control plane. Hence, we have our Morpheus and OpsRamp acquisitions. Everything goes through and is managed with the platform. So Morpheus for us is the control plane. It is independent of where the data is coming from, whether it is HPC [high-performance compute], Private Cloud AI or compute power with GPUs. All those elements come together. There are storage improvements as well.
With multitenancy, control access and service catalogs, you have everything you need to manage different data streams across the world from different SaaS vendors, hyperscalers, on-prem or from the edge. That is the key asset. Then there is the management layer so you can manage it with the overview and dashboards with OpsRamp.
This is where partners can and will earn most of their money. For me, this is the essential part of this. With all the enablement and training we are doing, the partners need to be clear what their role is as a partner. Your role as a partner can be infrastructure integration. Without a modern and secure networking infrastructure, AI will not be successful.
Then if you look a step ahead, it’s all about business process integration. Some partners have started doing this kind of consulting, but very often these are a different type of partner. Typically, they are the cloud-native business process consulting partners. The good or the best partners are buying or building these capabilities because that is the biggest revenue stream. Whether a customer is optimizing costs within its current processes or building a new business, this is a consulting effort partners need to do themselves or they need to work within a partner ecosystem to do it.
The next big lever for partners is the data maturity model. Without a modern data structure you cannot succeed. Today we are now offering agentic AI for data fabric [which enables data governance with a unified data layer].
Data fabric is part of our Private Cloud AI. Data fabric can be used with all hardware vendors or, in our own terminology, in our compute, storage and high-performance compute technology. In the end, it is all about how to manage, how to operate and how to deploy data.
Today we are announcing this [data fabric] will be automated [with agentic AI]. We announced Agentic AI for networking at HPE Discover. Now we are announcing it for data fabric. That gives us more automated, tool-based operations where the agents manage the data flow. This is a huge step forward.
But think about what I said before: Our partners need to build a data consulting practice to implement, to structure the data, and to decide who gets access to the data, and then if you build it you can automate it through agentic AI. So that is the next big lever. If partners are not able to build this by themselves, they need to do this through the partner ecosystem.
I just came from Madrid and had a great discussion with a distributor who can bring that ecosystem together. With the example of the smart city, SHI was the first worldwide partner for this. So they built the ecosystem with our Unleash AI partners. They built the consulting capabilities to build a smart city in Vail with traffic control, skiing events, parking tolls, weather conditions, all automated. Then you can [point] the driver with the number plate ABC to Parking Lot B. This is all managed automatically. The value of SHI in this case is they brought this all together.
This is the enablement we have been doing for more than a year: how to bring an ecosystem together, how to build services, how to integrate software vendors on top of this.
By the way, 75 percent of all the order streams and revenue streams are going through partners. Our partners understood they can build now their future margin and revenue streams. That’s fantastic.
All the distributors—the big ones—are now investing in the technology. They have built their consulting capabilities. They want to do the proof of concept. I got a couple of commitments around this during our distribution conference.
SHI played a key role in the smart city example. So every partner needs to do this kind of integration?
Exactly. This is just one use case. There are many, many others. In the U.S., WWT, Trace3, Mark III Systems, Connection are all investing massively in the technology to become the center of a partner ecosystem driving workloads or industry-vertical-specific solutions ideally based on our Unleash AI ISVs.
What we are doing with Orange Business in France, Uniserver in the Netherlands and many, many others, plus the local ISVs, is train them, test them and bring them together with the partners so we have an end-to-end solution. Then we can go horizontal or ideally industry-vertical-specific into a market.
What is the significance of the second generation of HPE Private Cloud AI now available in a smaller form factor?
With the new technology with the Gen12 Server with the Nvidia RTX PRO 6000 Blackwell Server Edition GPUs with the integration with Private Cloud AI, we also have the opportunity with the new technology to provide an upgrade path.
In the future, customers can start small and then go to the next technology. This is a natural development we can do now. With the chipset and bus structure before, it was not possible. This is an opportunity for partners and for GreenLake as well. So customers start typically in the enterprise small. They want to pay for what they use. They can go into this as a service or in a traditional model.
What is the opportunity for partners in the sovereign cloud with HPE’s air-gapped solution?
Now we are providing our platform including HPE Private Cloud AI, Alletra [Storage MP] X10000, in an air-gapped solution.
In Europe, private cloud is becoming more and more important. There is a huge discussion in the press because some [offerings] are not 100 percent disconnected or 100 percent sovereign. The applications and processes in that case are dominated by hyperscalers.
We have a $2 billion funnel [for these sovereign clouds]. Partners can drive these air-gapped solutions. That means partners driving our platform with Morpheus, OpsRamp, the infrastructure technology, compute and storage in a secure local data center by local vendors.
Is the sales ramp with the AI solutions what you expected?
The units are starting faster than expected. The order size is a little bit below what I personally expected. Why? Because partners need first to develop their own solution.
They need to test. They need to build their strategy. Now with the upgrade possibility, we can scale heavily not just in units but in dollars as well.
What we see now is partners starting small, building their business models and then they are going big. This is maybe a normal evolution. You need to build a business. Then you can grow.
Personally, I expected that to be higher because of the influence from the hyperscaler market and all the announcements we see with large language models.
Enterprises need to be efficient. They need to have a TCO [total cost of ownership]. So partners are starting super-fast on units, then they develop the solutions.
All those will move into service provider or colocation data centers. Why? Because we are talking here about massive power and massive, massive network infrastructure. A normal partner or even a midmarket enterprise does not have the data center capabilities to have 12- or 8-kilowatt power consumption. This is the next step for our service providers: AI workloads. Fifty-three percent of my total addressable market is in services with MSPs and CSPs at the moment. Customers need to work with those service providers to be able to provide the power needed. That’s fantastic news for partnerships as well.
So how big is the colocation opportunity in the AI era?
It is massive. A normal partner will not have a data center to drive this kind of scale or have the fast internet connection. So colo and service provider is a home run. There are Digital Realty, Equinix and many, many others.
Is SHI—which did the smart city AI solution—a great example of what partners should be doing here?
Exactly. It’s the model of how to evolve in an AI market. They are not developing their own applications. That is not their home turf. That is why we linked them to ISVs. They built a model to do business integration. But they are not the biggest business process integration company, so they worked with ecosystem partners to do the consulting stuff. Then they built a data maturity model for them. They used data fabric to create a data model to see who gets access to the data, where is the data. Now we can build on that with agentic AI.
What they provided on their home turf is the managed service, the integration service, they bring the ecosystem together, doing parts of the consulting service. I would say they are a role model. What we are doing in all our enablement workshops is talking with our partners and distributors, leveraging them to build an ecosystem. You cannot be fast enough to do all of it by yourself. You need to go into a partnership.
Then ideally you need to focus on verticals where you are strong historically. From there you can expand into newer or other industries. This is not just government or health care. All industries are investing whether it is finance, manufacturing, logistics. Every partner has a home turf. They need to have a trusted community they can build the ISV ecosystem around and the consulting ecosystem around it. Then they operate it and manage it through Private Cloud AI.
What kind of growth are you seeing in Private Cloud AI from partners?
We have trained more than 400 partners. I would say 80 to 100 is the core. These are the core partners we are working with to drive business with them.
What impact will the second-generation Private Cloud AI have on these partners?
The second generation will now help them scale. Today they need to decide whether they are using a small, medium or large configuration. Now they can scale because they can go from small to medium to large. Hardware performance is not a big topic anymore.
The performance is so heavy. It is there. It is not a showstopper anymore. It is about the software. It is about the integration. It is not about infrastructure.
The good news is we have a heavy workload for Blackwell. Now it is there. We can scale. But that is not the showstopper. The showstopper today is the availability of the integration capabilities and the software.
How big a lead do you think you have given the AI Factory solutions you are bringing to partners?
In the beginning there was a lot of education. We talked about a six- to nine-month advantage in time to market [with HPE Private Cloud AI]. That is just one element.
What I realized now with all the fast changes, in particular with all the new technology Nvidia is developing from a software perspective, all the blueprints, the NIMs, the software stacks, and the predefined industry vertical solutions, is that partners realize if they did that picking, choosing and testing themselves, it would cost money and take time.
Now they see it is much better having that in an integrated solution, and with every update it comes into Private Cloud AI automatically. So they can use the 80 percent predefined industry workloads rather than developing them all by themselves.
So now they have time to focus on the industry, the market and the enterprises instead of spending time and money on pre-integration and evaluation with a new release. Partners now realize this and they are benefiting.
What kind of momentum are you seeing with Private Cloud AI?
This is fantastic. I would say every partner at the moment is talking with us about what they can do and how they can start to implement it. They want to know about price points and what is the business model they should create. At the moment, we could do more than we are able to because we need to do consulting [and training with these partners] first.
We do workshops with these partners. We define the strategy. We look at their strengths and make sure they are aligned [with the opportunity]. Then we discuss whether they should go into Private Cloud AI or compute with GPUs.
You saw the outcome with the smart city [project that was done by SHI]. That is just one example. There are others. But that [smart city example] is fantastic because it brings everything together. That includes our Unleash AI program where we integrate the ISVs and the consulting capabilities where we bring the business process partners together.
With the smart city example in Vail, you can show how, with the increased population in wintertime, it is a massive workload that can scale. This shows the power of a technology like this.
How do you feel about how fast Private Cloud AI is ramping in the channel with the HPE training and enablement?
We have an internal limitation in terms of the resources to help partners deploy and develop Private Cloud AI. The limitation is more internal than external. We could do much more than we are doing. Internally we are unfortunately resource-constrained, but we’re working on that as you can imagine.
We are working to have more and more people build this out with our partners. All our partner business managers across the globe are behind this. For the most relevant partners, we have coverage. But there are even more partners that want to get this training.
We are resource-constrained, not market-constrained.
How many partners have completed the training and are now selling Private Cloud AI?
We have roughly 400 partners, but our key focus is on 80 to 100 partners.
I will be able to do more over time.
Over four decades you have been through many significant market transitions including the internet boom and the cloud computing boom. Is this AI opportunity the hardest transition ever for partners to make?
It is more difficult because everything partners did in the past was on their home turf. So before they were doing IT integration services or a managed service around IT. Now it is more about business process, data maturity and ISVs. That’s a bigger lift than in the past. It is outside their home turf. That’s the big shift.
What is your message to partners about the AI opportunity?
We are investing heavily jointly with the partners, building their revenue and margin streams for the future. This is consulting. It is higher than resell or getting agent fees. That is nice but the value of the partner for the future is doing the [AI] integration, the operation of it, having ISVs on their platform and doing all the changes on the platform.
This is how partners can scale a model instead of selling through an RFQ [request for quote]. That is fine, but building business and margin streams for the future is a totally different business model.
What is your call to action to partners on the opportunity ahead, and what do they have to do to start?
My call to action is if you are thinking about investing in AI, do it now. Don’t wait any longer. It is so massive.
It is unbelievable. There are hundreds and thousands of new names and players coming. If you do not do this, you are going to be out of the market.
Is this a do-or-die moment for partners?
Yes, I would say it is do-or-die.
Is this moving faster than any other market transition you have seen?
Absolutely. If I think about our Flex GreenLake business, that was an evolution. This is happening at four or five times that speed. This is massive. I have not seen anything like this in the last 40 years.
This is so massive and to be fair we are talking now about the beginning of agentic AI. We are not there. We are at the beginning of this.