HPE Discover 2023: Everything You Need To Know About HPE’s AI Public Cloud Service

HPE Tuesday burst into the burgeoning public cloud AI market with HPE GreenLake for Large Language Models, a public cloud service that for the first time provides supercomputing in a consumption-based cloud model. Here is everything you need to know about HPE GreenLake for Large Language Models, which was unveiled at HPE Discover 2023.

HPE Enters Generative AI Market With A Public Cloud Supercomputer Service

Hewlett Packard Enterprise Tuesday burst into the burgeoning public cloud AI market at HPE Discover 2023 with HPE GreenLake for Large Language Models, a public cloud service that for the first time provides supercomputing in a consumption-based cloud model.

HPE GreenLake for Large Language Models, a generative AI service, will be powered by HPE Cray XD supercomputers with Nvidia H100 GPUs. It will be available by the end of the calendar year, starting in North America, with Europe following early next year.

HPE said it is already accepting orders for the service, which will be available in colocation facilities starting with QScale in Canada.

HPE is providing a wide range of AI Large Language Model services including strategy, design, operations and management of AI workloads.

“HPE GreenLake for Large Language Models allows our customers to rapidly train, tune and deploy Large Language Models on demand using a multitenant instance of our supercomputing platform, truly a supercomputing cloud combined with our AI software,” said HPE Executive Vice President and General Manager of High Performance Computing Justin Hotard in an HPE webcast.

HPE GreenLake for Large Language Models’ multitenant model provides capability computing services that are not readily available in “existing data centers or in public cloud,” said Hotard.

The HPE public-cloud-based GreenLake for Large Language Models marks a return to the public cloud for HPE, which shuttered its Helion Public Cloud eight years ago. At that time, HPE moved to focus its resources on managed and virtual private cloud. It also puts HPE into the fast-growing public cloud AI market with Microsoft, Amazon Web Services and Google Cloud.

Until now supercomputers have not been available on demand in a consumption model, said Hotard. “What we are announcing today is that HPE is entering the AI cloud market,” he said.

The new service provides the ability for customers to upload their own text and image data to the public-cloud-based service, said Hotard. “Users can upload their own data, ensure it is protected, and train and tune their own customized model solely for their use,” he said.

The HPE public cloud Large Language Model offering includes German AI startup Aleph Alpha’s highly regarded Luminous natural language capability.

“Luminous was actually trained using HPE supercomputers and HPE’s AI software and has already been implemented by various organizations in health care, financial services and in the legal profession as a digital assistant,” said Hotard.

The Luminous offering supports English, French, German, Italian and Spanish.

HPE President and CEO Antonio Neri in a prepared statement called the new offering part of a “generational shift” that is every bit as transformational as the web, mobile and cloud. “HPE is making AI, once the domain of well-funded government labs and the global cloud giants, accessible to all by delivering a range of AI applications starting with Large Language Models that run on HPE’s proven sustainable supercomputers,” he said. “Now, organizations can embrace AI to drive innovation, disrupt markets, and achieve breakthroughs with an on-demand cloud service that trains, tunes and deploys models at scale and responsibly.”

HPE’s ‘Capability Computing’ Advantage

HPE is providing capability computing services that are not readily available in “existing data centers or in public cloud,” said Hotard.

HPE is also providing the “talent and expertise required to install, configure and manage these specialized computing resources,” he said.

HPE is also providing the security and expertise needed to build responsible AI models, along with an optimized software layer that supports strong scaling for capability workloads over a single cluster of nodes operating as a single computer, said Hotard.

“To effectively train these bigger and more accurate models we need supercomputing,” said Hotard. “Supercomputing provides massive performance in an optimized architecture that runs as a single computer. It is what we call capability computing.”

Capability computing allows single large-scale AI and high-performance compute jobs to run on hundreds or thousands of CPUs or GPUs at once, said Hotard.

“That is very different than general-purpose cloud offerings that run multiple jobs in parallel on a single instance,” said Hotard. “However, supercomputers today are costly and complex to adopt and manage. Supercomputers also require unique data center capabilities for power and cooling. Until now supercomputers have not been available on demand in a consumption model.”
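To make the distinction concrete, the snippet below is a minimal sketch, not HPE’s actual stack, of one training job spanning many GPUs and nodes using the open-source PyTorch DistributedDataParallel API launched with torchrun; the model, sizes and the torchrun arguments shown are illustrative placeholders.

# Minimal sketch (not HPE's stack): one training job spanning many GPUs/nodes
# with PyTorch DistributedDataParallel. Example launch (arguments illustrative):
#   torchrun --nnodes=128 --nproc_per_node=8 \
#            --rdzv_backend=c10d --rdzv_endpoint=<head-node>:29500 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")           # one process per GPU, env:// rendezvous
    local_rank = int(os.environ["LOCAL_RANK"])        # set by torchrun
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(4096, 4096).cuda(local_rank)  # stand-in for a real LLM
    model = DDP(model, device_ids=[local_rank])            # gradients sync across all ranks
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for _ in range(10):                                # every rank runs the same loop;
        x = torch.randn(32, 4096, device=local_rank)   # together they form one job
        loss = model(x).pow(2).mean()
        opt.zero_grad()
        loss.backward()                                # all-reduce across the whole cluster
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()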

A ‘Significant’ Cost Advantage Over Public Cloud Competitors

HPE Chief Product Officer of AI Evan Sparks told CRN that the company’s new supercomputer AI public cloud offering provides “significant” cost savings over public cloud competitors.

“When you are talking about monthlong jobs that use 1,000 GPUs and you save 20 percent, that savings goes directly to the customer, directly to their bottom line, and adds up to significant kinds of dollars,” said Sparks. “The other thing is reliability. When you are leasing [public] cloud GPUs, you pay for them whether they work and solve your problem or not. You might get a node that fails partway through and so on and you paid for the time it took to run your job up until that point, even if the end result is no good for you at this point.”

HPE is also not charging data egress fees, which can drive up AI public cloud bills. “We realize that complex multistage jobs are going to involve data transfers between our cloud and the public cloud and so on,” said Sparks. “So a supercomputing capability is one small very important part of a much bigger workflow. So we need to enable those workflows with as low friction as possible.”

HPE is providing a “very powerful alternative to the public cloud” for Large Language Models, said Sparks. “We think this is going to be foundational toward that next generation of computing that we are entering right now,” he said.

As to the specific cost savings HPE has seen in initial trials of its public cloud AI offering, Sparks said, “We have a program where we do a total cost of ownership approach with our customers in these kinds of settings and the savings are significant.”

A ‘Land Grab’ For AI Workloads

Sparks, the former CEO of Determined AI, a highly regarded machine learning AI company that was acquired by HPE two years ago, said the new HPE public cloud offering amounts to a “massive” market opportunity.

“This is exciting because we are intersecting that market as it is really becoming massive,” he said. “Don’t get me wrong. It is also going to be a highly competitive space going forward too. There’s going to be a bit of a land grab.”

Sparks said he has been “surprised” by how fast the Large Language Model AI market has taken off. “One of my first conversations with Justin Hotard after I joined HPE was around how big a deal I thought LLMs were going to be,” he said. “I think I underestimated it by a factor of 10 or 50. It has been really an incredible growth story over the last couple of years here.”

Sparks said HPE has the right strategy, people and intellectual property to win a “significant chunk” of the public cloud AI land grab.

“We have to be convinced that we have the right strategy, which I am convinced that we do, and that we have the right portfolio of people and intellectual property to really win a significant chunk of this market and I think we can,” he said.

Early Interest In HPE GreenLake for Large Language Models

HPE is already seeing a lot of customer and partner “interest” in the HPE GreenLake for Large Language Models offering, said Hotard.

“We have been doing a small development cloud testbed for a number of months now and we have gotten a lot of positive feedback on requirements,” said Hotard.

That said, Hotard conceded that there will be workloads where customers will deploy their own high-performance compute clusters on-premises.

“We acknowledge that and that is why we have such a robust offering of supercomputing systems, a complete stack that a customer can source from us today on an on-premises basis as well as a robust set of services,” he said.

HPE is offering the public-cloud-based AI offering in part to meet the needs of on-premises customers to burst into the public cloud, said Hotard.

“Part of why we are offering this is an extension of choice,” said Hotard. “We have customers that run on-premises and some of them are looking for bursting capability. We see this as complementary. Potentially, the cluster they have may have a certain configuration or size and this gives them opportunity to scale.”

The HPE GreenLake for Large Language Models offering is targeted at enterprise and public sector customers and at startups eager to accelerate their AI-based services, said Hotard.

“We see many companies that are looking to use AI to deliver a new experience in a segment or market,” said Hotard. “We believe that those customers will also be prospective customers for this service.”

HPE Has Yet To Determine Pricing For HPE GreenLake For Large Language Models

HPE has not yet determined pricing for HPE GreenLake for Large Language Models, said Hotard.

“We’ll study market benchmarks for pricing; there is a lot of data available,” said Hotard. “Pricing is going to be driven by sizing and use of the resources as a core driver.”

Pricing will also be driven by the value of the AI natural language capabilities HPE is bringing to market, including Aleph Alpha’s Luminous natural language capability, which is part of HPE GreenLake for Large Language Models, said Hotard.

In fact, Hotard said part of the pricing will be determined by how Aleph Alpha intends to price its offering as part of the solution. “Part of it is how they intend to price,” he said.

“We intend to have best-of-breed models so we’ll also allow partners to make their models available with their commercial terms,” said Hotard.

HPE is not intending to offer a credit card swipe public cloud payment method out of the gate but is also “not ruling it out” down the road, said Sparks.

HPE Brings Its AI Prowess To AI Large Language Models

HPE GreenLake for Large Language Models brings customers HPE’s AI software stack, which has powered large-scale AI supercomputing models for years, said Hotard.

“We have years of experience in scaling performance and tuning supercomputers to support any kind of compute and data-intensive workloads, and we are committed to making our supercomputers accessible for everyone to capitalize on AI,” said Hotard.

Neri has made AI-based supercomputing a mainstay of HPE’s strategy to drive data insight with its GreenLake edge-to-cloud on-premises service.

Among the game-changing AI acquisitions under Neri are the $1.4 billion acquisition of supercomputing pioneer Cray in 2019; the purchase in 2021 of Determined AI, which makes a software stack to train AI models; and the acquisition earlier this year of Pachyderm, which makes open-source software for large-scale AI applications.

HPE GreenLake for Large Language Models comes with HPE seeing a “significant uptick” in AI-based solutions, resulting in $800 million in “incremental” AI orders booked in the most recent quarter, said Neri on the most recent earnings conference call.

“Obviously what we are experiencing in AI is simply amazing,” said Neri after HPE reported an 18 percent increase in high-performance compute and AI sales to $840 million for its second fiscal quarter ended April 30. “It is breathtaking in some cases. I consider AI a massive inflection point, no different than Web 1.0 or mobile. The potential to disrupt every industry and advance many of the challenges we all face every day through data insight is just astonishing.”

HPE’s AI Software Stack Is Key To Generating ‘Trustworthy’ Models

Key to HPE GreenLake for Large Language Models is a proven, purpose-built HPE AI software stack that leverages the HPE Cray supercomputer, said Hotard.

The HPE Cray AI programming stack offers developers tools to create, debug and tune code to optimize AI applications, said Hotard. HPE is also making available open-source tools optimized for building and deploying models for unstructured data.

The HPE Machine Learning Development Environment, which is part of the offering, helps “rapidly train and scale models such as generative AI models with intelligent hyperparameter optimization among many other compelling features,” said Hotard.
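As an illustration of what intelligent hyperparameter optimization automates, the sketch below runs a bare-bones random search over a few training hyperparameters. It is generic Python, not the API of HPE’s Machine Learning Development Environment; the train_and_validate function and the search space are hypothetical placeholders, and real platforms add scheduling, early stopping and distributed trials.

# Illustrative only: a bare-bones random hyperparameter search.
import random

def train_and_validate(lr, batch_size, dropout):
    # Placeholder for a real training run; returns a validation score.
    return random.random()

search_space = {
    "lr": lambda: 10 ** random.uniform(-5, -2),
    "batch_size": lambda: random.choice([16, 32, 64, 128]),
    "dropout": lambda: random.uniform(0.0, 0.3),
}

best = None
for _ in range(20):                            # 20 independent trials
    params = {name: sample() for name, sample in search_space.items()}
    score = train_and_validate(**params)
    if best is None or score > best[0]:
        best = (score, params)

print("best validation score:", best[0])
print("best hyperparameters:", best[1])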

HPE Machine Learning Data Management, meanwhile, provides customers with the ability to integrate, track and audit data with “reproducible AI capabilities,” said Hotard.
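To make the reproducibility idea concrete, the sketch below content-hashes a dataset snapshot so a training run can later be audited against the exact data it used. It is a generic Python illustration, not the interface of HPE Machine Learning Data Management; the ./training_data directory and the output file name are hypothetical.

# Illustrative only: content-addressing a dataset snapshot for auditability.
import hashlib
import json
from pathlib import Path

def snapshot_dataset(data_dir: str) -> dict:
    """Hash every file in the dataset, then hash the manifest itself."""
    files = sorted(Path(data_dir).rglob("*"))
    manifest = {
        str(p.relative_to(data_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in files if p.is_file()
    }
    manifest_id = hashlib.sha256(
        json.dumps(manifest, sort_keys=True).encode()
    ).hexdigest()
    return {"dataset_version": manifest_id, "files": manifest}

# Record the snapshot alongside training metadata so the run can be reproduced and audited.
record = snapshot_dataset("./training_data")
Path("run_metadata.json").write_text(json.dumps(record, indent=2))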

“Having these capabilities helps generate safe, reliable data to ensure trustworthy and accurate models,” said Hotard.

HPE provides support for open-source and proprietary models, said Hotard.

“This integrated stack means that enterprises can bring their requirements and their data to quickly train and deploy Large Language Models using HPE GreenLake for LLM,” said Hotard.

GPT-4 Is Not On The Road Map

Support for OpenAI’s GPT-4 is not on the current road map for HPE GreenLake for Large Language Models, said Sparks.

“In terms of the set of models that are going to be available, we don’t currently have plans to offer [OpenAI’s] GPT-4,” said Sparks. “However, that could evolve as partnership discussions evolve. We are taking an approach where we are looking for best-of-breed models to serve various industries and sectors.”

HPE is already in “deep discussions” with companies in the pharmaceutical industry, said Sparks, who added that the partnerships will vary in each industry based on who the “best partners” are to work with.

An ‘Initial’ Direct Sales Model For HPE GreenLake For Large Language Models

HPE GreenLake for Large Language Models will initially be offered only as a direct sale by HPE, said Hotard.

“The initial service will be sold direct,” said Hotard. “Over time we’ll evaluate potential other routes to market.”

That said, Neri himself pledged on the first day of HPE Discover at the Partner Growth Summit that partners will be included in the edge-to-cloud platform powerhouse’s “bold” AI future.

“You will see some very bold moves we are making but also including you in that journey because I realize not everybody will be capable or in a position to be part of that AI journey,” said Neri in a Discover Partner Summit keynote interview with HPE Senior Vice President of Worldwide Partner Ecosystem Gilles Thiebaut.

One reason for the direct sale model out of the gate is HPE is focusing on “large leadership-class kind of jobs, so they are going to be bigger and they are going to involve a more involved kind of customized sales process,” said Sparks.

“It’s currently a direct offer based on the specific nature of the use cases that require a supercomputer—not just traditional HPC,” said Sparks. “That doesn’t mean we won’t consider including partners in this ecosystem with availability in the future. But it is a new area that we are entering and it is going to take some time for us to mature the ecosystem around that.”

HPE is already working with partners on a wide range of AI offerings including a GreenLake high-performance compute as-a-service offering and GreenLake for Machine Learning Development in a bundled hardware offering, said Sparks.

The HPE Ezmeral data fabric, which has been revamped as a more robust AI platform, is also available to partners. The new HPE Ezmeral platform, in fact, has dramatically increased HPE’s ability to onboard partners to build out AI solutions.

“We remain absolutely committed to our partner ecosystem. We have always been a channel-focused company,” said Sparks. “As we roll out these new capabilities we want to continue to empower these partners to leverage and build on this innovation.”

In fact, Sparks stressed that the public cloud AI offering is a “fraction” of the overall HPE AI portfolio. “We still have a lot of other offerings where partners can do extremely well by selling through the kind of more traditional channels,” he said. “Again, we’ll continue to explore how we can better enable the partner ecosystem as we move forward and this business matures.”

As to the overall partner AI opportunity, Sparks said HPE is “working hard” to enable the channel to build profitable AI businesses.

“Our partners have access to Tech Talks and in-person AI seminars and our Data Science Summits and our certification programs, all to help them really develop their skills in this environment,” he said. “This also includes foundational seller programs to help this community ramp up on their domain expertise because it is obviously one of the fastest-growing markets that any of us have ever seen.”

The First Of A Number Of New HPE AI Services

HPE GreenLake for Large Language Models is the first of a number of new HPE AI offerings coming in the future, including offerings for climate modeling, drug discovery, financial services, manufacturing and transportation, said Hotard.

“AI—especially generative AI—is increasingly becoming a mainstream topic,” said Hotard. “AI is at an inflection point and at HPE we are seeing demand from various customers beginning to leverage generative AI. We believe the technology will drive a foundational transformation in the IT market on the scale of Web 1.0.”

HPE also firmly believes that AI will be a “force for good,” said Hotard. “Empowering people with AI means we can increase productivity and advance humanity,” he said. “AI is already being applied to make discoveries. By further augmenting AI to train more data and build larger models, we can increase accuracy of predictions and solve problems faster to accelerate time to impact and time to value.”

Among the advancements likely to come from the use of AI, said Hotard, are speeding up drug development to identify “viable treatments quickly; clean energy breakthroughs with more renewables and increased carbon capture; better prediction and advance warning of catastrophic weather events; and improved financial fraud detection.”