AWS CEO Talks New AI Offerings With Nvidia, Anthropic And Partners

‘AWS is heavily investing in all areas of AI for customer innovation,’ says AWS CEO Adam Selipsky. Here are five big new AI offerings AWS launched recently, including with Anthropic, Nvidia and a new generative AI competency for channel partners.

AWS CEO: We’re ‘Heavily Investing In All Areas Of AI’

It’s been quite a month for the world’s largest cloud company as AWS launched a slew of new AI products and offerings with the likes of Nvidia and Anthropic.

On the partner front, AWS recently launched a new generative AI competency for AWS partners aimed at capturing more GenAI market opportunities, while also forming a partnership with channel superstar Accenture to better drive AI adoption.

“AWS is heavily investing in all areas of AI for customer innovation,” said AWS CEO Adam Selipsky recently on LinkedIn. “That includes Amazon Q, our AI powered digital assistant, Amazon Bedrock, our fully managed service for accessing the latest foundation models, and our own custom designed chips.”

[Related: Here’s The Annual Public Cloud And SaaS Bill In 2024: Report]

The Seattle-based cloud giant generated $24.2 billion in revenue during Q4 2023, meaning AWS now has an annual run rate of $97 billion.

CRN breaks down the five biggest artificial intelligence and generative AI initiatives from AWS this month that partners, investors and customers need to know about.

Anthropic’s New GenAI Claude 3 Sonnet And Haiku Models Now On Amazon Bedrock

AWS CEO Selipsky recently said his company now has more than 10,000 active customers on Amazon Bedrock.

Bedrock is AWS’ flagship generative AI platform that lets customers build and scale GenAI applications with Amazon and third-party foundation models such as Anthropic Claude.

“Anthropic’s new Claude 3 Haiku model is now available on Amazon Bedrock!” said Selipsky this week. “Haiku is designed to be the fastest and most cost-effective model on the market for its intelligence category, answering queries with lightning-fast speed. Haiku also has image-to-text vision capabilities like other new Claude 3 models, meaning they can analyze images, documents, charts and other data formats.”

In addition to Anthropic’s new Haiku AI model on Bedrock, AWS also said Anthropic Sonnet is now available on Bedrock.

“Available today on Amazon Bedrock: Anthropic's new, high-performing Claude 3 foundation model, Sonnet!” said Selipsky. “We’re building on our deep collaboration with Anthropic through these additions, and we know customers will love the next-gen models which demonstrate impressively advanced intelligence.”
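For developers, the new Claude 3 models are reached through Bedrock’s standard InvokeModel API. Below is a minimal sketch in Python using boto3, assuming AWS credentials and Bedrock model access are already configured; the model IDs shown were the identifiers at launch and should be verified in the Bedrock console:

```python
import json

# Claude 3 model IDs at launch (verify in the Bedrock console; IDs can change):
HAIKU_ID = "anthropic.claude-3-haiku-20240307-v1:0"
SONNET_ID = "anthropic.claude-3-sonnet-20240229-v1:0"


def build_claude3_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build the Anthropic Messages API payload that Bedrock expects for Claude 3."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
    }


def invoke_claude3(prompt: str, model_id: str = HAIKU_ID) -> str:
    """Call Bedrock and return the model's text reply.

    Requires AWS credentials and Bedrock model access to be enabled;
    region is an example placeholder.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId=model_id,
        body=json.dumps(build_claude3_request(prompt)),
    )
    result = json.loads(response["body"].read())
    return result["content"][0]["text"]
```

The same request shape works for Haiku and Sonnet; only the `modelId` changes, which is what makes Bedrock’s model-choice pitch practical for customers switching between cost and capability tiers.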

Read on for the four other AWS AI announcements in March 2024 you should know about.

AWS And Nvidia Co-Innovating On AI With New Offerings

AWS’ goal is to double down on its co-innovation with Nvidia “as we continue making AWS the best place to run Nvidia GPUs in the cloud,” Selipsky said.

One of Nvidia’s biggest launches at its GPU Technology Conference this week was its new Blackwell GPU platform, which the company claims will enable up to 30 times greater inference performance while consuming 25 times less energy for massive AI models.

Nvidia’s Blackwell platform is coming to the AWS cloud. AWS said it will provide the Nvidia GB200 Grace Blackwell Superchip and B100 Tensor Core GPUs on its cloud platform.

“Not only will we offer Nvidia’s new Grace Blackwell GPU platform running seamlessly on AWS, we’re integrating it with our industry-leading networking, virtualization, and security capabilities—making it possible for customers to build and run multi-trillion parameter large language models faster, at massive scale, and more securely than anywhere else,” said Selipsky.

Additionally, AWS will soon offer new Nvidia Grace Blackwell GPU-based Amazon EC2 instances and Nvidia DGX Cloud to accelerate performance of building and running inference on multi-trillion parameter LLMs.

“Nvidia and AWS engineers are also joining forces to co-develop an AI supercomputer built exclusively on AWS for Nvidia’s own AI R&D. It will feature 20K+ GB200 Superchips capable of processing a massive 414 exaflops,” Selipsky said.

New AWS Generative AI Partner Competency

On the AWS partner front, the cloud giant recently introduced its new AWS Generative AI Competency designed to feature partners who have technical proficiency and a track record of customer success when implementing GenAI.

“Leveraging technologies such as Amazon Bedrock, Amazon Q, and Amazon SageMaker JumpStart, these partners have deep expertise building and deploying groundbreaking applications across industries,” Selipsky said. “Our AWS Generative AI Competency Partners will make it easier than ever for customers to innovate with enterprise-grade security and privacy, choice of foundation models, generative AI applications, and a high-performance, low-cost infrastructure.”

The AWS Generative AI Competency will include AWS Partner Network (APN) Technology Partners and APN Services Partners.

Software path partners will have to show their proficiency in one of two areas: generative AI applications, foundation models and application development; or infrastructure and data.

Services path partners will need to show proficiency in end-to-end generative AI consulting.

AWS, Accenture And Anthropic Join Forces To Drive Enterprise GenAI Adoption

Channel partner powerhouse Accenture is partnering with AWS and Anthropic to help customers in highly regulated industries adopt and scale generative AI technology.

The collaboration gives organizations access to Anthropic’s AI models via Amazon Bedrock, along with services and accelerators from Accenture, including Accenture’s new generative AI switchboard, to customize and fine-tune Anthropic’s Claude models for their specific needs.

Teams across Accenture, Anthropic and AWS will also help clients with prompt and platform engineering, and provide best practices on deploying customized models on Bedrock or Amazon SageMaker.

The move looks to enhance the capabilities of AWS and Claude for industry-specific applications in areas such as knowledge management, forecasting and analysis, and regulatory document generation.

“Our collaboration with AWS and Anthropic brings everything together for clients to move faster from experimentation to value realization,” said Karthik Narain, group chief executive of technology at Accenture in a recent statement. “We see the biggest potential for our model customization service in highly regulated industries, where AI models need to be tailored to adhere to specific compliance, accuracy and safety requirements.”

Additionally, more than 1,400 Accenture engineers will be trained to be specialists in using Anthropic’s models on AWS, allowing them to provide end-to-end support for clients deploying generative AI applications.

Amazon SageMaker Integration With Nvidia NIM

Another big new AI offering this week with Nvidia: the integration of Amazon SageMaker with Nvidia’s NIM inference microservices.

The integration looks to drive the development of generative AI applications and new use cases. Customers can leverage the new offering to quickly deploy foundation models (FMs) that are pre-compiled and optimized to run on Nvidia GPUs to SageMaker, reducing the time-to-market for generative AI applications.
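The deployment flow the integration targets can be sketched with the SageMaker Python SDK. Everything below is an illustration under stated assumptions: the NIM container image URI, model name and instance type are placeholders rather than values from the announcement, and actually deploying requires AWS credentials and an IAM execution role:

```python
# Placeholder: a NIM container image would come from Nvidia's catalog,
# not from this announcement.
NIM_IMAGE_URI = "<your-nim-container-image-uri>"


def nim_endpoint_config(model_name: str,
                        instance_type: str = "ml.g5.2xlarge") -> dict:
    """Assemble the settings we would hand to the SageMaker SDK.

    The instance type is an example GPU instance, not a requirement.
    """
    return {
        "image_uri": NIM_IMAGE_URI,
        "model_name": model_name,
        "instance_type": instance_type,
        "initial_instance_count": 1,
    }


def deploy_nim_model(cfg: dict, role_arn: str):
    """Deploy the pre-optimized container to a SageMaker real-time endpoint.

    Requires the `sagemaker` package, AWS credentials and an IAM role.
    """
    from sagemaker.model import Model  # pip install sagemaker

    model = Model(
        image_uri=cfg["image_uri"],
        role=role_arn,
        name=cfg["model_name"],
    )
    return model.deploy(
        initial_instance_count=cfg["initial_instance_count"],
        instance_type=cfg["instance_type"],
    )
```

The appeal of the integration is that the container already holds a pre-compiled, GPU-optimized model, so the deploy step above is the bulk of the work rather than the start of a tuning project.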

“AWS is leading the way in helping customers build and scale generative AI applications with easy access to the latest and best foundation models, and choice of chips for their workloads,” said Selipsky.

The goal of Amazon SageMaker integration with Nvidia NIM microservices is to help businesses further optimize price performance of foundation models running on GPUs.