Databricks And AWS Extend Alliance With Pay-As-You-Go Data Lakehouse Option

With the new pay-as-you-go offering, partners and customers can acquire Databricks software and stand up, configure and manage a data lakehouse through their AWS accounts using the AWS Console.

Data lakehouse technology developer Databricks is taking its alliance with cloud giant Amazon Web Services to the next level, unveiling Tuesday a new pay-as-you-go offering on the AWS Marketplace that makes it possible to more quickly build and launch a data lakehouse on the AWS cloud.

The new option offers a faster way for solution providers and customers to set up a data lakehouse system within their existing AWS accounts, deploy and configure the lakehouse using the AWS Console, and handle administration and billing through AWS account management.

The move will provide partners and customers with more flexibility to try out the capabilities of the Databricks Lakehouse Platform and develop proof-of-concept use cases without having to commit to long-term contracts, AWS and Databricks executives said.

[Related: AWS ‘Doubling Down On Working With Partners’: Ruba Borno ]

AWS and Databricks “are both big believers in the data lakehouse. And central to that is we both have a big belief that data and AI have been central to customers’ success in the cloud,” said Joel Minnick, Databricks vice president of marketing, in an interview with CRN.

“We’ve been looking for ways that we can continue to make it easier for our customers to get that value out of their data, really start finding much faster time-to-value with how they are putting their data to work, how they are putting their AI to work,” Minnick said.

AWS and Databricks have a long-standing partnership, and Databricks’ software was previously available through the AWS Marketplace. But customers had to purchase Databricks under a set contract, and deployment required “a lot of hands-on setup, a lot of configuration,” Minnick said.

The new offering “streamlines the customer experience,” said Mona Chadha, director of category management at AWS, in a briefing with CRN.

Businesses and organizations can acquire a Databricks subscription using their existing AWS contracts, credentials, accounts and privileges, providing faster customer onboarding and consolidated account administration. It also provides seamless integration between customers’ existing AWS configuration and security and Databricks.

In addition to the ability to launch a Databricks-based data lakehouse within existing AWS accounts, the pay-as-you-go plan creates more flexible price options, allowing customers – and the systems integrators, resellers and managed service providers they work with – to experiment with test cases and proof-of-concept solutions before committing to longer-term contracts, Chadha said.

Customers also can launch a 14-day free trial of Databricks from the AWS console.

“It’s really allowing channel partners to provide more of a holistic solution,” Chadha said. “It also opens the door to more professional service offerings.”

The new pay-as-you-go offering follows Databricks’ April announcement that it is supporting AWS’ Graviton2-based EC2 instances, which the company said can deliver up to 3x to 4x improved price-performance for data lakehouse workloads running on AWS.

Databricks also said it is investing in joint programming and go-to-market functions that align with AWS within key vertical industries. That dovetails with Databricks’ recent focus on providing data lakehouse packages for specific verticals such as financial services and retail. It also follows the company’s recent launch of the Brickbuilder Solutions initiative to help solution provider and systems integrator partners develop industry-specific data and AI solutions for the Databricks Lakehouse Platform.

“At the end of the day this is all about the data and AI,” Minnick said. “It’s about rapidly increasing the time-to-value that customers on AWS can get from a data lakehouse.”