Information Blizzard: Snowflake Computing Adds AWS-Based Data Loading Service To Its Cloud Data Warehouse

Snowflake Computing is making it easier for organizations to move data into its cloud data warehouse system with the launch of its new Snowpipe automated data loading service.

Snowflake developed Snowpipe in response to growing demand for technology that can take structured and semi-structured data streaming from online and operational applications and move it into systems like Snowflake for near-real-time analysis.

"What we've been hearing from our customers is that continuous data ingestion is key," said Matt Glickman, Snowflake's product vice president, in an interview with CRN. "More data is coming in in a continuous form."

[Related: The 10 Coolest Big Data Products Of 2017 (So Far)]

Snowflake Computing, based in San Mateo, Calif., provides a SQL-based data warehouse system running on Amazon Web Services.

The new Snowpipe service addresses the fact that more businesses are storing huge volumes of data in AWS Simple Storage Service (S3) – a trend accelerated by the rapidly falling cost of cloud storage – and are looking for ways to quickly tap that data to support decision-making.

Traditional data warehouse systems rely on batch-loading technology and processes to move data into a data warehouse. But the latency involved is proving to be too slow for today's fast pace of business.

Businesses and organizations are increasingly using S3 for data staging or "data lake" storage, Glickman said. And more of that data is machine-generated from operational systems and Internet of Things networks.

Data warehouse usage is also evolving from analysis of historical data to supporting customer-facing applications that require near-real-time responses and even predictive analysis.

The automated Snowpipe service "listens" asynchronously for data arriving in S3 and loads it into the Snowflake data warehouse. Snowpipe is built on AWS Kinesis, Amazon's real-time stream-processing "data pipeline-as-a-service," along with Amazon Elastic Compute Cloud and other AWS services.
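In practice, that pattern maps onto Snowflake's SQL surface: a "pipe" object wraps a COPY INTO statement, and with auto-ingest enabled the service picks up each new file as it lands in S3 rather than waiting for a batch job. Below is a minimal sketch using the Snowflake Python connector; the account, credentials, bucket, stage, table and pipe names are all hypothetical, and the options available during the preview may differ from what ships at general availability.

```python
# Minimal sketch of a Snowpipe-style continuous-load setup via the
# Snowflake Python connector (pip install snowflake-connector-python).
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ANALYST",
    password="***",
    account="myorg-myaccount",
)
cur = conn.cursor()

# An external stage pointing at the S3 bucket where new files land.
cur.execute("""
    CREATE STAGE IF NOT EXISTS events_stage
      URL = 's3://my-bucket/events/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
""")

# The pipe wraps a COPY INTO statement; with AUTO_INGEST enabled,
# Snowpipe reacts to S3 event notifications and loads each new file
# into the target table as it arrives, rather than in scheduled batches.
cur.execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events
      FROM @events_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")

conn.close()
```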

"We're further embracing S3 as a core cloud foundation that all Amazon customers are using," Glickman said.

Snowflake Computing works with a number of systems integration and solution provider partners including Slalom, Cognizant, Wipro, Intricity, DecisiveData, Cervello and Useready.

Partners that provide data integration and ETL (extract, transform and load) services can use Snowpipe to manage their data streams and transformations without staging data, Glickman said. Systems integrators can use it to lower data integration costs and focus more on higher-level services for clients.

Snowflake plans to charge for Snowpipe based on the per-second compute resources used to load data.

Snowpipe is currently in a public preview stage, allowing customers to try out the system, and will be generally available in early 2018.