Google Cloud Unveils 9 New Tools At First Data Cloud Summit

‘We fundamentally want to change…how companies are thinking of data from a technology-centric view to an ability-centric view,’ says Gerrit Kazmaier, Google Cloud’s new general manager and vice president for data analytics, databases and Looker.


Analytics Hub

Analytics Hub is a fully managed service built on BigQuery for efficiently and securely exchanging data and analytics assets across organizations. Slated to be available in preview in the third quarter, it lets data providers control and monitor how their data is being used and enables organizations to create and access a curated library of internal and external assets, including unique datasets from Google.

“It ushers in a new era for the world of data- and insight-sharing,” said Debanjan Saha, general manager and vice president of data analytics in Google Cloud. “This is not your grandfather’s data-sharing platform. Analytics Hub allows companies to publish, discover and subscribe to shared data assets. It allows publishers to create exchanges that combine unique Google datasets with commercial, industry and public datasets, as well as datasets that belong to these organizations themselves. Publishers will be able to queue their data exchanges internally and externally, and they’ll be able to view aggregated usage metrics on how popular their exchanges are, and that’s very, very important.”

While the cloud makes data sharing across organizations much more viable, doing so at scale can still be very challenging, according to Saha. Traditional data sharing requires extracting data from databases, storing it in flat files and transmitting those files to consumers, who then ingest them into another database. The result is multiple copies of the data and unnecessary cost, especially when a customer’s datasets run to multiple petabytes, he said.

“Also, traditional data-sharing techniques bypass data governance processes,” Saha said. “As providers of data, how do you know...how (that) data actually (is) being used and who has access to it? If you want to monetize your data, how do you know how to manage subscriptions and entitlement? Altogether, these challenges mean that organizations are unable to realize the true potential to transform the business with data sharing.”
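To make the flat-file pattern Saha describes concrete, here is a minimal sketch of that workflow using the google-cloud-bigquery Python client. The project, dataset, table and bucket names are illustrative assumptions, not resources named in the article; the point is that the consumer ends up maintaining a second physical copy of the provider’s data, and the export-and-load cycle has to be repeated every time the source changes.

```python
# Minimal sketch of the traditional flat-file sharing pattern: export on the
# provider side, transmit via Cloud Storage, re-load on the consumer side.
# Project, dataset, table and bucket names are illustrative placeholders.
from google.cloud import bigquery

# --- Provider side: extract the table to CSV files in Cloud Storage ---
provider = bigquery.Client(project="provider-project")
extract_job = provider.extract_table(
    "provider-project.sales.transactions",             # source table
    "gs://provider-export-bucket/transactions-*.csv",  # sharded CSV export
)
extract_job.result()  # wait for the export to finish

# --- Consumer side: load the transmitted files into a separate warehouse ---
consumer = bigquery.Client(project="consumer-project")
load_job = consumer.load_table_from_uri(
    "gs://provider-export-bucket/transactions-*.csv",
    "consumer-project.analytics.transactions_copy",    # a second, duplicate copy
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,        # infer the schema from the files
        skip_leading_rows=1,
    ),
)
load_job.result()
```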

BigQuery, Google Cloud’s fully managed, serverless and multi-cloud data warehouse, has had cross-organizational, in-place data-sharing capabilities since its inception in 2011.
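In-place sharing avoids that duplication: once a provider grants a consumer access to a dataset, the consumer queries the provider’s tables directly and only the query results move. A minimal sketch with the same Python client, again using made-up project, dataset and table names:

```python
# Minimal sketch of in-place, cross-organization sharing in BigQuery: the
# consumer queries the provider's table directly, with no export or copy.
# The project, dataset and table names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="consumer-project")

query = """
    SELECT region, SUM(amount) AS total
    FROM `provider-project.sales.transactions`  -- shared dataset, queried in place
    GROUP BY region
    ORDER BY total DESC
"""
for row in client.query(query).result():
    print(row["region"], row["total"])
```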

“Over the last seven days, 3,000 different organizations shared more than 200 petabytes of data using BigQuery,” Saha said. “Data sharing in BigQuery is already very, very popular, and we wanted to make it easier and even more scalable, and that’s why we built Analytics Hub.”

Google Cloud is not calling Analytics Hub a data exchange by design, according to Saha.

“Our vision is bigger than just sharing data,” he said, citing BigQuery ML and its built-in machine learning capabilities that are used by more than 80 percent of Google Cloud’s top 100 customers. “This opens it up for more opportunities, as more organizations can not only share data, but also can share machine learning models, rich, dynamic dashboards, etc.”
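To illustrate what sharing more than raw data can look like, here is a minimal sketch of BigQuery ML’s SQL-based workflow run through the Python client: a model is trained and queried inside the warehouse, so it can live alongside shared datasets. The dataset, table, column and model names are assumptions for illustration only.

```python
# Minimal sketch of BigQuery ML: train a model with SQL, then score rows with
# ML.PREDICT, all inside the warehouse. Dataset, table, column and model names
# are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="consumer-project")

# Train a logistic-regression model on a (hypothetical) shared table.
client.query("""
    CREATE OR REPLACE MODEL `consumer-project.analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `provider-project.sales.customers`
""").result()

# Score rows with the trained model.
rows = client.query("""
    SELECT customer_id, predicted_churned
    FROM ML.PREDICT(
        MODEL `consumer-project.analytics.churn_model`,
        (SELECT customer_id, tenure_months, monthly_spend, support_tickets
         FROM `provider-project.sales.customers`))
""").result()
for row in rows:
    print(row["customer_id"], row["predicted_churned"])
```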

 
 