Google Leverages Its Consumer Tech To Bulk Up Enterprise Cloud Services

Google shined a spotlight on the unique technical attributes of its data center infrastructure on Thursday as a selling point for its enterprise cloud.

The internet giant is leveraging the resources built out for its massive consumer base to deliver unprecedented reliability and security to business customers, Urs Holzle, Google's senior vice president of technical infrastructure, said in a keynote at the Google Cloud Next 2017 conference.

Holzle and other Google execs introduced several new cloud features that extend capabilities once exclusive to Google services to GCP customers, including a highly scalable database synced by atomic clocks.

[Related: Google Cloud Chief Makes The Case That The Consumer Search Giant Is Also An Enterprise Powerhouse]


Google's been "operating a hyperscale cloud for a very long time," Holzle said.

The Mountain View, Calif.-based company will add three GCP regions by next year, he announced – in the Netherlands, Canada and California. Those facilities, and several coming online in the next few months, will bring Google to 17 total cloud regions around the globe.

Those data centers "are designed from first principles," Holzle said, and funded to the tune of $29.4 billion in capital expenditures over the last three years.

Nine years ago, Google became the first company that wasn't a telecom to build an undersea cable. Since that connection from the U.S. to Japan debuted, Google has built out more submarine fiber capacity for a highly redundant backbone that delivers minimum latency and in most cases hands network traffic directly to a local ISP.

"As a GCP customer, you benefit from this," Holzle said.

A new product that capitalizes on those attributes is Cloud Spanner, a globally distributed database service which offers the consistency of SQL with the horizontal scaling capacity of NoSQL, said Greg DeMichillie, director of product management for GCP.

It's the same database that powers Google's AdWords and other services. "We've been running our largest production systems on this for years," DeMichillie said.

Atomic clocks installed at every Google data center serve up the highly accurate timestamps that make the service possible, he said.
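The idea behind those atomic-clock timestamps can be sketched in a few lines. This is a simplified, illustrative stand-in for Spanner's TrueTime commit-wait technique, not Google's implementation: the clock reports an uncertainty interval rather than a single instant, and a commit becomes visible only after the interval's upper bound has passed, so later transactions are guaranteed strictly greater timestamps. The 2 ms uncertainty bound here is an assumed placeholder.

```python
# Simplified sketch of the commit-wait idea behind Spanner's TrueTime
# (illustrative only, not Google's implementation). A clock with a known
# uncertainty bound returns an interval [earliest, latest]; a commit is
# made externally visible only after 'latest' is in the past, so any
# subsequent transaction sees a strictly greater timestamp.
import time

CLOCK_UNCERTAINTY = 0.002  # assumed 2 ms bound from the atomic clocks

def truetime_now():
    """Return (earliest, latest) bounds on the true current time."""
    t = time.time()
    return t - CLOCK_UNCERTAINTY, t + CLOCK_UNCERTAINTY

def commit(record):
    """Assign a timestamp, then wait out the uncertainty before returning."""
    _, latest = truetime_now()
    record["commit_ts"] = latest
    # Commit wait: block until 'latest' is definitely in the past.
    while time.time() < latest:
        time.sleep(CLOCK_UNCERTAINTY / 10)
    return record

a = commit({"row": 1})
b = commit({"row": 2})
assert a["commit_ts"] < b["commit_ts"]  # external consistency holds
```

The tighter the clock uncertainty, the shorter the commit wait, which is why highly accurate clocks make the service practical.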

Holzle shared other ways that Google is beefing up its public cloud, spanning compute, pricing models, security features and developer services.

On the compute side, GCP virtual machines can now be provisioned with 64 cores.

"Later this year you'll see even higher core counts, and memory sizes of a terabyte or more," Holzle said.

Google is also the first cloud to offer Intel's next-generation Skylake processors through a strategic alliance with the chipmaker.

Raejeanne Skillern, vice president of Intel's data center group, said Google had optimized six generations of Intel's Xeon processors for its unique environment.

The companies are collaborating on hybrid cloud orchestration, security and machine learning, she said after joining Holzle on stage. Intel is also investing in open source technologies first developed at Google like TensorFlow and Kubernetes and jointly validating custom SKUs for Google's data centers.

To help users minimize costs, Holzle introduced a new Committed Use discount. That pricing schema gives customers the option of entering one- or three-year contracts.

"Some cloud providers force you to pay up front for three years to get the best price. But how is that better than just buying a new server?" he asked, an implicit reference to Amazon Web Services' Reserved Instances.

But Holzle made the case that Google's Committed Use discounts don't involve long-term commitments that strand resources, because they allow customers the flexibility to change machine types and sizes.

Simon Margolis, director of cloud platform at Google partner SADA Systems, said customers have been asking for that kind of a pricing option for a long time.

"This really incentivizes customers to commit to Google's platform for innovation, leveraging Google's technology for the enterprise and being rewarded for their loyalty," Margolis told CRN.

There is an important difference between Amazon's Reserved Instances and what Google is doing, Margolis said.

"You're committing financially, but not where infrastructure is concerned, and that's where you need to stay flexible to stay relevant," he said.
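The economics of a committed-use discount come down to simple arithmetic. The sketch below uses entirely hypothetical rates and discount percentages (not Google's actual pricing) to show the shape of the comparison:

```python
# Back-of-the-envelope comparison of on-demand vs. committed-use pricing.
# All rates below are hypothetical placeholders, not Google's actual prices.
ON_DEMAND_HOURLY = 0.10    # assumed on-demand rate, $/vCPU-hour
COMMITTED_DISCOUNT = 0.45  # assumed discount for a multi-year commitment
HOURS_PER_YEAR = 24 * 365

def annual_cost(vcpus, committed=False):
    """Yearly spend for a steadily running workload of `vcpus` cores."""
    rate = ON_DEMAND_HOURLY * ((1 - COMMITTED_DISCOUNT) if committed else 1)
    return vcpus * rate * HOURS_PER_YEAR

on_demand = annual_cost(16)
committed = annual_cost(16, committed=True)
print(f"on-demand: ${on_demand:,.0f}/yr, committed: ${committed:,.0f}/yr")
```

The pitch Holzle and Margolis describe is that the committed rate applies to a spending level, not to specific machines, so the workload behind that spend can change shape over the contract term.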

Google Cloud delivers "the same security infrastructure that Google uses," Holzle said in shifting the topic to security.

That starts with hundreds of guards at the data centers. Then there's the Titan chip that's installed on all new Google machines to help validate services and prevent tampering down to the level of the BIOS.

Data is also encrypted in storage, and when sent over networks.

But a system is only as secure as its user accounts, and phishing is the top security problem enterprises face.

For that reason, Google is releasing several new security features for GCP and its G Suite application portfolio.

Data Loss Prevention is a new API that automatically detects and redacts sensitive information in text and images. It can minimize compliance headaches by ensuring companies don't retain information they'd rather not hold, like Social Security numbers or credit card numbers.
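To make the detect-and-redact pattern concrete, here is a minimal regex-based stand-in. It is illustrative only; the real Data Loss Prevention API uses far more sophisticated detectors and also handles images:

```python
# Illustrative sketch of the kind of redaction a DLP service performs.
# A regex-based stand-in, not the actual Data Loss Prevention API.
import re

PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def redact(text):
    """Replace each detected sensitive value with its info-type label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("SSN 123-45-6789, card 4111 1111 1111 1111"))
# → SSN [SSN], card [CREDIT_CARD]
```

A managed service earns its keep over a sketch like this by covering many info types, locales, and file formats without each customer maintaining its own pattern library.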

Google's Cloud Key Management services are also generally available now, he said, as are identity management and security key enforcement features.

Brian Stevens, vice president of GCP, shifted gears to investments Google is making in supporting Microsoft products.

Google wants "GCP to be as great for Windows developers as Linux and open source developers," Stevens said. "It's really important to meet developers, and Windows developers, where they are."

Google made SQL Server Enterprise generally available on GCP, and announced a partner program involving Windows specialists with GCP expertise.

Another area of focus, one that's becoming increasingly popular among cloud users, is serverless computing, a design approach Google has long employed across many of its services.

Google released in beta a feature called Cloud Functions, a platform for building event-based microservices that can be used to personalize cloud services.

"Functions are just fragments of code," Stevens said, and "what developers do with them is connect services together."

Vladimir Stoyak, principal consultant for Google partner Pythian's big data practice, told CRN that Cloud Functions would "greatly simplify processes involved through data ingestion, transformation, and data load pipelines."
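The "fragments of code that connect services" pattern Stevens describes can be sketched generically. This is not the actual Cloud Functions API (real functions are deployed to GCP and triggered by events such as storage uploads or Pub/Sub messages); the handler name, event fields, and topic below are hypothetical:

```python
# Generic sketch of the event-driven pattern Cloud Functions supports.
# Illustrative only; names and event shape are hypothetical, not GCP's API.
def on_file_uploaded(event):
    """Hypothetical handler: fires when a file lands in a storage bucket,
    transforms it, and names the next service in the pipeline."""
    name = event["name"]
    return {"processed": name.upper(), "forward_to": "analytics-topic"}

result = on_file_uploaded({"name": "sales.csv"})
print(result)
```

Chaining small handlers like this is how the ingestion, transformation, and load pipelines Stoyak mentions get assembled without managing any servers.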

Google will also be making its App Engine development platform more versatile. App Engine Flexible Environment allows developers to bring their own runtime or framework to the platform—as long as it can run in a Docker container—complementing the seven languages supported out of the box.

On the data services side, a new BigQuery data transfer service integrates with Google's cloud-based SaaS applications for marketers, like AdWords, DoubleClick Campaign Manager and YouTube Analytics.

Another new feature called Cloud Dataprep allows users to explore and clean data before pulling it into the BigQuery data warehouse for analysis.