Get On Board: Docker's Channel Maturity Unlocks The Container Tech Opportunity

Nebulaworks CEO Chris Ciborowski knew from the moment he saw Docker’s application container engine three years ago that it would forever change enterprise IT economics.

Others thought the phenomenon would prove a flash in the pan, or that the technology would only be relevant to a subset of developers looking to shuttle software between computing environments. But Ciborowski was certain Docker would change the way enterprises consume IT in as explosive a manner as VMware had done more than a decade earlier with its server virtualization technology.

For starters, Docker offered customers the promise of reducing, or altogether avoiding, the so-called VMware tax—licensing fees that had become a staple for enterprises implementing virtualization to get a handle on server sprawl.

Many of those enterprises were looking to contain their virtual machine sprawl, and using the open-source, Linux-based technology to displace VMs could yield hundreds of thousands, if not millions, of dollars in savings per year, depending on the scale of the deployment.

’We felt [containers] were absolutely game-changing in the way people were going to be consuming technology,’ said Ciborowski. And Docker ’was going to be the de facto standard.’

The big bet on Docker is paying off handsomely for the Irvine, Calif., company. Nebulaworks’ business doubled this year, and Ciborowski expects growth to accelerate with broader market adoption.

It’s the kind of success that’s inviting comparisons to the advent of VMware’s pioneering technology, which fostered thriving new practices while driving others out of business.

In a recent engagement for a manufacturer, Nebulaworks leveraged application containers to reduce by a factor of 11 the number of VMs run on the customer’s public cloud provider (though some VM instances were larger), Ciborowski said.

Containers enable those order-of-magnitude efficiency gains, along with dramatic reductions in cloud bills and virtualization licensing fees, by making it safe and secure to squeeze several production applications onto a single instance of an OS running on bare metal or a virtual machine, rather than giving each application its own VM with its own OS.

A popular analogy is to think of VMs as houses, each with its own self-contained plumbing and electricity, and containers as apartments that use the shared infrastructure of the entire building.
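To make the apartment analogy concrete, the short sketch below uses the Docker SDK for Python (the open-source docker package) to start a few isolated containers on one host. It is an illustration only: the image and container names are arbitrary, and it assumes a local Docker Engine is running.

```python
# Minimal sketch: several isolated "apartments" (containers) sharing one host OS.
# Assumes a local Docker Engine and the Docker SDK for Python (pip install docker).
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Launch three isolated web servers that all share the host's kernel.
# The "nginx:alpine" image and the container names are illustrative choices.
apps = [
    client.containers.run("nginx:alpine", detach=True, name=f"demo-web-{i}")
    for i in range(3)
]

# Each container is isolated, yet none carries its own guest OS the way a VM would.
for c in client.containers.list():
    print(c.name, c.status, c.image.tags)

# Clean up the demo containers.
for c in apps:
    c.stop()
    c.remove()
```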

ThoughtWorks, a solution provider with more than two decades in business and a global practice, also rushed to gain Docker proficiency upon encountering the technology.

’I don’t think people can sit on their hands right now, given the amount of mainstream industry attention on containers,’ said Mike Mason, a technology activist in the Office of the CTO of the Chicago-based company.

ThoughtWorks is fielding almost three times more demand for Docker container implementations than it can actually service, and is working to scale its practice accordingly, he said. ’We know we need to have the expertise. All signs point to it being hugely important, and even if it does get superseded, it will be here for five-plus years,’ Mason told CRN.

The CEO of one VMware service provider, who did not want to be identified, told CRN Docker proficiency may well be the hottest skill set on the planet.

’If you are a solution provider with Docker expertise, you can reduce the licensing cost for an enterprise by three or four times or more,’ he said. ’VMware created the problem with their VMware licensing tax. If a partner can allow a company to pay for one VM rather than five VMs by using a container, that is a big savings. Setting up these containers requires a good amount of expertise and value that the partner is providing.’

Forward-looking solution providers were drawn to Docker’s open-source container engine for its ability to deliver the holy grail of enterprise computing, application isolation, with unprecedented benefits in resource utilization, portability, deployment speed and fault tolerance. But in those early days, they were mostly on their own in figuring out how to monetize the technology.

But the startup’s channel maturity is now catching up with its eponymous product’s market penetration. In September, Docker introduced a two-tier channel program, formalizing its relationship with many of those early partners around its flagship commercial product.

’Our sales motion is heavily channel-led. We want to be channel-driven,’ Docker CEO Ben Golub told CRN. ’Beyond the potential for building, integrating and reselling, Docker tends to enable a great conversation and opportunities for partners around change management, infrastructure management, hybrid cloud.’

Shannon Williams, co-founder of Cupertino, Calif.-based Rancher Labs, which develops an application container management platform and also recently launched a channel program, told CRN he’s never actually tried to pitch a solution provider on the technology.

’The smart ones are banging on our door. They’re discovering us and coming to us, and we’re reactively coming up with plans and programs,’ Williams said. ’If you’re a VAR and not looking at this stuff, I think you’re really risking a lot.’

Major Players Making Investments

Docker 1.0, the startup’s first enterprise-grade release, had barely hit the market when just about every cloud operator—including Microsoft Azure, Amazon Web Services, IBM SoftLayer and Google Cloud Platform—started introducing integrations and services around the technology. ’If you look at the investments that major players in the cloud space are making, it is clear that container tech is very important,’ ThoughtWorks’ Mason said.

Legacy IT providers are moving quickly to embrace the container phenomenon.

Hewlett Packard Enterprise now ships all of its servers bundled with Docker Engine and support. Microsoft’s Windows Server 2016 offers native Docker support. And Cisco Systems earlier this year acquired ContainerX, a container upstart, to help customers take advantage of the technology.

HPE CEO Meg Whitman, in an interview with CRN earlier this year, said the container phenomenon is part of the ’Darwinian’ nature of the IT business. Containers make VMware ’less relevant’—not next week, not next month, but over time, according to Whitman.

’As containers grow in importance I think the VMware asset will be not as strategic an asset, and it may actually shrink over time,’ she said.

But VMware is actually positioning itself to capitalize on Docker’s success. To that end, a broad refresh of its portfolio—vSphere, VSAN, vRealize Automation—now supports containerized workloads. And in October, VMware released Photon Platform, a cloud-building solution that supports Docker containers and delivers the Kubernetes container orchestration engine.

’Very simply, we see it actually not as a threat but as an opportunity,’ Paul Fazzone, general manager for cloud native apps at VMware, told CRN. Containers are just another type of workload, he said, that requires compute resources, networking connectivity, back-end storage, management and visibility—many of the same components that VMware enabled to make virtualization an enterprise-ready solution.

’We’re looking at how enterprises are going to leverage this technology, using our knowledge and history and expertise about enterprises taking modern applications into production,’ he said. ’We have a portfolio that we’re rapidly expanding capabilities for so they can be container-native solutions.’

The commitment of all those tech giants should be a sure sign that the channel, at the very least, must understand the implications of the technology to avoid being disrupted. ’If you’re a service provider or a VAR dealing with any element of the software stack, from delivering applications to reselling infrastructure capabilities, tools, storage, networking, data center solutions, containers need to be the number one thing you’re learning about,’ Williams said.

Golub said between 40 percent and 70 percent of all enterprises are now running Docker containers, to some degree, in production. But while the technology has gone from fringe to mainstream, Docker recognizes there’s still a lot of confusion in the channel about its products, the ecosystem, and the technology’s implications for legacy practices.

Demystifying Docker

In technical terms, containers are instances created by a method called operating system-level virtualization. Docker’s core technology is an engine that creates a standardized, consistent, platform-independent environment—Docker likes the analogy of a shipping container—that encases a piece of software running on top of the Linux operating system. With that container acting as a barrier between application and OS, multiple applications can share the same Linux kernel while remaining fully isolated, so they can’t blow each other up or create mutual security vulnerabilities.
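One quick way to see the shared-kernel point in practice is to ask containers built from different Linux distributions for their kernel version and compare it with the host’s. The sketch below is illustrative only: it assumes a Linux host with a local Docker Engine and the Docker SDK for Python, and the images chosen are arbitrary.

```python
# Sketch: containers from different distros still report the host's kernel,
# because OS-level virtualization shares one Linux kernel across all of them.
import platform
import docker

client = docker.from_env()

# Kernel of the machine running this script (matches the daemon's kernel on Linux).
host_kernel = platform.uname().release
print("host kernel:     ", host_kernel)

for image in ("alpine", "ubuntu"):  # two unrelated userlands, same kernel underneath
    out = client.containers.run(image, "uname -r", remove=True)
    print(f"{image} container:", out.decode().strip())
```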

The concept is nothing new—the granddaddy of such technologies, the Unix chroot system call, dates to the late 1970s. Other implementations appeared over the years, from Solaris Zones to LXC, all striving to achieve secure application isolation without a hypervisor.

Virtualizing only the OS — rather than the entire stack — eliminates the need for a bulky VM to be created as a host for every individual application, said Avi Cavale, CEO of Shippable, a continuous deployment platform for Docker containers.

’When you create a VM, you’re creating redundant copies of the operating system on the physical machine,’ Cavale said. ’A lot of the overhead is spent just running those many operating systems.’

Containers drastically decrease the load on system resources to just what’s needed to run the application they encapsulate. For many of the same reasons containers are lightweight, they’re also extremely portable. An app packaged in a container should run on a developer’s laptop exactly as it does in production on bare metal or inside a virtual machine. That portability facilitates hybrid cloud architectures.
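That build-once, run-anywhere property comes from packaging the application and its dependencies into an image. As a rough illustration, and assuming a local Docker Engine plus the Docker SDK for Python, a trivial image could be built and run as shown below; the Dockerfile contents and the image tag are made up for the example.

```python
# Sketch: build a tiny image once, then run it the same way on a laptop,
# a bare-metal server or a cloud VM. The Dockerfile and tag are illustrative.
import io
import docker

client = docker.from_env()

dockerfile = b"""
FROM python:3-alpine
CMD ["python", "-c", "print('same behavior on any Docker host')"]
"""

# Build the image from the in-memory Dockerfile.
client.images.build(fileobj=io.BytesIO(dockerfile), tag="demo/portable-app", rm=True)

# Running the image yields the same result wherever the image is present.
output = client.containers.run("demo/portable-app", remove=True)
print(output.decode().strip())
```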

Some of the largest practitioners of distributed computing lean heavily on those attributes. Google runs all its online services in containers. And Samsung bought Joyent, a container-native cloud provider that will eventually host all of Samsung’s infrastructure.

But even though application containers have been around for decades, for most of that history the technology was nowhere near practical or easy to use.

’It was only being used in places like Google and Twitter that had highly specialized teams and tooling,’ said Golub. ’Docker democratized the technology.’

Docker’s re-introduction of Linux containers dovetailed with an emerging paradigm shift to microservices architectures and DevOps methodologies that demanded new technologies. Docker founder and CTO Solomon Hykes ’recognized it was important for developers as well as operators to have a common platform—the people building apps and running apps,’ Golub said.

In their hands, containers spurred development of a new generation of agile applications as well as the migration of legacy apps to the cloud, he said.

’I think we got our timing right,’ Golub said. ’The technology really enabled people to be more agile across multiple dimensions without having to invest massive amounts of time and money up front.’

For service provider NetEnrich, Docker’s potential was obvious as soon as the company began exploring the use cases the technology could address, according to Manas Behera, senior director of engineering. Docker’s technology enabled NetEnrich to automate the deployment of applications ’with lightweight, portable, self-sufficient containers that can virtually run anywhere,’ he said. ’It became much easier to manage this humongous application with millions of people using it.’

Nebulaworks’ Ciborowski first glimpsed the potential of containers in 2005, when Sun released Solaris 10, which included the container implementation called Solaris Zones.

’For me, that pretty much changed the way I looked at how applications should be deployed in the data center,’ he said. ’Problem was developers never picked up on it. The Zones thing never really took off.’

Almost a decade later, Ciborowski was attending a Cloud Foundry convention but popped into another conference across the street called DockerCon. ’That day I said, "This is Solaris Zones at the iTunes store. Trust me, this is going to be huge."’

A New Ecosystem

Solution providers contemplating a dive into container tech are often overwhelmed and perplexed by a rapidly evolving ecosystem that extends well beyond Docker’s core open-source container standard.

’For such a new and small community, there sure is a lot of choice and a lot of confusion,’ Jeff Dickey, chief innovation officer at Redapt, a Docker systems integration partner based in Redmond, Wash., told CRN. ’You have to know the whole ecosystem to know who plays well together,’ he said. ’And there’s kind of a steep learning curve. It’s a small and tight ecosystem. Everybody knows each other.’

Docker’s technology partners have built solutions that extend, and sometimes compete with, its own portfolio, realizing the founder’s vision of an expansive and open community. ’Solomon [Hykes] recognized it needed to be easier, portable, and you can’t do it as a single company and technology. You really need an ecosystem around it,’ Golub said.

The startups that hitched their wagons to Docker recognized containers could spur a revolution in application development, Shippable’s Cavale said.

’The reason why everybody partnered with them was because every single company, the way they did their business, was going to change,’ Cavale said.

Developers, IT administrators and solution providers were eager to flirt with Docker’s technology even before that ISV ecosystem matured, said Aater Suleman, CEO of Flux7, a Docker systems integration partner based in Austin, Texas. But scaling containerized infrastructure across disparate data center environments was unheard of in those early days, limiting the enterprise potential.

’At that time nobody knew how to manage a thousand containers,’ Suleman said.

A comprehensive stack of components would be necessary for deploying, managing and operating containers in production, from the OS, to networking, storage, and cluster orchestration, said Wei Dang, head of product at CoreOS, an early Docker proponent.

’Companies need to run infrastructure and apps at scale, and maintain the flexibility to iterate very quickly based on the demands of the business,’ Dang told CRN. ’Containers are just the first step to enable that transition.’

CoreOS began developing a lightweight Linux distribution optimized for running containerized apps and would later become a competitor, challenging Docker as the de facto standard by releasing Rocket, a container runtime focused on secure isolation and composability. While Rocket hasn’t done much to disturb Docker’s market share, the real competition between those companies, and several others, is happening up the stack.

To run apps at scale requires the ability to orchestrate interconnected groups of containers across clusters of server nodes. On that front, Docker released a native solution called Docker Swarm.
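As a rough illustration of what orchestration adds beyond running individual containers, the sketch below uses the Docker SDK for Python to turn a single engine into a one-node swarm and request a replicated service. The service name and replica count are arbitrary, and a real deployment would span multiple nodes.

```python
# Sketch: minimal Swarm orchestration from Python. Assumes a local Docker Engine
# that is not already part of a swarm; names and replica counts are arbitrary.
import docker

client = docker.from_env()

client.swarm.init()  # turn this engine into a single-node swarm manager

# Ask the orchestrator for three replicas of an nginx service; Swarm decides
# where they run and restarts them if they die.
service = client.services.create(
    "nginx:alpine",
    name="demo-web",
    mode=docker.types.ServiceMode("replicated", replicas=3),
)

# Each task is a container the orchestrator has scheduled somewhere in the cluster.
for task in service.tasks():
    print(task["Status"]["State"], task.get("NodeID"))

# Tear down the demo service and leave the swarm.
service.remove()
client.swarm.leave(force=True)
```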

CoreOS, which counts Google Ventures among its major investors, countered with the first commercial distribution of Kubernetes, the orchestration engine Google developed and open-sourced, drawing on the systems it uses internally to run the world’s largest container clusters.

Other offerings are vying for that market, including Mesos, an open-source technology commercialized by Mesosphere that powers some massive container deployments at places like Twitter, eBay and Airbnb.

’Orchestration has matured to the point where people are comfortable with large-scale workloads. That’s why there’s a huge spike in container use,’ Suleman said, noting Flux7 has seen its Docker-based project revenue triple in the last year. The success of container orchestration engines like Swarm, Kubernetes and Mesos has again pushed the battlefield up the stack, where solutions demanded by enterprise customers lend themselves especially well to channel monetization.

’The reality is when you start getting into larger organizations, they are going to have hundreds, maybe thousands of clusters, over time. So that’s where another layer of technology gets developed, which is container management,’ Williams said.

Rancher Labs’ open-source cluster management platform competes with Docker’s flagship commercial product, called Docker Datacenter. CoreOS sells Tectonic for managing Kubernetes clusters, and other offerings are hitting the market, including services from the hyper-scale public cloud providers, such as Amazon EC2 Container Service.

’Orchestration opened up the floodgates,’ Suleman said. ’And now the management layer is solving new problems.’

Disrupting The Data Center

Rancher Labs released the 1.0 version of its container management platform in March and launched a channel program in October, more than a year earlier than planned. Everyone from global systems integrators to small consultancies to managed cloud service providers has already joined the program, each envisioning different benefits from the technology, Williams said. Containers are ’going to be as impactful to service providers as the two waves that preceded it: cloud computing and virtualization,’ he said.

Within five years, and likely sooner, Docker expects most new applications to be built and deployed using containers. ’The vast majority of data centers will be using Docker and containers as the way they run applications, as opposed to pure VMs,’ Golub said.

But even the most avid Docker boosters say containers and virtual machines don’t have to be mutually exclusive. Containers aren’t likely to sweep away VMs, Golub said, but they will take over many of the use cases that currently call for VMs, such as shuttling apps across hosting environments.

’You still have the flexibility to use whatever you want underneath the containers,’ said Redapt’s Dickey. ’You’re free to use VMware infrastructure, OpenStack, AWS and Google. You don’t have to retool your application. You develop it once and it runs anywhere.’

Shippable’s Cavale also doesn’t think containers will muscle VMs out of the market. But everyone involved in IT operations needs to consider the implications, and the evaluation window is shorter than it was the last time the data center was disrupted.

’If you follow the life cycle of virtual machines and how it was adopted, at what point they became ubiquitous and went into production, it was about seven or eight years,’ he said. ’It took VMs several years. It took Docker 18 months.’

Nebulaworks’ Ciborowski said his peers haven’t yet missed the boat. Those that want to catch up, however, should understand the money’s not in reselling licenses, but in consulting and managed services.

’I think we’re going to see a new generation of reseller,’ he said. ’Not only do they become public cloud brokers, but they are then leveraging relationships working with companies like ours that have consulting services.’

Docker is acutely aware of the importance of implementation partners and has turned its focus to crafting a channel-friendly business model, Golub said. The startup, to date, has just more than 20 resellers worldwide formally signed to its nascent program, including giants like Booz Allen, Accenture, HPE and IBM. But there are many more consultancies out there building solutions with the open-source product.

His message to solution providers: Ignore containers at your own peril. ’There’s a huge opportunity for the channel to build great businesses,’ he told CRN, ’or a great risk if they don’t embrace it.’