Talking About Trends

GreenPages Technology Solutions CTO Chris Ward addressed the major trends, opportunities and challenges he is seeing in the data center and cloud markets today in a keynote that kicked off the 2014 CloudScape Summit in Portsmouth, N.H. From network virtualization and falling cloud prices to solid state drives, Ward said there are all sorts of changes happening in the market for IT professionals to look at and capitalize on. Here are 20 trends he sees in the data center and cloud markets today, along with his tips for success amid the changes.

Network Virtualization And Policy

Network virtualization helps add intelligence to the network and abstract it from the physical hardware. What that allows clients to do, Ward said, is program the network to prioritize a specific application that is more important at that point in time and optimize traffic to that application, with the flexibility to shift that focus to another app as needs change. That makes network virtualization and management a single cohesive unit, regardless of the hardware it resides on.

"Things that we never would normally think of doing because of manpower and legwork needed to do it, network virtualization and policy will allow us to do that," Ward said.

There is a "street war" going on in the industry around network virtualization, he said, with Cisco and VMware battling it out for the top spot. However, Ward said the two vendors are "apples and oranges" when it comes to network virtualization, so it will ultimately come down to the client and which solution or solutions best fit their needs.

40Gb And 100Gb Are Becoming Mainstream

Ward joked that when he stood on the same stage at GreenPages' 2008 conference and asked attendees how many of them had adopted 10Gb networking in their data centers, not many hands came up. At that time, he said, it was hard to imagine ever filling that pipe, let alone needing more. Today, however, that is a very different story, he said.

"What we're seeing today is you absolutely are able to fill that pipe in a pretty massive way," Ward said. "... if you're architecting a data center or new environments, it's something to keep an eye on. It's certainly there and ready today."

Active/Active Data Centers

The idea behind an active/active data center is for clients with multiple data centers, whether on a single campus or across the globe, to use the full potential of those data centers instead of leaving one sitting idle as a backup in case of a disastrous event. The solution provider had theorized about engagements like this in the past but had never put them into practice, Ward said. With today's falling costs, he said, it is a possibility for many more companies, so he expects, and already sees, more clients taking steps to get the efficiency of all of their data centers.

"Unlike in the past, it's not necessarily a seven figure engagement anymore. We can do this at a reasonable cost as well. It's not something just for the Future 100, like in the past," Ward said.

Solid State Is Everywhere

While most companies used to reserve solid state drives for specific use cases that required high performance, Ward said he is beginning to see solid state drives gaining traction in other areas. The difference, he said, is that before, cost was prohibitive. Today, companies can put solid state drives into their storage for roughly what cheaper spinning disks cost a year ago. Ward said some key vendors to watch as this trend evolves are EMC, SolidFire, Nimble and Pure Storage.

Software Defined Storage

While Ward said he could argue that Software Defined Storage has been around for some time, he said it is gaining more prominence as a way to put more intelligence behind storage. While cloud vendors can guarantee resources such as CPU and memory for their clients, those guarantees were never architected into the SAN side of cloud solutions. Now, he said, the technology exists to make that happen, just as software-defined networking did for the network.

"Just like with the networking, we're able to make the storage arrays and backend storage more intelligent," Ward said.

Consumption Continues To Expand

As trends such as big data, social and mobile continue to expand, they are driving data consumption at a massive level. It isn't about gigabytes of data, or even petabytes, he said. Now companies are talking in exabytes of capacity, an exabyte being 1 billion gigabytes. For example, Ward cited numbers from EMC: the storage vendor took from 1991 to 2005 to ship its first exabyte of storage, but in 2013, for the first time, EMC shipped more than an exabyte of storage in less than a month.

"This is the definition of an exponential curve in storage growth," Ward said. "The question becomes, what are you going to do with all of this?" Ward said.

Now that companies have all of this data at their fingertips, he said, they have to decide whether they want it to reside on an internal data center or in the cloud.
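The scale jump Ward describes can be sanity-checked with quick arithmetic. The timeline below uses the EMC figures he cited; treating each period as exactly one exabyte is a rounding assumption for illustration:

```python
# An exabyte (10**18 bytes) works out to one billion gigabytes (10**9 bytes each).
exabyte_in_gb = 10**18 // 10**9
print(exabyte_in_gb)  # 1000000000

# EMC took 1991-2005, roughly 168 months, to ship its first exabyte;
# in 2013 it shipped more than an exabyte in under one month.
months_for_first_exabyte = (2005 - 1991) * 12  # ~168 months
rate_increase = months_for_first_exabyte / 1   # vs. under one month in 2013
print(f"Shipping rate grew by at least {rate_increase:.0f}x")
```

Even under these rough assumptions, the shipping rate grew by more than two orders of magnitude, which is what makes the curve exponential rather than merely steep.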

Servers Getting More Horsepower

Ward said that servers are getting more and more horsepower behind them. Citing GreenPages' own numbers, he said that in 2009 the solution provider sold, on average, a server with 4 to 6 cores per socket and 96GB of RAM, with a maximum of 192GB. Now, the company sells servers with 10 to 12 cores per socket and 256GB of RAM, with a maximum of 768GB.

To take advantage of this trend, Ward advises that companies take a look at their refresh cycles and consider moving from a five-year cycle to a three-year cycle. When licensing is based on the number of sockets, he said, companies might be able to cut costs, or even pay for a full refresh in savings, by cutting the number of sockets in half.

"It can make a lot of sense to keep up to date with the latest and greatest on servers," Ward said.

Converged/Hyper-Converged Infrastructure

With converged infrastructure, Ward said that solution providers and clients can skip over the step of using a best practices "cookbook" to match up servers and networking equipment. Instead, it comes out of the box ready to go, he said.

"Your time to market or your time to value of the investment shrinks quite drastically," he said.

Hyper-converged infrastructure simplifies that even further, he said, by taking the challenge of multiple vendors out of the equation. Servers can be tied together to build virtual storage without the need for a SAN. Ward said a hyper-converged infrastructure solution can make a lot of sense in many cases, especially when an environment scales linearly.

Physical Or Virtual?

The answer to whether a company should have physical or virtual data centers, Ward said, is that it depends on the company. There are cases where a physical environment can be better than a virtual one, such as big data and Hadoop workloads with heavy memory and compute requirements. However, he said that new technology, such as HP's Moonshot, is helping create a truly physical desktop hosted in the data center, which he said cuts out the slight performance penalty that virtualization imposes.

The Need For Visibility And Control

The way networking and storage become more intelligent, Ward said, is by understanding what is going on through visibility, and then optimizing the traffic and environment from there. For example, with the rise of DevOps, IT needs the visibility and control to provide internal developers with the tools, controls and flexibility they need to build robust solutions. A lot of internal environments aren't built for speed, he said, and instead focus more on things like resiliency. However, if internal IT can automate and build capabilities that mimic the flexibility developers get by going to Amazon, they can keep that work internal. That comes down to visibility and control, he said.

Where Are You Spending Your Time?

Ward said that studies have found that IT professionals spend 70 to 80 percent of their time simply keeping the lights on.

"Is that where you want to spend your time? Or is your time better spent on more strategic initiatives?" Ward asked the audience. Instead of just keeping on the lights, Ward said that IT professionals should be spending their time focusing on making the business more successful and driving more revenue for the company. IT professionals can find more time for that by outsourcing some of their daily "mundane" tasks to solution providers or through using tools that help manage the process more efficiently.

Keeping Skills Up To Date

Ward said that IT leaders should take a look at the people spending their days patching and maintaining systems and ask if that is the best place for their skills. He said that organizations should look into training those employees on cloud architectures, management and orchestration. Using tools such as automation, organizations can give those employees the flexibility they need to become IT innovators instead of simply keeping the lights on.

Mobility A Growing Priority

Even though less than half of the hands in the room went up when Ward asked which companies have a mobility strategy, he said it is a growing factor for IT professionals today. From BYOD and mobile device management to mobile application management, IT professionals have their hands full figuring out which control and enablement measures they need to put in place. Depending on their situation, companies might need a combination of tools, or even none at all.

"There's a lot of new technology out there and there can be a lot of confusion as well," Ward said.


When Ward asked which audience members' companies had moved to a 100 percent virtualized environment, very few hands went up. While they might not be ready or interested in making the full jump, Ward said IT should make sure to take a look at Desktop-as-a-Service as an option. For clients that are working to move more of their applications into Software-as-a-Service, Ward said that Desktop-as-a-Service is a good option for those clients instead of VDI, as it is much less expensive despite a storage bottleneck for VDI slowly disappearing.

"It's something to consider, especially if you're considering moving other applications into the cloud," Ward said.

Cloud Prices Dropping

Prices for public cloud continue to fall, with Amazon alone making 42 price reductions since 2008 and others, such as Google and Microsoft Azure, following suit. However, Ward warned that just because public cloud prices are dropping doesn't mean the cloud is actually cheaper than the environment clients already have. Most customers, he said, don't actually have data on what their current environments cost, which he compared to flying with blinders on. Clients should make sure they have a point of comparison before jumping into the cloud, he said, because in some cases it could end up costing more than their current environment.
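Ward's point about needing a baseline can be sketched as a toy cost model. Every input below is a hypothetical placeholder, and a real comparison would also fold in licensing, cooling, staff time and data-egress fees:

```python
# Toy monthly cost comparison: on-prem VM vs. an on-demand cloud VM.
# All figures are illustrative assumptions, not real pricing.
def on_prem_monthly(hw_cost, amort_months, power, admin):
    """Amortized hardware plus recurring power and admin overhead."""
    return hw_cost / amort_months + power + admin

def cloud_monthly(hourly_rate, hours=730):
    """On-demand instance left running all month (~730 hours)."""
    return hourly_rate * hours

internal = on_prem_monthly(hw_cost=9000, amort_months=36, power=40, admin=120)
cloud = cloud_monthly(hourly_rate=0.45)

# Without the internal number, there is nothing to compare against.
cheaper = "cloud" if cloud < internal else "on-prem"
print(f"on-prem ${internal:.2f}/mo vs cloud ${cloud:.2f}/mo -> {cheaper}")
```

Flip any one input (say, a cheaper amortization or an always-on workload) and the answer can reverse, which is exactly why Ward says to gather the internal numbers first.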

Cloud Adoption Spreading

With new Software-as-a-Service solutions emerging and expanding their capabilities, Ward said he is starting to see the customer base for cloud solutions growing. Verticals such as financial services and healthcare, which have historically turned down the cloud for security and regulatory reasons, are starting to evaluate the cloud as a potential option or even making the jump, he said. For example, FINRA, an independent financial regulatory group, recently adopted Amazon EC2.

"A lot of the security and regulatory concerns are starting to fade away there," Ward said.

Hybrid Cloud Is Obvious Winner

In an extremely competitive cloud market, Ward said that hybrid cloud is emerging as the clear winner for the majority of clients.

"The odds that you are going to be 100 percent cloud are going to be very slim, but the odds that you are going to have something in the cloud are very, very high," Ward said.

For that reason, Ward said it is important that solution providers and IT professionals embrace both public and private cloud and offer a hybrid solution as part of their portfolio.

Cloud Enabling Split of Control And Data

Data and management don't have to be housed under the same roof in the cloud, Ward said. He is seeing this trend across several companies and applications today, with companies taking advantage of the management already built into the public cloud to manage data in the private cloud.

Embracing Shadow IT

While Shadow IT has long been seen as a negative for IT professionals, Ward is seeing a shift toward embracing Shadow IT and said it's actually a good thing. That's because IT professionals can use Shadow IT to their advantage, he said. Previously, line-of-business employees would come to IT with a problem they wanted solved, and IT would spend hours, days or weeks finding an appropriate solution. With Shadow IT enabled, he said, employees can go out and find the solutions they want, vet them and design them as they need, and then approach the CIO with both a problem and a proposed solution. That cuts out a lot of the legwork for CIOs, who can then focus on integrating the solution into the existing environment.

New Ways To Adopt Cloud

There are many more ways for customers to move into the cloud nowadays, Ward said. Before, client options were Infrastructure-as-a-Service, Software-as-a-Service or Platform-as-a-Service. Now they also have what he called "X-as-a-Service," a model that solves a specific problem or use case as a service, such as Backup-as-a-Service or Disaster Recovery-as-a-Service. The benefit for clients, Ward said, is that this model aligns with their budgeting and can be implemented without purchasing general infrastructure to deploy the solution on top of, driving much more meaningful consumption.