The data center industry is a major consumer of electricity, but new technologies and design concepts are being employed by data center owners and operators to decrease the cost of powering and cooling those structures.
The amount of power consumed by data centers in the U.S. and around the world continues to grow, but not as fast as previously estimated, according to a recent study sponsored by the New York Times.
According to the study, total data center power consumption from servers, storage, communications, cooling, and power distribution equipment accounted for between 1.7 percent and 2.2 percent of total electricity use in the U.S. in 2010.
This was up from 0.8 percent of total U.S. power consumption in 2000 and 1.5 percent in 2005. However, it is down significantly from the 3.5 percent previously expected, based on historical trends.
The lower-than-expected growth in data center power consumption stems from a leveling off of the server installed base, not from operational improvements or new technologies. Going forward, the server installed base is not expected to grow, according to the study.
More research is needed to understand the impact on data center power consumption of increasing storage capacity, the adoption of cloud computing, higher server processing power, and the percentage of servers that are powered on but not being used, the study's author said.
However, all these factors are important considerations when designing and operating a data center.
John Snider, CEO of NOVA, an Albuquerque, N.M.-based operator of data centers for the U.S. Department of Defense, said that increased virtualization and cloud computing, while decreasing the number of servers needed for a given operation, can actually lead to increased power consumption as customers look to put more processing and storage capability into smaller spaces.
"We're seeing exponentially higher computing power as servers get more dense without making them more efficient," he said. "So consumption is going up."
Philip Fischer, data center business development manager at APC by Schneider Electric, said that while increased virtualization can cut back on the number of servers needed, it can also lead to lower data center efficiency.
"UPS and cooling equipment have the greatest efficiency when working at full load," he said. "If you decrease the load, efficiency may drop 5, 10, or 15 percent. But it's still a good thing, as total power use is reduced. So this begs the question: Is decreasing efficiency always a bad thing?"
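Fischer's point can be sketched with a few lines of arithmetic. The loads and efficiency figures below are illustrative assumptions, not numbers from APC; the function names are hypothetical:

```python
# Illustrative only (assumed figures, not from APC): compare total facility
# power before and after virtualization shrinks the IT load.
def total_power_kw(it_load_kw: float, infrastructure_efficiency: float) -> float:
    """Total facility power = IT load divided by the efficiency of the
    power and cooling chain (UPS, distribution, air conditioning)."""
    return it_load_kw / infrastructure_efficiency

before = total_power_kw(100.0, 0.90)  # full load, UPS/cooling near peak efficiency
after = total_power_kw(60.0, 0.80)    # lighter load, efficiency down ~10 points

print(round(before, 1))  # 111.1 kW
print(round(after, 1))   # 75.0 kW -- total power still falls
```

Even though the infrastructure runs less efficiently at partial load, the smaller IT load means the facility as a whole draws less power, which is Fischer's answer to his own question.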
Transitions In Data Center Design
Proper design, in which power consumption factors are addressed from the beginning, has the biggest impact on the cost of running a data center.
How well a data center is designed shows in its power efficiency, as measured by PUE, or Power Usage Effectiveness. PUE is the ratio of the total power used by a data center to the power used to run its IT equipment. A PUE of 2.0 means that, for every kWh of power used by IT equipment, another kWh is needed to run the data center infrastructure.
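The PUE calculation is simple enough to sketch directly. The energy figures below are made-up illustrations chosen to match the 2.0 and 1.3 values cited in this article:

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    the energy consumed by the IT equipment alone."""
    return total_facility_kwh / it_kwh

# A facility drawing 2,000 kWh overall for 1,000 kWh of IT load has PUE 2.0,
# typical of the older buildings described below:
print(pue(2000, 1000))  # -> 2.0

# A newer design drawing 1,300 kWh for the same IT load reaches PUE 1.3:
print(pue(1300, 1000))  # -> 1.3
```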
Improved designs mean that newer data centers in general are much more energy efficient than older ones, said Dave Leonard, senior vice president of data center operations at ViaWest, Denver, which operates 22 data centers and rents data center space to customers.
"Older data centers we acquire have a PUE of about 2.0," he said. "But newer-designed building PUE falls to about 1.3 over time. So there's a big difference between different building generations."
Taking Advantage Of New Designs
That difference can be minimized with a redesign, Leonard said. "We're now studying one of our building's airflow to see whether we can shut off up to 11 of the 56 cooling systems. Depending on the cost and the amount of power company rebates, and the final bill, our initial estimate is that the investment involved is justified by a three-year to five-year payback."
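The payback estimate Leonard describes is a standard back-of-envelope calculation. All dollar figures below are assumptions for illustration, not ViaWest's numbers:

```python
# Back-of-envelope retrofit payback sketch; all figures are assumed
# illustrations, not from ViaWest.
def payback_years(capital_cost: float, annual_savings: float,
                  rebates: float = 0.0) -> float:
    """Years to recoup a retrofit: net upfront cost divided by the
    yearly energy savings it produces."""
    return (capital_cost - rebates) / annual_savings

# A $400,000 airflow retrofit with $50,000 in utility rebates,
# saving $100,000 per year in cooling power:
print(payback_years(400_000, 100_000, rebates=50_000))  # -> 3.5
```

As Leonard notes, the result is sensitive to both the final bill and the power company rebates, which is why the estimate spans three to five years.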
The other aspect of design is working with customers before they move into a data center, said Mike Duckett, president of CoreLink Data Centers, a Mount Prospect, Ill. owner and operator of five data centers.
When customers work closely with data center providers on layouts, it is much easier to be efficient, he said. "The more we understand a customer's plans, the easier it is for us over time to help guide them from space, cooling, and green perspectives as new technologies become available," he said.
With all the possible technologies, the key to designing a data center is to make sure flexibility is built in from the start, said Edward Henigin, CTO of Data Foundry, an Austin, Texas-based data center owner.
"You never know what customers will need," he said. "Retail shops to life-saving organizations to airlines, everyone is a little different. They may come in with IBM gear and no cabinet, or with HP gear in a non-HP cabinet, or use our cabinets. So we have to accommodate anything."
That "anything" might be one cabinet pulling 2kW sitting next to one pulling 15kW, requiring "chimneys" around the racks to move hot air directly into the ceiling air flow, Henigin said. "It's pretty neat when you don't know what's coming in the door next," he said.
On the equipment side, vendors are doing more to help data center owners and solution providers reduce power consumption.
APC by Schneider Electric is offering solution providers modular integrated power systems for the facilities and the equipment, making it possible for them to develop repeatable solutions, Fischer said.
The company is also providing tools to manage a data center's entire power consumption, along with tools to simulate conditions before and after proposed changes. "We help partners answer the 'what if' questions," he said.
Chris Loeffler, program manager at Eaton, said that several new power-saving technologies going into Eaton's own data centers are starting to become available to partners.
For instance, Eaton is now deploying 400/230-volt power, which is popular in Europe and Asia, because it is more efficient than the standard 480/208-volt power traditionally used in the U.S., Loeffler said. "The problem is, there's a lot of legacy equipment," he said. "If it's a greenfield installation, customers can deploy 400/230-volt. Cloud infrastructure providers are starting to deploy it."
Eaton is also increasing the intelligence of software to monitor power use from the generator to the cooling systems, providing the ability to adjust system performance when possible to reduce air conditioning loads, Loeffler said.
The company has also added the ability to meter actual power use down to the individual outlets on UPS and power distribution units, which lets data center operators bill customers according to actual power use. "They can use the data to force external and internal customers to recognize that power isn't free," he said.
Several new technologies are being employed as new data centers are built and older ones remodeled, but their effectiveness is often subject to debate.
Better Ways To Keep Cool
One new technique for improving data center cooling efficiency is eliminating the raised floor space which has been used for decades to bring cool air to equipment racks, and replacing it with overhead cooling.
Snider said this is a move whose time has arrived. "Think about it--heat rises," Snider said. "So why push cold air through the floor?"
Snider said he believes that, with the right education, raised floors will no longer be designed into data centers within five years. "The problem is, people like me who know networking are doing the designs, not the experts on the mechanical side," he said.
If one thinks only that cool air sinks and hot air rises, delivering cool air from overhead instead of through the floor might seem reasonable, Henigin said. The problem with that concept, however, is that cool air mixes with hot air rather than sinking en masse, he said.
"We like raised floors because they provide fine-grained control of cold air delivery," he said. "We use perforated floor tiles to provide 15 percent to 56 percent air flow, and put dampers in the tiles to control the amount of cold air released. You can't do that kind of control with overhead air."
Data centers are also starting to experiment with ways to cut back on air conditioner use by running IT equipment at higher temperatures than in the past.
Traditional data centers have focused on keeping air temperatures at around 68 degrees Fahrenheit, but operators have been trying to push that temperature to 70 degrees or 75 degrees.
They have been helped in part by equipment vendors who are providing more leeway in terms of what temperatures are suitable for their equipment. Dell, for instance, was the first vendor to say its servers can operate in temperatures of up to 113 degrees Fahrenheit.
ASHRAE, the American Society of Heating, Refrigerating and Air-Conditioning Engineers, has said that inlet temperatures can now safely be as high as 80 degrees, Leonard said. However, making such a change is difficult in a multi-tenant environment.
"If it's an enterprise data center serving one customer, you only have to convince one guy that it's more energy efficient to run at 78 degrees than at 70 degrees," he said. "But we deal with multiple customers. We can't make the change so fast."
Raising temperatures could have negative impacts if not handled correctly, Loeffler said. For instance, small UPS units with lead acid batteries will work in higher temperatures, but the batteries may need to be replaced more often. "Bigger data centers have larger systems with batteries placed in separate rooms, making it easier to keep them cool," he said.
Another new technology making it into data centers is free cooling, or bringing cool outside air into the data center to replace part of the air conditioned air.
Snider, whose company is building a new data center in Albuquerque, N.M., said that location is ideal for free cooling because of its 5,000-foot elevation. "Many parts of the country can use free cooling," he said. "Companies can filter the air and get free cooling eight months out of the year."
CoreLink is currently implementing free cooling in its data centers to cut operating costs by 30 percent to 40 percent per year and be more price competitive, Duckett said. "Technology solutions are needed for cooling, but we can always count on fall, winter, and spring," he said.
What About Going Green?
Data center owners and providers cited other technologies to help cut power consumption.
Snider said new technologies let customers dial down their server processing rates, and even turn servers off completely, when less performance is needed. "If you turn down the power to 60 percent, that's a 40 percent energy saving," he said. "Software combined with monitoring tools can be used to dial back performance."
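The 60/40 arithmetic Snider cites can be extended to a rough annual dollar figure. The load and electricity rate below are hypothetical assumptions, not from the article:

```python
# Hypothetical extension of the 60/40 power-capping arithmetic above;
# the 50 kW load and 10 cents/kWh rate are assumed, not measured.
def annual_savings(full_load_kw: float, cap: float,
                   cents_per_kwh: float) -> float:
    """Dollars saved per year by capping server power at `cap` (a fraction
    of full load), assuming the cap holds around the clock."""
    saved_kw = full_load_kw * (1.0 - cap)          # power no longer drawn
    saved_kwh = saved_kw * 24 * 365                # energy over a year
    return saved_kwh * cents_per_kwh / 100         # convert cents to dollars

# A 50 kW server row capped at 60 percent power, at 10 cents/kWh:
print(round(annual_savings(50.0, 0.60, 10.0)))  # -> 17520
```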
Jim Wolford, owner and CEO of Atomic Data, a Minneapolis provider of data center services from facilities leased from large carriers, said that for some of his customers, dropping a simple item like a heavy plastic supermarket backroom divider into a cold aisle to separate hot and cold air can cut air conditioning costs by 7 percent.
To Go "Green" Or Not
While reducing power consumption helps cut a company's carbon footprint, data center customers may or may not feel a need to present a "green" face to the public.
For a big company like Google, which according to the New York Times-sponsored study uses about 1.9 billion kWh in its own data centers, or about 0.8 percent of all the world's data center power consumption, being seen as "green" is important.
Greenpeace, in a report it issued in April, wrote that Google is among the top two users of clean, alternative energy, including wind power, solar energy, hydropower, bioenergy, geothermal power, and marine or ocean wave power, in its data centers.
But for smaller companies, the focus is on any activities which can cut their own costs and not on helping data centers be "green," Duckett said.
"SMBs rarely if ever talk 'green,'" he said. "But Fortune-500 companies typically like to work with providers who meet their corporate goals. Most of them have 'green' initiatives. They look at data center operations, and ask what we are doing to stay efficient."
Wolford said customers' shifts towards virtualization and away from colocating hardware means that his company's focus is moving away from helping customers save power to cutting his own power consumption.
"We look at the greening of the data center, but for us it's an internal issue," he said. "We are looking at controlling our costs, not at helping customers control costs. The customers in this case don't care about power consumption as much. We do."