The amount of power consumed by data centers in the U.S. and around the world continues to grow, but not as fast as previously estimated, according to a new study by Analytics Press.
The study, written by Jonathan Koomey, a consulting professor at Stanford University, and sponsored by the New York Times, found that slower growth in the installed base of servers, driven by virtualization and the 2008 economic downturn, more than offset the increase in power consumption per server over the last few years.
The study also estimated that a single company, Google, accounted for roughly 0.8 percent of all data center power consumption worldwide and 0.011 percent of the world's total power consumption.
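These two Google figures are consistent with the study's worldwide estimate, cited later in this article, that data centers consumed 1.1 to 1.5 percent of all electricity in 2010. A quick sanity check (the multiplication below is illustrative, not from the study itself):

```python
# Google's share of world electricity implied by the study's figures.
google_share_of_dc = 0.008          # Google ~ 0.8% of worldwide data center power
dc_share_of_world = (0.011, 0.015)  # data centers ~ 1.1%-1.5% of world electricity in 2010

# Multiply the two shares to get Google's implied share of world electricity.
lo = google_share_of_dc * dc_share_of_world[0] * 100  # as a percent
hi = google_share_of_dc * dc_share_of_world[1] * 100
print(f"Google ~ {lo:.4f}%-{hi:.4f}% of world electricity")
```

The study's 0.011 percent figure falls inside the implied 0.0088 to 0.012 percent range.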
Koomey, who did a similar study in 2007, took advantage of new server installed base and server sales estimates from analyst firm IDC to revise earlier projections about data center power consumption downward.
In his report on the study, Koomey said total data center power consumption from servers, storage, communications, cooling, and power distribution equipment accounted for between 1.7 percent and 2.2 percent of total electricity use in the U.S. in 2010.
This is up from 0.8 percent of total U.S. power consumption in 2000 and 1.5 percent in 2005. However, it is down significantly from the previous estimates of 3.5 percent, which assumed historical trends would continue, and 2.8 percent, which assumed power-saving technologies would be adopted.
The 2007 predictions of U.S. data center power consumption were based on a report that year from the Environmental Protection Agency. The lower end of the EPA's projected range assumed that increased virtualization and wider adoption of technology to cut server power consumption would hold consumption down, Koomey wrote.
Worldwide data center power consumption trends were similar to those of the U.S. Koomey wrote that the world's data centers consumed an estimated 1.1 percent to 1.5 percent of all electricity used in 2010, up from 0.5 percent in 2000 and 1.0 percent in 2005, but down from the 1.7 percent to 2.2 percent previously estimated.
The key factor behind the lower-than-expected data center power consumption is slower growth in the server installed base than earlier projected, Koomey wrote.
Using IDC estimates of the server installed base and server sales, Koomey estimated that the total U.S. installed base in 2010 was 11.5 million volume servers, 326,000 midrange servers, and 36,500 high-end servers. That is significantly lower than projections from four years ago of an estimated 15.4 million volume servers, 326,000 midrange servers, and 15,200 high-end servers, Koomey wrote.
Koomey also said the 2007 report assumed a Power Usage Effectiveness (PUE) of 2.0, which means that for every 1 kWh (kilowatt-hour) of power used by a server, another 1 kWh is needed to run the data center infrastructure, such as cooling. Koomey estimated that the average PUE in 2010 was between 1.83 and 1.92.
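The PUE relationship can be sketched as a small calculation. PUE is the ratio of total facility energy to IT equipment energy, so infrastructure overhead is the IT load times (PUE - 1). The function below is illustrative; the 1.875 figure is simply the midpoint of Koomey's 1.83 to 1.92 range:

```python
def facility_energy(it_kwh: float, pue: float) -> tuple:
    """Total facility energy and infrastructure overhead for a given IT load.

    PUE = total facility energy / IT equipment energy, so
    total = it_kwh * pue and overhead = total - it_kwh.
    """
    total = it_kwh * pue
    overhead = total - it_kwh
    return total, overhead

# At PUE 2.0, every 1 kWh of server load requires another 1 kWh of overhead.
print(facility_energy(1.0, 2.0))    # → (2.0, 1.0)

# At the midpoint of Koomey's 2010 estimate, overhead falls to 0.875 kWh.
print(facility_energy(1.0, 1.875))  # → (1.875, 0.875)
```

The drop from PUE 2.0 to roughly 1.9 means each kilowatt-hour of server load carries about 10 percent less infrastructure overhead than the 2007 report assumed.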
"The main reason for the lower estimates in this study is the much lower IDC installed base estimates, not the significant operational improvements and installed base reductions from virtualization assumed in that scenario," Koomey wrote. "Of course, some operational improvements are captured in this study's new data. . . but they are not as important as the installed base estimates to the results."
Koomey acknowledged that his study leaves a couple of areas that require further study.