Funded by AMD and peer-reviewed by major server vendors, the report estimates that servers accounted for 1.2 percent of all electricity consumption in the United States and 0.8 percent worldwide in 2005, roughly equivalent to the power consumption of the state of Mississippi. It also found that in 2005, total data center electricity consumption in the United States, including servers, cooling and auxiliary equipment, was approximately 45 billion kilowatt-hours (kWh), for a total utility bill of about $2.7 billion; worldwide, data center electricity consumption was estimated to cost $7.2 billion annually.
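As a rough sanity check on those figures (the implied rate is an inference, not a number stated in the report), dividing the national utility bill by the national consumption gives

\[ \frac{\$2.7\ \text{billion}}{45\ \text{billion kWh}} \approx \$0.06\ \text{per kWh}, \]

about six cents per kilowatt-hour, broadly in line with U.S. industrial electricity rates of that period.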
“I expected some growth, but not quite as large,” said Jonathan Koomey, Ph.D., staff scientist at Lawrence Berkeley National Laboratory and author of the report.
The report is based on data from IDC, a Massachusetts-based research firm, which detailed the installed base of servers and new server shipments and estimated the power each class of server used. Before this report, Koomey said, the industry had no up-to-date, reliable estimate of server power consumption.
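In outline, that estimation method amounts to multiplying the installed server count in each class by that class's average power draw and by the hours in a year (the class names and the 8,760-hour year here are illustrative assumptions, not details quoted from the report):

\[ E_{\text{servers}} = \sum_{c} N_c \times P_c \times 8760\ \text{h}, \]

where \(N_c\) is the installed count of servers in class \(c\) (e.g., volume, mid-range, high-end) and \(P_c\) is the average power per server in that class.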
Companies such as AMD and Intel, however, had been treating power consumption as a priority even before Koomey’s report was available, developing technology to reduce the energy consumption of their products. HP has also conducted similar research, reaching results comparable to Koomey’s.
“The report is important to the industry,” Koomey said, “because once power consumption can be quantified, companies can make better decisions about how to reduce it and save money.”
The full report can be accessed at http://enterprise.amd.com. EC