Businesses, especially in the financial community, are increasingly reliant on information. This has sent the demand to process and store digital information skyrocketing, turning the heat up on data centers’ computer resources.

Higher speed and larger storage capacity mean more heat, driving up energy consumption for servers and for the power and cooling systems that support them. In fact, in most data centers, the cooling system consumes as much energy as the computer equipment it protects.

In a recent study on data centers, Jonathan Koomey, staff scientist at Lawrence Berkeley National Laboratory and consulting professor at Stanford University, found that electricity use attributable to servers doubled between 2000 and 2005. According to the study, by 2005, there were approximately 10.3 million servers in U.S. data centers. As power densities in server racks approach the limits of air-handling systems, companies are applying new technologies to beat the heat.

More servers mean more heat and more energy consumed by data centers: 61 billion kilowatt-hours in 2006, more than 40 times the energy of equally sized office space, according to a 2007 Environmental Protection Agency (EPA) report. In 2006, our nation's data centers spent $4.5 billion on electricity, and researchers believe energy costs could reach $7.4 billion by 2011. But the EPA estimates these data centers could save approximately $4 billion in power costs each year by adopting more energy-efficient equipment and systems.
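Those figures hang together on a back-of-the-envelope check. The short calculation below, a sketch rather than anything from the EPA report itself, shows the average electricity rate the numbers imply and how the estimated savings compare with the 2006 bill:

```python
# Back-of-the-envelope check on the EPA figures cited above.
kwh_2006 = 61e9       # data center electricity use in 2006, kWh
cost_2006 = 4.5e9     # electricity spend in 2006, dollars
savings = 4e9         # EPA's estimated annual savings potential, dollars

rate = cost_2006 / kwh_2006
print(f"Implied average rate: ${rate:.3f}/kWh")              # ~$0.074/kWh
print(f"Savings vs. 2006 spend: {savings / cost_2006:.0%}")  # ~89 percent
```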

The environment in data centers must be carefully controlled to keep the massive amounts of heat in check and prevent system malfunction. Increased heat shortens equipment life and increases component failures, especially in increasingly complex central processing unit (CPU) chips, while low humidity can cause static electricity buildup. According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Thermal Guidelines for Data Processing Environments, the recommended range for data centers is 68–75°F at 40–55 percent relative humidity. Fans and massive chiller systems extract the server heat and maintain humidity levels, and air cooling performs best when the heating, ventilating and air conditioning (HVAC) equipment receives routine maintenance.
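In monitoring terms, those ASHRAE ranges reduce to a simple out-of-band check. Here is a minimal sketch; the sensor readings and function name are hypothetical, but the thresholds are the recommended ranges quoted above:

```python
# Flag readings outside the ASHRAE-recommended envelope quoted above.
TEMP_RANGE_F = (68.0, 75.0)      # recommended dry-bulb temperature, deg F
HUMIDITY_RANGE = (40.0, 55.0)    # recommended relative humidity, percent

def check_environment(temp_f: float, rh_pct: float) -> list[str]:
    """Return a list of out-of-range warnings for one sensor reading."""
    warnings = []
    if not TEMP_RANGE_F[0] <= temp_f <= TEMP_RANGE_F[1]:
        warnings.append(f"temperature {temp_f:.1f} F outside {TEMP_RANGE_F}")
    if not HUMIDITY_RANGE[0] <= rh_pct <= HUMIDITY_RANGE[1]:
        warnings.append(f"humidity {rh_pct:.1f}% outside {HUMIDITY_RANGE}")
    return warnings

# Hypothetical reading: too warm and too dry, so two warnings print.
print(check_environment(77.2, 38.0))
```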

There are numerous ways to reduce heat and energy consumption. On the equipment side, servers can use CPUs with low watts-per-performance ratings, and uninterruptible power supply (UPS) efficiencies, typically 85 percent, are improving to nearly 95 percent. Direct current distribution systems cut losses by eliminating repeated AC-DC-AC conversions, and more precise application of cooling improves its efficiency. Though mixing water and electricity may seem counterintuitive, new liquid-cooling technologies range from water jackets that fit around servers to more radical systems that squirt specially treated nonconductive, noncorrosive water directly onto the electrical parts themselves, according to InformationWeek. Motion-sensor lights reduce heat and power costs. The less energy used throughout, the easier it is to keep these data centers cool.
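The jump from 85 to 95 percent UPS efficiency is larger than it sounds, because the losses nearly vanish. A rough calculation illustrates the point; the 500-kW IT load and the electricity rate are assumed for illustration, not drawn from the article:

```python
# Illustrative comparison of UPS losses at 85% vs. 95% efficiency.
# The 500 kW IT load and $0.074/kWh rate are assumptions for illustration.
it_load_kw = 500.0
hours_per_year = 8760
rate_per_kwh = 0.074

def annual_ups_loss_kwh(efficiency: float) -> float:
    """Energy lost in the UPS per year for a constant IT load."""
    input_kw = it_load_kw / efficiency
    return (input_kw - it_load_kw) * hours_per_year

saved_kwh = annual_ups_loss_kwh(0.85) - annual_ups_loss_kwh(0.95)
print(f"Energy saved: {saved_kwh:,.0f} kWh/yr "
      f"(about ${saved_kwh * rate_per_kwh:,.0f}/yr)")
```

For this assumed load, the higher-efficiency UPS avoids roughly half a million kilowatt-hours of loss per year, heat that the cooling plant never has to remove.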

Power management systems with energy management software allow facility managers to raise thermostat set points where appropriate. These systems provide real-time monitoring of system performance along with energy management capabilities, so managers can quickly identify and correct inefficiencies in their operations.
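The set-point logic such software applies can be sketched simply. In this hypothetical example, zone names, readings, and the safety margin are all invented; the only grounded figure is the 75°F upper bound from the ASHRAE range above:

```python
# Sketch of the set-point adjustment logic the paragraph describes.
# Zone names, readings, and margins are hypothetical.
ASHRAE_MAX_F = 75.0   # upper end of the recommended range above

zones = {"row-A": 69.5, "row-B": 73.8, "row-C": 71.2}  # current zone temps, F

def suggest_setpoint(current_f: float, margin_f: float = 1.0) -> float:
    """Suggest a higher set point while keeping a margin below the limit."""
    return min(current_f + 2.0, ASHRAE_MAX_F - margin_f)

for zone, temp in zones.items():
    target = suggest_setpoint(temp)
    if target > temp:
        print(f"{zone}: could raise set point from {temp:.1f} to {target:.1f} F")
```

Every degree of set-point headroom recovered this way reduces the work the chillers must do.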

Some companies are using airside and waterside economizers to keep their data centers cool. Airside economizers typically combine sensors with extensive filtering to admit outside air when environmental conditions allow, venting heated air and drawing cool air into the space. Waterside economizers provide cooling through evaporation, using heat exchangers to lower temperatures; they are very effective where cooling loads are high and there is little air flow, and because they bring no outside air into the space, they reduce the risk of importing airborne pollutants while lowering energy costs.
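At its core, the airside economizer's decision is a comparison of outside conditions against the return air. The sketch below is illustrative only; the thresholds are assumptions, and real controllers also weigh enthalpy, dew point, and filter status:

```python
# Rough sketch of an airside economizer's admit-outside-air decision.
# Thresholds are illustrative assumptions, not from any standard.
def economizer_enabled(outside_f: float, outside_rh: float,
                       return_air_f: float) -> bool:
    """Use outside air only when it is cooler than return air and
    within a safe humidity band."""
    cool_enough = outside_f < return_air_f - 2.0   # 2 F dead band
    humidity_ok = 20.0 <= outside_rh <= 80.0
    return cool_enough and humidity_ok

print(economizer_enabled(55.0, 45.0, 75.0))  # True: free cooling available
print(economizer_enabled(78.0, 45.0, 75.0))  # False: outside air too warm
```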

Use of alternative energy sources is another option. At Google's Mountain View, Calif., headquarters, 1.6 megawatts of solar panels generate one-third of its electricity. Although its servers aren't solar-powered, Google is considering that option. Sun Microsystems in Japan is constructing a large data center 100 meters underground in old mines, where the rooms remain at 15°C (59°F) and naturally cold groundwater cools the servers. A Swiss company funnels heat out of its data center to warm a swimming pool next door.

Facility managers now have many options in alternative cooling and energy technologies to beat the heat and save on energy costs. As for those who install and maintain such systems, the heat is on to stay current with new technologies and with new requirements in the 2008 National Electrical Code, such as Article 708 on Critical Operations Power Systems.

BINGHAM, a contributing editor for power quality, can be reached at 732.287.3680.