Maintenance and renovation services don’t get much more complex, challenging or fluid than in data centers. As cloud-based computing becomes pervasive, data centers are multiplying, the data they collect and manage is growing, and security needs are becoming more critical.
The concerns that make data center maintenance complicated also make it lucrative for electrical contractors. However, this complexity warrants education and training.
“This is a profession, and [contractors] have to keep up with literature, go to the meetings where these subjects are discussed,” said Robert McFarlane, data center planning specialist and principal at equipment and integrated systems planning company Shen Milsom & Wilke, New York.
He pointed to groups such as the Seattle-based Uptime Institute, an advisory organization for IT services standards, which offers the most recognized guidelines for data center reliability.
Volumes of information about data center design and management are available through the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) and its Technical Committee (TC) 9.9 for mission critical facilities, along with Chapter 19 of the ASHRAE handbook.
“The problem is they are written to be usable for everyone from novice data center managers to engineers,” McFarlane said.
This means readers have to wade through the details. Electronic versions have color illustrations, which can be more useful than the printed black-and-white editions.
As data center operators attempt to use existing space more efficiently, contractors are well served to acquaint themselves with a few buzzwords they’ll hear more often, McFarlane said. One such pair of terms is “hot aisle” versus “cold aisle” containment.
Containment refers to the physical barriers that keep supply or exhaust air confined within an aisle rather than mixing across the room. Cold-aisle containment typically features doors on the ends and a partition or roof over the cold aisle.
Hot-aisle containment includes doors on the ends of the hot aisle and a configuration of baffles and duct work from the hot aisle to the returns of the cooling units.
A 2014 survey by the Uptime Institute found that 80 percent of data centers have implemented hot-aisle or cold-aisle containment or both. However, there is no single answer to the question of whether it is better to contain the hot air in the hot aisle or the cold air in the cold aisle. At least one study, conducted by Intel and T-Systems in 2011, found no significant benefit of one over the other. In either case, containment reduces the mixing of hot and cold air, so cooling capacity is directed only to the areas that need it.
Close-coupled cooling aims to bring heat transfer near its source: the equipment rack. Moving the air conditioner closer to the equipment rack ensures a more precise delivery of inlet air and a more immediate capture of exhaust air.
Data center infrastructure management (DCIM) is another increasingly common term. It refers to the monitoring and measurement of a data center’s operation and efficiency. Agencies and data center operators use DCIM measurements to determine the renovations and maintenance needed to ensure a center meets required efficiency levels.
Altogether, the DCIM measurements and recommendations will provide a contractor with a good understanding of the requirements for cooling, lighting, security and fire control in a data center and how they can ensure the center meets the required limits. For the data center manager, efficiency is a matter of meeting requirements and besting them when possible.
Typically, computer equipment uses 67 percent of the incoming power; the rest goes to supplemental services such as cooling, fire safety and lighting. Data center operators primarily want to reduce the power demands of those supplemental services, and that’s the challenge for contractors.
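The split described above maps directly onto power usage effectiveness (PUE), the efficiency metric most DCIM tools report: total facility power divided by IT equipment power. A minimal sketch of the arithmetic, using the 67 percent figure from the article (the function name and values are illustrative):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT load.

    A PUE of 1.0 would mean every watt goes to computing; higher
    values mean more power spent on cooling, lighting and the like.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# If IT gear draws 67 kW of every 100 kW entering the facility:
print(round(pue(100.0, 67.0), 2))  # 1.49
```

Driving PUE toward 1.0 is exactly the contractor’s opportunity: the closer supplemental loads get to zero, the closer the ratio gets to 1.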
“[Data centers have] a huge amount of heat to get rid of,” McFarlane said. “But you can’t just turn up the air conditioner.”
Air distribution must be managed, and close-coupled cooling must be arranged appropriately.
Contractors should also know about the sensors that track temperature, humidity and other conditions. Some contractors have begun offering their own monitoring services or installing systems from third parties.
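The job of such sensors is usually to flag rack-inlet conditions that drift outside the envelope ASHRAE TC 9.9 recommends (roughly 18–27°C dry-bulb for most IT equipment classes). A minimal sketch of that check, with hypothetical sensor IDs and readings:

```python
# Recommended rack-inlet dry-bulb envelope per ASHRAE TC 9.9 guidance,
# in degrees Celsius (the exact band varies by equipment class).
RECOMMENDED_C = (18.0, 27.0)

def out_of_range(readings: dict, low: float = RECOMMENDED_C[0],
                 high: float = RECOMMENDED_C[1]) -> list:
    """Return the IDs of sensors whose reading falls outside the envelope."""
    return [sensor_id for sensor_id, temp_c in readings.items()
            if not low <= temp_c <= high]

# Illustrative readings from two rack-inlet sensors:
alarms = out_of_range({"rack-a1-inlet": 22.5, "rack-b3-inlet": 29.1})
print(alarms)  # ['rack-b3-inlet']
```

A real monitoring service layers alerting, trending and humidity checks on top of this core comparison, but the threshold test itself is this simple.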
Contractors offering service agreements have plenty of company. Data center technicians are starting their own businesses, as are system integrators and those in heating, ventilating and air conditioning. Those who want to be most successful may have to do more than just specialize.
Facility Gateway, Madison, Wis., launched eight years ago to serve the data center market. It offers maintenance and service, its own data center for use as needed by customers, and design/build services for renovations and new construction. To meet the needs of an industry under pressure, it has had to offer full solutions: ensuring 100 percent uptime, data security and energy efficiency.
Ross Hammer, Facility Gateway sales vice president, said each service contract may be different, and his company has opted to offer different solution packages based on the kind of data center and its specific needs. Some data centers have a clear understanding of the equipment and its needs, and they play a key role in helping a service provider put together a maintenance package. Others may come to a service provider with aging equipment, demand on their electrical system and a request for recommendations. That’s why companies such as Facility Gateway must be prepared to step in with a solution the data center owner may not have been previously aware of.
Either way, “the heart of Facility Gateway is providing maintenance,” Hammer said. “We’re a service provider.”
When the company provides design/build services, it is able to build in the maintenance system that would be of use to the customer.
“The attention that’s paid to security and uptime has become so important because of the value of data,” Hammer said. “Maintenance is no longer a luxury; it’s a need.”
In the past, it may have been acceptable to use a Band-Aid approach to problems. Today, having an uptime guarantee of 99.9 percent isn’t good enough. No data center can afford to lose power or data at any time.
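The “nines” in an uptime guarantee translate directly into allowed downtime, which is why 99.9 percent falls short for a modern data center. A quick worked calculation (the function name is illustrative):

```python
HOURS_PER_YEAR = 365 * 24  # 8,760

def annual_downtime_hours(uptime_pct: float) -> float:
    """Hours of downtime per year permitted by a given uptime percentage."""
    return (1 - uptime_pct / 100) * HOURS_PER_YEAR

print(round(annual_downtime_hours(99.9), 2))    # 8.76  -> almost 9 hours down
print(round(annual_downtime_hours(99.999), 2))  # 0.09  -> about 5 minutes
```

Three nines allows nearly nine hours of outage a year; five nines allows about five minutes, which is closer to what today’s customers expect.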
The nature of data centers is changing as cloud-based solutions become ubiquitous. Today, mega-sized data centers are being replaced by multiple smaller data centers set up in proximity to the clients they serve. These operators can then turn to companies such as Facility Gateway, which has its own network operations center (NOC) that monitors electrical output, power quality, temperature and other sensor-based data. Facility Gateway initially worked with its customers’ existing monitoring systems, but, about two years ago, it began partnering with another provider so that it could offer its own solution to those who needed one.
Already, most skilled data center designers are considering equipment maintenance during the design phase. The easier maintenance can be accomplished, the more room there is for growth, renovations and assurance that data center operations remain reliable.
Designing the data center’s electrical systems to achieve maintainability means creating an arrangement where any piece of equipment or system that supplies the computers can be taken offline for maintenance purposes while the load continues to operate.
In the meantime, data centers are moving from 1G and 10G models to 25G optics or more, gaining smaller module form factors, higher port density, lower power consumption and lower cost per bit. The upgrade also boosts performance while leveraging existing fiber infrastructure, according to Sunnyvale, Calif.-based Finisar Corp.
About three years ago, ASHRAE published its 90.1 rule, which standardized the installation of cooling towers for data center air conditioning systems. While cooling towers aren’t always practical, they make running a data center more economical. McFarlane said the resulting discussion led to the 90.4 rule, being published this summer, which allows data centers to use a variety of technologies to accomplish the required efficiency, not just cooling towers.
Technology continues to evolve. For instance, fire safety systems offer dry fire retardants that won’t damage servers. APC by Schneider Electric has a lithium-ion battery for data centers. Uninterruptible power supplies are modular to provide redundancy when needed.
“[As standards change], contractors need to know what the parameters are and how to adjust to meet them,” McFarlane said. “Right now, there are tremendous opportunities for maintenance and renovation. However, you’ve got to keep up if you’re going to work in this field.”