The Internet of Things (IoT) may alter the look of data centers. The IoT is a growing web of connected sensors and devices that puts demand on existing data centers and drives the need for new facilities, both centrally located and “on the edge,” a trendy phrase for the point where sensors are installed. Beyond creating the need for more data center support, however, the IoT also serves as a tool to boost energy efficiency.

Industry analysts have forecast exponential growth in IoT-based systems and related sensors, and more data means more servers and more power, said Eugene Signorini, senior analyst at Enterprise Strategy Group, Milford, Mass. 

Since IoT systems are being installed in a variety of environments, the installing contractor must determine the best approach for the application. Signorini said there is no one-size-fits-all solution. After all, collecting data from a refrigerator in a home requires a different kind of system than, say, measuring conditions at an oil rig.

“The best way to get information from the edge is not always a centralized data center,” Signorini said. “Sometimes, processing and storage has to take place at the edge.”

For example, if a sensor is tracking a pump or valve’s operation, the filtering and analytics of that data often happen right at the facility, so a shutdown mechanism, for instance, can respond locally to a problem.
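The edge-processing pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s implementation; the pump name, pressure threshold and summary fields are all assumptions made for the example. Readings are checked locally so a shutdown decision never waits on a round trip to a central data center, and only a compact summary is forwarded upstream.

```python
# Sketch of edge-side filtering (hypothetical thresholds and names): raw sensor
# readings are evaluated locally so a shutdown can be triggered immediately,
# while only a small summary is sent to the cloud for long-term analytics.
MAX_PUMP_PRESSURE_PSI = 150.0  # assumed safety limit for this example

def process_at_edge(readings_psi):
    """Check readings locally; return (shutdown flag, summary for the cloud)."""
    shutdown = any(p > MAX_PUMP_PRESSURE_PSI for p in readings_psi)
    summary = {
        "count": len(readings_psi),
        "max": max(readings_psi),
        "shutdown_triggered": shutdown,
    }
    return shutdown, summary

shutdown, summary = process_at_edge([120.0, 135.5, 151.2, 148.0])
print(shutdown)  # → True (one reading exceeds the limit)
```

The key design point is that the local response (the shutdown) does not depend on connectivity; the cloud only ever sees the summary.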

Consumer appliances and vehicles are collecting sensor data that is being routed to cloud-based servers.

For these reasons, data centers are going to be more plentiful, with more types and sizes scattered across the United States and around the world.

A soft launch for the IoT?

Despite all the development, however, the IoT is still in an exploratory phase, and it may be experiencing more hype than practical installation. In a few years, though, Signorini expects the scale-up to be significant.

“Companies are beginning to dip their toes into the IoT, and they may discover data volumes are much greater than they expected,” he said.

David Cappuccio, analyst at Stamford, Conn.-based IT research company Gartner, said immediate demand for IoT services in such places as manufacturing facilities is leading to the addition of what he calls microdata centers on the edge, right in a user’s facility and managed by its IT department. The data is collected locally and, therefore, can be acted on immediately.

Another IoT trend putting pressure on data centers is large-scale consumer deployments, such as car manufacturers that will put sensors in all of their cars. Those sensors communicate with cloud-based servers that collect and analyze the data but cannot act on it locally, such as by shutting down a system that is operating unsafely.



The primary challenge today is how to keep pace with the demand from the cloud, IoT and mobile technologies. Michael Bord, marketing manager at data center environmental-sensor company Raritan in Somerset, N.J., finds that more managers are looking for monitoring tools to help manage the increasing complexity of data center infrastructures.

“They want easy-to-deploy-and-use tools that do not add another layer of complexity,” he said. “They are asking us for environmental monitoring as well as monitoring of the power chain, capacity and other infrastructure items.”

Heat and energy concerns

As data analysis increases over the coming years, data centers will need more power, and server heat generation will rise. IoT-based monitoring systems will be needed to manage those higher temperatures.

One of the most significant results of data center growth is heat density, Bord said. It has become more difficult to manage temperatures on a facility level because rack densities and the resulting rack heat vary so widely based on demands created by IoT data.

“As a result, we see hot spots in one zone and cooler spots in another zone,” Bord said.

According to the U.S. Department of Energy (DOE), data centers consume about 2 percent of all energy used in the country. The DOE forecasts that, by 2020, the carbon footprint of data centers will exceed that of the airline industry. Each data center’s energy use divides into actual server operation and ancillary demands, such as cooling. Almost 50 percent of data center energy is consumed by non-IT loads—cooling systems, fans, dehumidification and lighting.
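That 50-percent split maps directly onto the industry’s standard efficiency metric, power-usage effectiveness (PUE), defined as total facility energy divided by IT equipment energy. A rough sketch of the arithmetic, using the figures above:

```python
# Rough PUE arithmetic implied by the figures above (a sketch, not measured data).
# PUE = total facility energy / IT equipment energy; an ideal facility scores 1.0.
total_energy = 100.0   # arbitrary units
non_it_share = 0.50    # ~50% goes to cooling, fans, dehumidification, lighting
it_energy = total_energy * (1 - non_it_share)

pue = total_energy / it_energy
print(pue)  # → 2.0
```

In other words, a facility that spends half its energy on non-IT loads runs at a PUE of about 2.0, which is why cutting cooling and other overhead shows up so directly in the metric.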

In the federal sector alone, agencies lease space from the U.S. General Services Administration (GSA) to operate more than 1,400 data centers, so the DOE is one body concerned with lowering these computing centers’ consumption. 

Improving data center energy performance enables managers to meet federal greenhouse-gas emission-reduction mandates while cutting operating costs and energy use. By doing so, they also better prepare for future growth when more data requires more power and cooling.

Sensing problems

The DOE is looking for low-cost, easy-to-install solutions. This year, it awarded $19 million to fund 18 projects aimed at improving the energy efficiency of data centers and other U.S. buildings. The projects include wireless sensors, in some cases using printed or thin-film technology, that can be peeled and stuck to walls. The sensors collect temperature or other data and transmit it by radio-frequency identification (RFID) to readers in the facility. Prototypes are expected to take several years to develop.

Installing temperature sensors with network connectivity can help IT administrators find those hot spots and ensure all equipment is operating safely. When it is not, early warning enables administrators to boost the cooling before damage occurs.
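A minimal sketch of such a hot-spot check, assuming hypothetical rack names and readings, and using the ASHRAE-recommended 27°C upper limit for server inlet temperature as the alert threshold:

```python
# Flag racks whose inlet temperature exceeds the ASHRAE-recommended upper
# limit (27 °C) so cooling can be boosted early. Rack names and readings
# below are invented for illustration.
ASHRAE_MAX_INLET_C = 27.0

def find_hot_spots(readings):
    """readings: dict of rack id -> inlet temperature in °C.
    Returns the sorted list of racks that need attention."""
    return sorted(rack for rack, t in readings.items() if t > ASHRAE_MAX_INLET_C)

readings = {"rack-01": 24.5, "rack-02": 29.1, "rack-03": 26.8, "rack-04": 31.0}
print(find_hot_spots(readings))  # → ['rack-02', 'rack-04']
```

A real deployment would poll the sensors continuously and feed alerts into the facility’s monitoring or DCIM system, but the core logic is this simple threshold comparison.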


IoT sensors provide a relatively affordable way for data center operators to visualize and implement system changes that lower the overall energy consumption.

Some data center managers have used networked environmental sensors for years to keep their facilities and servers working at an optimum level. Monitoring for temperature, humidity and airflow, these sensors provide information for efficiency improvements and to forecast and prevent damage to equipment, Bord said.

More servers are being added to cabinets, increasing the density. As a result, power and cooling are being challenged.

“[Environmental sensors] make it easy to spot hot pockets due to poor circulation, detect water in the event of a pipe leak, and identify problems with humidity that can lead to electrostatic discharge or corrosion,” Bord said.

In the past few years, environmental-monitoring solutions have improved, helping data centers save energy, boost reliability and uncover additional power capacity.

Raritan sees more customers deploying sensors for measuring temperature, humidity, air pressure, airflow and contact closure (for door, water, smoke), as well as other data collectors, such as intelligent rack power-distribution units.

“What is really exciting is that many of these customers are going to the next level now that they have intelligent monitoring, in the form of DCIM [data center infrastructure management], to analyze environmental and energy information,” Bord said. “And, they are adding more intelligence to the rack, such as asset sensors to track where all the assets are—both IT and facilities assets—and their interdependencies.”

This added intelligence provides real-time views of data center servers, virtual servers, network devices, data connections, power chains, raised floor space and rack elevations. Raritan offers a solution that—when a work order is issued—prompts a light-emitting diode (LED) to start blinking next to a server that needs work.

“Can you imagine how useful this can be in an environment with, say, 100 racks, 15–20 servers per rack, 1,500–2,000 servers, across multiple business unit owners?” Bord asked.

Hard results

The bottom line: efficiency measures implemented from the information a wireless sensor network provided cut one demonstration facility’s cooling load by 48 percent and its total data center power usage by 17 percent. That represented an annual savings of 657 megawatt-hours and an improvement in the data center’s power-usage effectiveness (PUE) from 1.83 to 1.51.
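Those figures hang together. Assuming the IT load stayed roughly constant through the retrofit (an assumption, not stated in the source), the reported PUE change follows almost exactly from the 17 percent total-power reduction:

```python
# Consistency check on the demonstration figures (IT load assumed constant).
pue_before = 1.83
total_power_reduction = 0.17

# With IT power unchanged, PUE scales directly with total facility power.
pue_after = pue_before * (1 - total_power_reduction)
print(round(pue_after, 2))  # → 1.52, close to the reported 1.51

# The 657 MWh annual savings then imply roughly this much use beforehand:
annual_savings_mwh = 657
annual_use_before_mwh = annual_savings_mwh / total_power_reduction
print(round(annual_use_before_mwh))  # → 3865 MWh/year
```

That 3,865 MWh/year works out to an average draw of about 440 kilowatts, a mid-sized facility, which gives a sense of the scale at which a 17 percent cut becomes real money.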

By incorporating smart technologies into the data center, facility managers will be able to track the real-time status of components and environmental measurements to keep operations flowing smoothly. Sensors that measure temperature, humidity and electricity will be combined with network-equipment monitoring to help data centers maintain their “uptime” and reduce capital and operational expenditures. Data center operators can then have more platforms available to them, including those that integrate data from different sources to keep their computing facilities functioning at optimum capacity.

As potential installers or integrators, contractors should be aware of the kind of infrastructure they will need to install.

“The decisions you make about infrastructure today will have a ripple effect,” Bord said. “You want to make sure that you’re delivering the right amount of power and cooling today and five years down the road. So, choose an approach that balances reliability and scalability.”