Artificial intelligence (A.I.) is all the rage and seems to come up in every other business conversation. What’s not always highlighted, however, is the notable drain it places on power grids.
A.I. drives energy costs
IBM explains that A.I. is technology that enables computers to simulate human learning, problem-solving, decision-making, creativity and autonomy. Generative A.I. refers to deep-learning models that can generate high-quality text, images and other content based on the data they were trained on.
According to Wired Magazine, generative A.I. is now deeply embedded in the online experience, with tools such as ChatGPT spurring its widespread adoption across platforms. However, the rise of A.I. tools powered by large language models comes with significant environmental costs.
A.I. systems require vast amounts of computational power, making them far more resource-intensive than traditional technologies such as search engines. These demands significantly increase the electricity and water that data centers consume.
Google and Microsoft have already struggled to meet their sustainability goals because of A.I.’s growing demands. While data centers’ energy use is still small compared with sectors such as transportation, the trend is concerning. The added strain on natural resources is also a critical issue, with data centers evaporating significant amounts of water into the atmosphere. These environmental impacts are likely to grow as A.I. adoption continues to expand.
A doubling of energy consumption
There is also reason to believe that energy consumption and demand from A.I. alone could double over the next few years. According to an article in Vox, the International Energy Agency (IEA) has, for the first time, projected energy consumption from data centers, cryptocurrency and A.I., estimating that together they accounted for nearly 2% of global electricity use in 2022. That demand is expected to double by 2026, roughly equaling the electricity consumption of all of Japan, and Goldman Sachs estimates it will grow 160% by 2030.
Generative A.I. is particularly energy-intensive. Training a large language model such as GPT-3 uses roughly as much electricity as 130 U.S. homes consume in a year, and a single ChatGPT query consumes about 2.9 watt-hours, nearly 10 times as much as a Google search, according to reporting by Goldman Sachs.
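To put those figures in perspective, the quick back-of-envelope calculation below works through the numbers cited above. The roughly 25,000 terawatt-hours assumed for global electricity consumption, the 0.3 watt-hour figure for a conventional search and the 1-billion-queries-a-day volume are illustrative assumptions, not figures from the article or its sources.

```python
# Back-of-envelope arithmetic for the figures cited above.
# Assumptions (not from the article): ~25,000 TWh of global electricity
# consumption, ~0.3 Wh per conventional search, 1 billion queries per day.

GLOBAL_ELECTRICITY_TWH = 25_000   # assumed annual global electricity consumption
DATA_CENTER_SHARE_2022 = 0.02     # "nearly 2%" share reported by the IEA for 2022

twh_2022 = GLOBAL_ELECTRICITY_TWH * DATA_CENTER_SHARE_2022
twh_2026 = twh_2022 * 2           # the IEA expects this demand to double by 2026
print(f"2022: ~{twh_2022:.0f} TWh; 2026: ~{twh_2026:.0f} TWh, "
      "roughly the scale of Japan's annual electricity use")

CHATGPT_WH_PER_QUERY = 2.9        # Goldman Sachs figure cited above
SEARCH_WH_PER_QUERY = 0.3         # assumed figure for a conventional search
ratio = CHATGPT_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"A ChatGPT query uses about {ratio:.0f} times the electricity of a search")

QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume
annual_twh = CHATGPT_WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e12
print(f"At that volume, chat queries alone would use ~{annual_twh:.1f} TWh a year")
```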
Balancing the benefits of digital technologies with their environmental footprint is crucial. The IEA emphasizes the need for greener A.I. and greater transparency in energy use. Consumers must also be aware of their data’s real-world costs, recognizing the environmental trade-offs in the growing digital landscape.
Can A.I. help solve these problems?
To address these challenges, several companies are taking proactive measures to improve A.I.’s energy efficiency, and many tech firms are using A.I. itself to help resolve the problems it created. One solution is optimizing data center operations.
Some, for example, are adopting A.I.-driven energy management systems that optimize cooling, power distribution and overall energy use in data centers. Google uses DeepMind’s A.I. to cut the energy its data centers use for cooling by up to 40%.
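The sketch below illustrates the general idea behind that kind of control loop; it is not Google’s or DeepMind’s actual system, and all of the names, numbers and the toy thermal model are hypothetical. A simple predictor estimates how much heat a given server load produces, and the controller picks the lowest-power cooling setting that keeps the room under a temperature limit.

```python
# Toy illustration of A.I.-assisted cooling control -- not any vendor's real
# system. A stand-in predictor estimates the heat produced by the server load,
# and the controller picks the cheapest cooling level that keeps the room
# below a safety limit.

TEMP_LIMIT_C = 27.0   # assumed maximum allowed room temperature
AMBIENT_C = 18.0      # assumed room temperature with no server load

# Hypothetical cooling settings: (power draw in kW, degrees C of cooling provided)
COOLING_LEVELS = [(50.0, 2.0), (120.0, 5.0), (250.0, 9.0), (400.0, 14.0)]


def predicted_temp_rise(server_load_kw: float) -> float:
    """Stand-in for a learned model: a toy linear estimate of temperature rise."""
    return 0.01 * server_load_kw


def choose_cooling(server_load_kw: float) -> tuple[float, float]:
    """Return (cooling power in kW, resulting room temperature in C)."""
    rise = predicted_temp_rise(server_load_kw)
    for power_kw, cooling_c in COOLING_LEVELS:
        if AMBIENT_C + rise - cooling_c <= TEMP_LIMIT_C:
            return power_kw, AMBIENT_C + rise - cooling_c
    # Fall back to maximum cooling if no level satisfies the limit.
    power_kw, cooling_c = COOLING_LEVELS[-1]
    return power_kw, AMBIENT_C + rise - cooling_c


if __name__ == "__main__":
    for load in (500, 1200, 2000):  # hypothetical server loads in kW
        power, temp = choose_cooling(load)
        print(f"Load {load} kW -> cooling draws {power:.0f} kW, room at {temp:.1f} C")
```

In a real deployment, the toy linear estimate would be replaced by a model trained on years of sensor data, but the principle is the same: predict the load, then choose the least energy-hungry setting that stays within safe limits.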
Additionally, NVIDIA and Intel are finding hardware solutions by designing energy-efficient A.I. chips. These chips, built with advanced architectures, reduce power consumption while maintaining high computational performance. By focusing on hardware efficiency, firms aim to reduce the energy footprint of A.I. training and inference.
Another approach involves shifting the energy supply for data centers and A.I. computing toward renewable sources. Tech giants such as Microsoft, Amazon and Google have committed to powering their data centers with 100% renewable energy, reducing their reliance on fossil fuels.
In parallel, there is growing interest in sustainable data center designs, such as liquid cooling systems, and in the strategic placement of data centers in regions with abundant renewable energy, such as Iceland.
As A.I.-driven energy consumption surges, companies are focusing on improving energy efficiency through advanced hardware, sustainable data center practices and A.I. optimizations. These efforts are crucial to ensuring that A.I.’s benefits can be realized without exacerbating the global energy crisis.
About The Author
JIM ROMEO is a freelance writer based in Chesapeake, Va. He focuses on business and technology topics. Find him at www.JimRomeo.net.