AI and the future of data center cooling

Bill Gates once said that AI can solve its own energy consumption problem. In data centers, where energy use is surging, this is not a philosophical debate; it's a numbers game. And the numbers are getting urgent.
BY MAURYCY SZWAJKAJZER, CEO OF SZE
Today, data centers consume 1–2% of global electricity, making them the fifth-largest consumer worldwide, just behind transportation. By 2030, that share is expected to rise to 3–4%. Cooling alone accounts for around 0.7 kW for every 1 kW used by IT equipment. This inefficiency is what I have dedicated my professional life to solving, and it is where AI is poised to make a transformative impact.
WHY COOLING IS A GROWING PROBLEM
According to estimates by Big 4 consultancies, data centers today consume around 536 TWh per year. A global improvement in cooling efficiency — reducing the so-called Power Usage Effectiveness (PUE) — could save up to 157 TWh annually, roughly equivalent to Poland’s entire annual electricity consumption.
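As a back-of-the-envelope check: the 536 TWh total and the 0.7 kW of cooling per 1 kW of IT load come from the figures above, while the target PUE in the sketch below is my own illustrative assumption about what optimized cooling could achieve on average.

```python
# Back-of-the-envelope check of the savings figure. The 536 TWh total and the
# 0.7 kW of cooling per 1 kW of IT load come from the article; the target PUE
# is an assumption, not a figure from the consultancy estimates.

total_twh = 536.0                  # estimated annual data-center consumption
pue_current = 1.0 + 0.7            # 1 kW IT + 0.7 kW cooling -> PUE of 1.7
pue_target = 1.2                   # assumed fleet-wide average after optimization

it_twh = total_twh / pue_current   # energy that actually reaches IT equipment
optimized_total_twh = it_twh * pue_target
savings_twh = total_twh - optimized_total_twh

print(f"IT load: {it_twh:.0f} TWh")
print(f"Potential savings: {savings_twh:.0f} TWh per year")
# -> about 158 TWh, in line with the ~157 TWh estimate above
```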
Cooling a data center seems simple in theory: every 1 kW consumed must be removed as heat. In practice, however, it's messy. Two approaches dominate: air cooling, still the norm in most server rooms, and liquid cooling, including direct-to-chip systems.
Despite technical advances, even modern data centers often suffer from uneven cooling, with hot spots, cold spots, and inefficient airflow. Standard PID (Proportional-Integral-Derivative) control algorithms cannot respond dynamically enough to real-world variations. As a result, data centers are often overcooled “just in case,” wasting vast amounts of energy.
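To see why, here is a deliberately minimal sketch of a PID loop of the kind described above, chasing a single fixed setpoint from a single sensor (illustrative code, not any vendor's controller):

```python
# Deliberately minimal PID loop of the kind used to control a cooling unit,
# chasing one fixed setpoint from one sensor. Illustrative only; real
# controllers add output limits, anti-windup and careful tuning.

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_temp, dt):
        # Positive error means the room is hotter than the setpoint,
        # so the output (e.g. fan speed demand) increases.
        error = measured_temp - self.setpoint
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=24.0)   # one setpoint for the whole room
print(pid.update(measured_temp=26.5, dt=1.0))      # reacts only to this one reading
```

A controller like this only ever "sees" the single measurement it is given; a hot spot three racks away simply does not exist for it.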
WHY TRADITIONAL SYSTEMS FALL SHORT
Several major problems plague traditional cooling systems, from the limits of fixed-setpoint PID control to the hot spots, cold spots, and blanket overcooling described above. All of this results in cooling systems that are less about precision and more about guesswork.
WHERE AI MAKES THE DIFFERENCE
The solution lies in intelligent, real-time control. By installing temperature sensors throughout the server room, we can build a full 3D thermal map. AI algorithms then learn how airflow and fan settings affect this environment.
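A simplified sketch of that first step is shown below, using invented sensor positions and simple inverse-distance weighting to estimate the temperature at arbitrary points in the room. A real deployment would use far more sensors and a learned model rather than this basic interpolation.

```python
# Build a coarse 3D thermal map from point sensors using inverse-distance
# weighting. Sensor positions (x, y, z in metres) and readings are invented
# purely for illustration.
import math

sensors = [
    ((0.0, 0.0, 0.5), 22.1),   # cold aisle, bottom of rack
    ((0.0, 0.0, 2.0), 24.8),   # cold aisle, top of rack
    ((3.0, 0.0, 0.5), 27.5),   # hot aisle, bottom
    ((3.0, 0.0, 2.0), 31.2),   # hot aisle, top (typical hot spot)
]

def estimate_temp(point, readings, power=2.0):
    """Inverse-distance-weighted temperature estimate at an arbitrary point."""
    num, den = 0.0, 0.0
    for pos, temp in readings:
        dist = math.dist(point, pos)
        if dist < 1e-6:
            return temp            # exactly on a sensor
        weight = 1.0 / dist ** power
        num += weight * temp
        den += weight
    return num / den

# Sample the space on a coarse grid to form the thermal map
grid = [(x, 0.0, z) for x in (0.0, 1.5, 3.0) for z in (0.5, 1.25, 2.0)]
thermal_map = {p: estimate_temp(p, sensors) for p in grid}
for point, temp in thermal_map.items():
    print(point, f"{temp:.1f} °C")
```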
Through a process of continuous data collection and supervised machine learning — always within strict safety parameters — the AI can optimize airflow, fan speeds, and temperature settings dynamically. In practice, the system acts before problems arise, maintaining safe operating conditions with far lower energy use.
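One optimization step might look like the sketch below, where predict_peak_temp() is a stand-in for the trained model and the limits and numbers are placeholders rather than real facility data or SZE's actual system.

```python
# Sketch of a constrained optimization step: pick the lowest fan speed whose
# predicted hottest-point temperature stays below a hard safety limit.
# predict_peak_temp() stands in for a trained supervised model.

SAFETY_LIMIT_C = 30.0          # never exceed this predicted hot-spot temperature
MIN_FAN, MAX_FAN = 0.3, 1.0    # allowed fan-speed range (fraction of maximum)

def predict_peak_temp(fan_speed, it_load_kw):
    """Placeholder for the learned model: hotter with load, cooler with airflow."""
    return 20.0 + 0.08 * it_load_kw - 12.0 * fan_speed

def choose_fan_speed(it_load_kw, step=0.05):
    """Lowest fan speed that the model predicts is still safe."""
    speed = MIN_FAN
    while speed <= MAX_FAN:
        if predict_peak_temp(speed, it_load_kw) <= SAFETY_LIMIT_C:
            return speed
        speed += step
    return MAX_FAN                # fall back to full cooling if nothing is safe

print(choose_fan_speed(it_load_kw=250.0))   # higher load -> higher minimum safe speed
```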
Crucially, the AI doesn’t need to “understand” thermodynamics. It identifies and exploits patterns. However, proper oversight is critical — just as you wouldn’t trust an AI-generated image to always get human anatomy right, you don’t want AI independently improvising with server cooling.
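In practice, that oversight can be as blunt as an independent guard layer that vets every proposed command against hard limits before it reaches the hardware. A minimal sketch, with illustrative limits:

```python
# Sketch of an independent guard layer: every AI-proposed command is clamped
# to hard limits before it reaches the cooling hardware. The limits shown are
# illustrative placeholders.

HARD_LIMITS = {
    "fan_speed": (0.3, 1.0),             # never below 30% airflow, never above max
    "supply_air_temp_c": (18.0, 27.0),
}

def vet_command(name, proposed):
    """Clamp an AI-proposed actuator command into its allowed range."""
    lo, hi = HARD_LIMITS[name]
    return min(max(proposed, lo), hi)

# An over-eager proposal to nearly stop the fans gets clamped, not executed.
print(vet_command("fan_speed", 0.05))    # -> 0.3
```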
Once trained, the system can operate autonomously, adjusting in real time based on changing server loads, external temperatures, or even energy prices through integration with day-ahead energy markets and renewable power sources.
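One simple way such price awareness could work, sketched with invented prices and an assumed safe temperature band, is to map the day-ahead price onto the allowed setpoint range, pre-cooling slightly when electricity is cheap:

```python
# Sketch of price-aware setpoint scheduling: pre-cool slightly when the
# day-ahead price is low, relax toward the upper safe limit when it is high.
# Prices and temperature bounds are invented for illustration.

SAFE_MIN_C, SAFE_MAX_C = 22.0, 27.0     # allowed supply-air temperature band

day_ahead_prices = [45.0, 38.0, 31.0, 52.0, 96.0, 110.0, 71.0]  # EUR/MWh, hourly

def setpoint_for_price(price, prices=day_ahead_prices):
    """Map the current price onto the safe temperature band (cheap -> cooler)."""
    lo, hi = min(prices), max(prices)
    fraction = (price - lo) / (hi - lo) if hi > lo else 0.5
    return SAFE_MIN_C + fraction * (SAFE_MAX_C - SAFE_MIN_C)

for hour, price in enumerate(day_ahead_prices):
    print(f"hour {hour}: {price:5.1f} EUR/MWh -> setpoint {setpoint_for_price(price):.1f} °C")
```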
THE IMPACT IN REAL NUMBERS
In real-world deployments, AI-based control can cut cooling energy use by 35–40% compared to traditional PID control. PUE can drop from around 1.7 to 1.42, without changing the underlying hardware.
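Treating everything above a PUE of 1.0 as cooling overhead, a simplification on my part, those two figures line up:

```python
# Quick consistency check of the figures above. A PUE drop from 1.7 to 1.42
# means the cooling overhead per kW of IT load falls from 0.70 to 0.42 kW.

pue_before, pue_after = 1.70, 1.42

overhead_before = pue_before - 1.0      # kW of cooling/overhead per kW of IT
overhead_after = pue_after - 1.0

cooling_cut = (overhead_before - overhead_after) / overhead_before
total_cut = (pue_before - pue_after) / pue_before

print(f"cooling energy reduced by {cooling_cut:.0%}")       # ~40% of the cooling share
print(f"total facility energy reduced by {total_cut:.0%}")  # ~16% overall
```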
Other cooling methods, such as direct-to-chip systems, can also benefit from AI optimization. However, data collection for these systems must be adapted, since their thermal behavior differs from traditional air-cooled setups. In fact, many of the servers that run AI models already rely on direct-to-chip cooling as standard practice, precisely because of their extreme heat density.
WHAT’S NEXT?
The biggest challenge isn’t the technology — it’s bridging the gap between the digital world of AI and the analog reality of physical systems. Reliable, systematic data collection and secure communication with cooling infrastructure are critical.
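Even something as mundane as what a single sensor reading must carry (timestamp, location, value, a plausibility check) deserves care before it is allowed anywhere near a model. A minimal sketch, with hypothetical field names:

```python
# Minimal sketch of a validated sensor reading as it might be logged for
# training and control. Field names and limits are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class SensorReading:
    sensor_id: str
    position: tuple          # (x, y, z) in metres within the room
    temperature_c: float
    timestamp: datetime

    def is_plausible(self) -> bool:
        """Reject obviously broken readings before they reach the model."""
        return -10.0 <= self.temperature_c <= 70.0

reading = SensorReading(
    sensor_id="rack12-top",
    position=(3.0, 0.0, 2.0),
    temperature_c=31.2,
    timestamp=datetime.now(timezone.utc),
)
print(reading.is_plausible())
```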
Still, the direction is clear: with careful engineering and disciplined oversight, AI can significantly reduce the energy footprint of the very systems it helps power. Bill Gates was right. AI isn’t just reshaping data centers — it’s solving its own energy consumption problem.