The rise of artificial intelligence is forcing a radical redesign of data centers into specialized “AI factories” that consume massive amounts of electricity in very small spaces. According to the authors of the newly released report, these facilities require far more intensive cooling systems and specialized power equipment to handle extremely fast and large swings in energy demand that occur during complex AI tasks.
The report explains: “Two trends make AI factories different from traditional data centres. First, they are becoming increasingly power-dense, meaning they have higher power demand per unit of space. Second, they tend to house much more variable electrical loads.” It adds that an individual server rack using upcoming technology could have a “peak power draw equivalent to that of around 65 households.”
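To make the household comparison concrete, here is a back-of-the-envelope check using illustrative figures; the rack and household wattages below are assumptions for the sketch, not numbers taken from the report:

```python
# Illustrative check of the "one rack ~ 65 households" comparison.
# Both wattages are assumed values for illustration only.
RACK_PEAK_KW = 80.0       # assumed peak draw of a next-generation AI server rack
HOUSEHOLD_AVG_KW = 1.2    # assumed average electrical draw of one household

equivalent_households = RACK_PEAK_KW / HOUSEHOLD_AVG_KW
print(f"One rack at peak ~ {equivalent_households:.0f} households")
```

Under these assumed values the ratio lands in the mid-60s, the same order of magnitude as the report's figure; the real comparison depends on which rack generation and which country's household consumption you plug in.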
In simple terms, AI computers are being packed much tighter together than standard internet servers, generating a level of heat that requires liquid cooling rather than just air fans. Because AI models work in bursts—switching rapidly between heavy calculations and sharing data—the amount of power they need can spike or drop in a fraction of a second. This rapid change in demand is so extreme that it can strain electrical equipment and threaten the stability of local power grids, prompting designers to install large onsite batteries and high-tech electrical converters to act as a buffer.
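The buffering idea described above can be sketched in a few lines: the grid supplies power only up to a fixed cap, while a battery discharges to cover compute-phase spikes and recharges during communication lulls. This is a conceptual illustration with assumed numbers, not a model of any real facility's power electronics:

```python
# Conceptual sketch: an onsite battery smooths a bursty AI load so the
# grid sees a steady draw. All values are illustrative assumptions.
GRID_CAP_KW = 60.0  # assumed maximum draw allowed from the grid

def split_load(load_kw, grid_cap_kw=GRID_CAP_KW):
    """Return (grid_kw, battery_kw) for one instant of load.
    battery_kw > 0 means discharging; battery_kw < 0 means charging."""
    grid_kw = min(load_kw, grid_cap_kw)
    battery_kw = load_kw - grid_kw  # battery covers any spike above the cap
    if load_kw < grid_cap_kw:
        # During a lull, spare grid headroom recharges the battery.
        grid_kw = grid_cap_kw
        battery_kw = load_kw - grid_cap_kw
    return grid_kw, battery_kw

# Bursty profile: heavy compute (100 kW) alternating with data-sharing lulls (20 kW).
for load in [100, 20, 100, 20, 100, 20]:
    grid, battery = split_load(load)
    assert grid <= GRID_CAP_KW  # the grid never sees the spike
    print(f"load={load:>3} kW  grid={grid:.0f} kW  battery={battery:+.0f} kW")
```

With this profile the grid draw is a flat 60 kW while the battery swings between charging and discharging, which is the smoothing role the report attributes to onsite storage and power-conversion equipment.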
The report “Key Questions on Energy and AI” was published in April 2026 by the International Energy Agency in Paris, France. Part of the World Energy Outlook Special Report series, the analysis was prepared by a team led by Thomas Spencer and Siddharth Singh under the direction of Laura Cozzi. The publication provides a comprehensive assessment of the rapidly evolving intersection between artificial intelligence, data center power demands, and global energy markets.