“Countries need to accelerate investment in power generation and improve efficiency to keep this growth sustainable.”
International Energy Agency
AI-specific workloads require enormous computing power that pushes data centre infrastructure to its limits. In some regions, grid operators are already warning about the pace and scale of demand from AI facilities.
There is also a real environmental cost to AI. Data centres consume vast amounts of water for cooling. In 2025, North American data centres used nearly 1 trillion litres of water, while global AI operations are producing emissions comparable to major cities. Some studies even suggest that clusters of large data centres generate so much heat that they are raising temperatures in surrounding areas.
Critics argue this is an unsustainable trajectory because, at a certain point, AI expansion becomes less about intelligence and more about brute-force scaling: more chips, more energy, more water and more money. Today’s systems often rely on sheer computational power rather than efficiency, raising concerns over AI’s long-term feasibility.
AI infrastructure is no longer just a tech issue — it is a power grid issue. Large-scale training workloads create sudden spikes in electricity demand, requiring major upgrades to national energy systems and long-term planning.
However, there is reason for optimism: efficiency is improving. New approaches promise to cut AI energy consumption dramatically, while innovations in cooling, chip design and renewable energy are already being deployed. AI itself may even help optimise energy systems and logistics in other sectors, offsetting some of its own footprint.
The reality is that AI is not limitless; it has real physical constraints. But constraints can also drive innovation. The challenge now is not whether AI can scale, but whether it can do so sustainably and efficiently.