AI Magazine September 2025 | Page 26

THE AI INTERVIEW
CREDIT: LIQUIDSTACK

Behind every ChatGPT query, every AI-generated image and every machine learning (ML) model lies a fundamental physical reality: heat.

As AI quickly becomes the beating heart of global business operations, the infrastructure supporting it faces a thermal crisis that threatens to hold back AI’s development.
The numbers are staggering. OpenAI’s GPT-4 training reportedly consumed enough electricity to power a small city for weeks. Microsoft’s AI operations have pushed the company’s carbon emissions up by nearly 30% since 2020. Meanwhile, individual AI chips now consume 2,000 watts – more than a household microwave running continuously – with 5,000-watt processors already in development.
Traditional air cooling, the mainstay of data centres for decades, simply cannot cope.
This infrastructure reckoning has catapulted liquid cooling from a niche technology to a necessity.
At the centre of this change sits LiquidStack, a company that CEO Joe Capes describes as having found itself “in the