Setting the Record Straight: AI Data Centres and Water Use

By Alistair Barnes, Head of Mechanical Engineering at Colt DCS.

Monday, 30th March 2026 | Posted by Phil Alsop

Artificial intelligence (AI) continues to transform industry at an unprecedented pace, accelerating creative breakthroughs, automating previously time-consuming manual tasks, and redefining how we approach complex problem-solving.  

However, the growth of AI remains reliant on a critical enabler: data centres. These facilities are the beating heart of AI, housing the High-Performance Computing (HPC) systems that train and run generative AI models. Unlike traditional data centres, those designed for AI platforms must handle enormous computational intensity, which translates into significantly higher energy consumption and heat generation.  

To keep these systems running efficiently, cooling becomes a top priority. Some traditional cooling methods are resource-intensive because they rely on continuous water draw and evaporation to dissipate heat. This has fuelled a media narrative that portrays data centres as wasteful water guzzlers that negatively impact the environment. The narrative resonates because water scarcity is a global issue, and any technology perceived as resource-heavy faces scrutiny. Yet the reality is more nuanced. While older data centre designs did have inefficiencies, modern facilities are evolving rapidly to address these concerns.

The role of water in data centre cooling 

Cooling is essential to prevent overheating and maintain optimal performance in data centres. In the past, facilities relied almost entirely on air-based cooling systems, such as CRAC (Computer Room Air Conditioning) units, which use chilled air to remove heat from servers. While these approaches were adequate for low-to-moderate rack densities, on their own they struggle to meet the demands of modern AI workloads, where rack densities have increased significantly due to the intensive power requirements of Graphics Processing Units (GPUs) and HPC processors.

As AI adoption accelerates, data centre operators are increasingly moving toward liquid cooling in new, AI-ready facilities. Liquid cooling systems transport cooled water or specialised coolant directly to high-heat components such as GPUs, allowing heat to be removed at the source. This method is more efficient because liquid transfers heat far more effectively than air.
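The scale of that efficiency gap can be sketched with a back-of-the-envelope comparison of volumetric heat capacity, using textbook property values for air and water at room temperature (illustrative figures, not data from any specific facility):

```python
# Rough comparison of heat carried per unit volume of coolant per degree
# of temperature rise. Property values are standard textbook figures at ~20 C.

air_density = 1.2      # kg/m^3
air_cp = 1005.0        # specific heat of air, J/(kg*K)
water_density = 998.0  # kg/m^3
water_cp = 4186.0      # specific heat of water, J/(kg*K)

# Volumetric heat capacity = density * specific heat, in J/(m^3*K)
air_vol_heat = air_density * air_cp
water_vol_heat = water_density * water_cp

ratio = water_vol_heat / air_vol_heat
print(f"Water carries roughly {ratio:.0f}x more heat per unit volume than air")
```

The result, on the order of a few thousand times, is why moving a modest flow of liquid to a cold plate can outperform moving very large volumes of chilled air through a room.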

By improving heat removal and enabling higher-density deployments, liquid cooling reduces the risk of hardware failure and enhances overall system reliability. Extending hardware lifespan also helps reduce the long-term environmental impact associated with replacing IT equipment, especially as global e-waste generation is projected to reach 82 million tonnes by 2030.

Closed-loop cooling systems and hybrid approaches 

Newly designed AI data centres offer reasons for optimism, with advanced cooling and energy efficiency principles now built into their core architecture. Innovations such as closed-loop liquid cooling systems build on traditional designs by circulating coolant in a sealed loop, transferring heat without relying on evaporation. This approach dramatically reduces water discharge and enables near-zero wastewater during operation.

At the same time, many operators are adopting hybrid cooling strategies, combining methods such as liquid-to-chip cooling with air cooling systems to support dense AI racks. This method, which delivers coolant directly to the heat-generating components via cold plates mounted on the hardware, ensures high thermal performance while also optimising Power Usage Effectiveness (PUE). In parallel, hybrid cooling architectures also offer a practical upgrade path for existing data centres that must continue operating without interruption. Rather than rebuilding entire facilities around liquid cooling, operators can retrofit liquid-to-chip systems into selected AI clusters while maintaining traditional air-cooled environments elsewhere.
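PUE itself is a simple ratio, total facility energy divided by IT equipment energy, so the effect of a leaner cooling plant can be estimated directly. A minimal sketch, using hypothetical load figures for a 1 MW IT deployment rather than data from any real site:

```python
def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    total_facility_kw = it_power_kw + cooling_kw + other_overhead_kw
    return total_facility_kw / it_power_kw

# Hypothetical 1 MW IT load: a conventional air-cooled plant vs a
# hybrid plant where liquid-to-chip cooling cuts the cooling overhead.
air_cooled_pue = pue(it_power_kw=1000, cooling_kw=450, other_overhead_kw=100)
hybrid_pue = pue(it_power_kw=1000, cooling_kw=150, other_overhead_kw=100)

print(f"Air-cooled PUE: {air_cooled_pue:.2f}")  # 1.55
print(f"Hybrid PUE:     {hybrid_pue:.2f}")      # 1.25
```

A perfect facility would score 1.0, meaning every watt delivered goes to IT equipment; shaving cooling overhead is the most direct way to move toward that floor.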

Furthermore, some facilities are achieving near-zero water waste by using reclaimed water sources or integrating with district cooling systems to reduce environmental impact. For example, Amazon Web Services is expanding the use of recycled wastewater across 120 U.S. data centres as part of its goal to become “water positive” by 2030. These innovations highlight that the data centre industry isn’t ignoring the problem: it’s taking concrete steps to solve it.
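The standard way to quantify this progress is Water Usage Effectiveness (WUE): litres of water consumed on site per kilowatt-hour of IT energy. The sketch below contrasts an evaporative design with a closed-loop one using purely illustrative annual figures, not measurements from any particular operator:

```python
def wue(litres_consumed: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness = site water consumed (L) / IT energy (kWh)."""
    return litres_consumed / it_energy_kwh

# Illustrative annual figures for a 1 MW IT load running year-round.
it_energy_kwh = 1000 * 8760  # 1 MW * 8,760 hours = 8,760,000 kWh/year

# Evaporative towers consume water continuously; a sealed closed loop
# needs only occasional top-ups (hypothetical volumes for illustration).
evaporative_wue = wue(litres_consumed=15_000_000, it_energy_kwh=it_energy_kwh)
closed_loop_wue = wue(litres_consumed=200_000, it_energy_kwh=it_energy_kwh)

print(f"Evaporative: {evaporative_wue:.2f} L/kWh")
print(f"Closed-loop: {closed_loop_wue:.3f} L/kWh")
```

Lower is better, and the gap of roughly two orders of magnitude between the two designs is the substance behind the "near-zero water waste" claim.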

Can AI growth and sustainability coexist? 

Absolutely. New design models for data centres prove that AI growth and sustainability need not be mutually exclusive. Next-generation cooling technologies, including liquid and hybrid setups that utilise closed-loop cooling systems, are already significantly reducing wastewater in data centres, and enabling facilities to support ever-higher compute densities with far greater efficiency.

Therefore, it’s time to rewrite the media narrative. With smart, considered design and responsible resource management, data centres can meet the surging demand of AI whilst safeguarding one of the planet’s most precious resources: water.  
