How Much Water Does ChatGPT Really Use? Can AI Become Water Neutral by 2030?

Published on 07-03-2026 by Muhammad Bilal Aftab


OpenAI CEO Sam Altman recently revealed that a single message to ChatGPT uses about 1/15 of a teaspoon of water. That means even a small question like “Can you help me with this maths problem?” or “Can I use lime instead of lemon?” consumes a tiny drop of water.

It may sound negligible, but consider this: around 1 billion messages are sent to ChatGPT every day. And ChatGPT is just one AI tool. Others like Google’s Gemini, Claude from Anthropic, and DeepSeek also add to this usage. Collectively, AI consumes a massive amount of water daily.

Some experts, however, are sceptical of the teaspoon estimate, and public data is limited. Research suggests that 10 to 50 questions sent to a medium-sized AI model could use around 500 millilitres of water, about half a litre. This includes the water used to cool servers and to generate the electricity they consume.
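Neither figure is official, but the arithmetic behind the comparison is simple. A rough sketch, treating a US teaspoon as about 4.93 mL and taking the 1-billion-messages figure at face value:

```python
TEASPOON_ML = 4.93                 # volume of one US teaspoon in millilitres
MESSAGES_PER_DAY = 1_000_000_000   # ~1 billion ChatGPT messages daily

# Altman's figure: 1/15 of a teaspoon per message
altman_ml_per_msg = TEASPOON_ML / 15
altman_daily_litres = altman_ml_per_msg * MESSAGES_PER_DAY / 1000

# Research estimate: ~500 mL per 10-50 questions
research_ml_low = 500 / 50    # 10 mL per question (optimistic end)
research_ml_high = 500 / 10   # 50 mL per question (pessimistic end)
research_daily_low = research_ml_low * MESSAGES_PER_DAY / 1000
research_daily_high = research_ml_high * MESSAGES_PER_DAY / 1000

print(f"Altman estimate:   ~{altman_daily_litres:,.0f} litres/day")
print(f"Research estimate: {research_daily_low:,.0f}-{research_daily_high:,.0f} litres/day")
```

On these assumptions, Altman's figure works out to roughly 330,000 litres a day, while the research-based range implies 10 to 50 million litres a day, a gap of 30 to 150 times, which is why the teaspoon claim draws scrutiny.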

OpenAI has not shared more detailed data, but one thing is clear: AI systems rely heavily on water.


Why Does AI Need Water?

Every AI prompt triggers complex calculations on powerful computer chips in massive data centres. Even before you use AI, these models undergo training, which requires enormous computing power.

All this computation produces heat. Without proper cooling, machines can overheat and fail. While older data centres often used air cooling, modern AI requires liquid cooling systems due to higher energy demands.


How Liquid Cooling Works

Liquid cooling uses a special coolant that flows over computer chips, absorbing heat and carrying it to a heat exchange unit. Here, water lowers the temperature of the coolant before it returns to the servers.

The heated water is sent to cooling towers, where fans and evaporation release the heat into the air. In some systems, up to 80% of the water is lost to evaporation and cannot be immediately reused. That water is drawn from local sources shared with communities for drinking, farming, and daily life.
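To make the replacement-water maths concrete, here is a minimal sketch; the 80% figure comes from the "up to 80%" cited above, and the 100,000-litre example is purely illustrative, not a measured data-centre figure:

```python
def makeup_water_needed(litres_circulated: float, evap_fraction: float = 0.8) -> float:
    """Litres of fresh water a cooling tower must draw from local sources
    to replace what evaporates. evap_fraction=0.8 reflects the 'up to 80%'
    loss the article cites for some systems."""
    return litres_circulated * evap_fraction

# If a tower cycles 100,000 litres in a day at 80% evaporative loss,
# 80,000 litres leave as vapour and must be replaced.
print(makeup_water_needed(100_000))  # → 80000.0
```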


Communities Are Concerned

Many communities worldwide are worried about data centres putting pressure on local water supplies and electricity grids. Protests have occurred in Spain, India, Chile, Uruguay, and the United States.

Water is not only used directly for cooling. Electricity generation for AI also consumes water. Coal, gas, and nuclear plants rely on water to produce steam that turns turbines. The International Energy Agency predicts that electricity demand from AI-focused data centres could grow 400% by 2030, reaching 300 terawatt hours, roughly equal to the UK’s yearly electricity use.
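Reading "grow 400%" as a five-fold increase, the projection also implies a rough current baseline. The sketch below is an inference from the cited numbers, not a figure the IEA itself reports:

```python
PROJECTED_TWH_2030 = 300   # IEA projection cited above, in terawatt-hours
GROWTH_PERCENT = 400       # "grow 400%" read as a five-fold increase

# Growing by 400% means multiplying by (1 + 400/100) = 5,
# so the implied present-day demand is the projection divided by 5.
implied_baseline_twh = PROJECTED_TWH_2030 / (1 + GROWTH_PERCENT / 100)
print(f"Implied current AI data-centre demand: ~{implied_baseline_twh:.0f} TWh/year")
```

In other words, the projection implies AI-focused data centres draw on the order of 60 TWh a year today, with that figure set to quintuple within the decade.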

Water is also essential in manufacturing semiconductor chips, cleaning and processing the materials required to build AI hardware. In short, water is used at every stage, from mining raw materials to running and cooling data centres.


Can AI Become More Sustainable?

Big tech companies like Google, Microsoft, and Meta report using billions of litres of water in their data centres annually. Many have promised to become water neutral by 2030, returning as much water to the environment as they use.

Innovations include:

  • Cooling systems that avoid evaporating water
  • Reusing waste heat to warm homes
  • Siting data centres in the Arctic, underwater, or even in space; NTT, for instance, is exploring whether satellites could handle some computing tasks

A Young Technology With Big Questions

Generative AI has grown rapidly, but the industry is still learning. Water and energy are shared global resources, and AI’s expansion will require collaboration between companies, governments, and communities to reduce environmental impact.

AI brings enormous benefits, but its environmental cost is real. The challenge is ensuring innovation does not come at the expense of essential planetary resources.


Analysis by Muhammad Bilal Aftab, Business Growth Strategist and Digital Consultant