AI Sustainability & Green AI

What is AI Water Usage?

AI Water Usage refers to water consumed for cooling data center servers running AI workloads, creating environmental stress in water-scarce regions. Water footprint is an often-overlooked environmental impact of large-scale AI.


Why It Matters for Business

Training a single large language model is estimated to consume on the order of 700,000 liters of freshwater for cooling, creating material sustainability risks in water-stressed regions across Southeast Asia. Data center operators face tightening municipal water allocation restrictions that can constrain expansion plans, and companies that proactively reduce AI water intensity gain an edge in ESG-conscious enterprise procurement.

Key Considerations
  • Data centers use evaporative cooling (consumes water).
  • GPT-3 training: an estimated 700,000 liters of water (the figure is disputed).
  • Water scarcity concerns in arid data center locations.
  • Alternative cooling: air cooling, liquid cooling, heat reuse.
  • Water footprint varies by cooling technology and climate.
  • Disclosure and measurement improving but incomplete.
  • Monitor cooling water consumption per GPU-hour across data center locations since evaporative cooling in tropical climates consumes 3-5x more water.
  • Evaluate closed-loop liquid cooling systems that recirculate water rather than consuming it through evaporation in conventional cooling tower designs.
  • Include water usage metrics in AI vendor procurement scorecards alongside traditional performance, price, and reliability evaluation criteria.
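The per-GPU-hour monitoring suggested above can be sketched as a simple calculation: water consumed per GPU-hour is roughly the accelerator's energy draw multiplied by the site's Water Usage Effectiveness (WUE, liters per kWh of IT load). The site names, WUE values, and GPU power figure below are illustrative assumptions, not vendor-published numbers.

```python
# Sketch: comparing cooling-water intensity per GPU-hour across sites.
# WUE values and GPU power draw are illustrative assumptions only.

SITES = {
    # site: assumed WUE in liters of water per kWh of IT energy
    "temperate_air_cooled": 0.2,
    "tropical_evaporative": 1.8,
}

GPU_POWER_KW = 0.7  # assumed average draw of one accelerator, in kW


def water_per_gpu_hour(site: str) -> float:
    """Liters of cooling water consumed by one GPU running for one hour."""
    return GPU_POWER_KW * SITES[site]


for site in SITES:
    print(f"{site}: {water_per_gpu_hour(site):.2f} L/GPU-hour")
```

Under these assumed figures, the tropical evaporative site consumes several times more water per GPU-hour than the temperate air-cooled one, which is the kind of gap the scorecard comparison is meant to surface.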

Common Questions

How much energy does AI actually use?

Training large language models can emit 300+ tons of CO2 (equivalent to 125 flights NYC-Beijing). Inference for deployed models consumes ongoing energy. Google reported AI accounted for 10-15% of their data center energy in 2023. Energy use scales with model size and usage.
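A back-of-envelope emissions estimate follows directly from the scaling described above: total training energy times the grid's carbon intensity. The energy and intensity figures below are illustrative assumptions, not published numbers for any specific model.

```python
# Sketch: rough CO2 estimate for a training run.
# Both input figures are illustrative assumptions.

ENERGY_MWH = 1300           # assumed total training energy, in MWh
GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity, kg CO2 per kWh


def training_co2_tons(energy_mwh: float, intensity_kg_per_kwh: float) -> float:
    """Tons of CO2: energy (converted to kWh) x grid intensity, then kg -> tons."""
    return energy_mwh * 1000 * intensity_kg_per_kwh / 1000


print(f"~{training_co2_tons(ENERGY_MWH, GRID_KG_CO2_PER_KWH):.0f} t CO2")
```

The same arithmetic shows why siting matters: halving grid carbon intensity (e.g. by choosing a renewables-heavy region) halves the estimate with no change to the model.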

How can we reduce AI carbon footprint?

Strategies include: compute-optimal training (smaller models trained longer), model compression, using renewable-powered data centers, efficient hardware (specialized AI chips), batching requests, caching results, and choosing models appropriately sized for tasks.
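One of the cheapest strategies listed, caching results, can be sketched in a few lines: repeated identical requests are served from memory instead of re-running inference. `run_model` here is a hypothetical stand-in for a real model call; the counter exists only to make the savings visible.

```python
# Sketch: caching repeated inference requests to cut redundant compute.
# `run_model` is a hypothetical stand-in for a real model invocation.
from functools import lru_cache

CALLS = {"count": 0}


@lru_cache(maxsize=1024)
def run_model(prompt: str) -> str:
    CALLS["count"] += 1  # each cache miss costs a real inference
    return f"answer for: {prompt}"


for p in ["summarize Q3", "summarize Q3", "translate memo"]:
    run_model(p)

print(CALLS["count"])  # 2 model invocations served 3 requests
```

In production the same idea usually takes the form of an external cache (keyed on a normalized prompt) rather than an in-process `lru_cache`, but the energy argument is identical: every cache hit is inference compute that never runs.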

Does reducing AI's environmental footprint mean sacrificing performance?

Not necessarily. Compute-optimal training (Chinchilla scaling) achieves same performance with less compute. Efficient architectures (MoE, pruning) maintain quality while reducing resources. The goal is performance-per-watt optimization, not performance reduction.


Need help implementing AI Water Usage?

Pertama Partners helps businesses across Southeast Asia adopt AI strategically. Let's discuss how AI water usage fits into your AI roadmap.