What is AI Sustainability?

AI Sustainability is the practice of considering and minimising the environmental impact of artificial intelligence systems throughout their lifecycle, including the energy consumed during model training and inference, the carbon footprint of supporting infrastructure, and the broader ecological consequences of AI deployment at scale.

In practice, AI Sustainability means developing, deploying, and operating AI systems in ways that minimise their environmental impact and support long-term ecological balance. It encompasses the energy consumption of model training and inference, the carbon footprint of the data centres that power AI workloads, the electronic waste generated by specialised AI hardware, and the broader environmental consequences of scaling AI across industries and geographies.

The environmental cost of AI has become a significant concern as models grow larger and more computationally intensive. Training a single large language model can consume as much energy as dozens of households use in a year and produce carbon emissions equivalent to multiple transatlantic flights. As organisations across Southeast Asia and globally accelerate AI adoption, the cumulative environmental impact is substantial and growing.

AI Sustainability asks organisations to account for these environmental costs alongside the business benefits of AI and to actively pursue strategies that reduce AI's ecological footprint.

Why AI Sustainability Matters

Growing Environmental Footprint

The computational requirements of AI have increased dramatically. Research from the University of Massachusetts Amherst found that training a large transformer model, when combined with neural architecture search, can emit more than 280,000 kilograms of carbon dioxide equivalent. As organisations train more models, use larger datasets, and run inference at greater scale, the aggregate environmental impact grows proportionally.

Data centres already account for approximately 1 to 1.5 percent of global electricity consumption, and AI workloads are among the most energy-intensive they host, with demand growing rapidly. The International Energy Agency projects that global data centre electricity consumption could double by 2026, driven in significant part by AI.

Regulatory and Stakeholder Pressure

Environmental sustainability is a growing regulatory and stakeholder concern globally. The EU's Corporate Sustainability Reporting Directive (CSRD) requires companies to report on environmental impacts, increasingly including the energy consumption of digital and AI operations. Singapore's Green Plan 2030 sets ambitious environmental targets. Investors, customers, and employees are all paying more attention to organisations' environmental practices.

Climate Commitments

Many organisations have made public commitments to reduce carbon emissions and achieve net-zero targets. AI deployments can undermine these commitments if their environmental impact is not measured and managed. Organisations that invest heavily in AI without accounting for its carbon footprint may find that their AI strategy conflicts with their sustainability strategy.

Southeast Asian Context

Southeast Asia is particularly vulnerable to climate change, with rising sea levels, increased flooding, and more extreme weather events affecting the region. Data centre construction is booming across the region, with Singapore, Indonesia, Malaysia, and Thailand all expanding capacity to support AI and cloud workloads. The environmental impact of this expansion is a growing concern for governments, communities, and businesses.

Sources of AI Environmental Impact

Training

Training AI models, particularly large neural networks, requires enormous computational resources. The process involves running millions or billions of calculations across thousands of specialised processors for days, weeks, or even months. The energy consumed during training depends on the model size, the dataset size, the hardware used, and the number of training iterations.
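
As a rough illustration of how these factors combine, the back-of-envelope sketch below multiplies accelerator count, power draw, training time, data-centre overhead, and grid carbon intensity. Every figure in it is an assumption chosen for illustration, not a measurement of any particular model.

```python
# Back-of-envelope estimate of training energy and emissions.
# All numbers below are illustrative assumptions, not measurements.
gpu_count = 64            # accelerators running in parallel (assumed)
gpu_power_kw = 0.4        # average draw per accelerator, ~400 W (assumed)
training_hours = 24 * 14  # a two-week training run (assumed)
pue = 1.5                 # data-centre power usage effectiveness (assumed)
grid_kg_per_kwh = 0.6     # carbon intensity of a fossil-heavy grid (assumed)

energy_kwh = gpu_count * gpu_power_kw * training_hours * pue
emissions_kg = energy_kwh * grid_kg_per_kwh

print(f"Estimated training energy: {energy_kwh:,.0f} kWh")
print(f"Estimated training emissions: {emissions_kg:,.0f} kg CO2e")
```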

Inference

Once trained, AI models consume energy every time they make a prediction or generate an output. While a single inference is much less energy-intensive than training, inference happens at massive scale. A popular AI service may process millions of requests per day, and the cumulative energy consumption of inference often exceeds that of training over the model's lifetime.
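
The sketch below, using the same kind of assumed figures, shows how quickly inference energy accumulates for a service handling a few million requests a day. It illustrates the arithmetic only; real per-request energy varies widely by model and hardware.

```python
# Cumulative inference energy for a hypothetical service (all figures assumed).
wh_per_request = 0.3          # average energy per inference request, in Wh (assumed)
requests_per_day = 2_000_000  # daily traffic (assumed)
days_in_service = 365
pue = 1.5                     # data-centre overhead (assumed)

inference_kwh = wh_per_request * requests_per_day * days_in_service * pue / 1_000
training_kwh = 13_000         # e.g. the training estimate sketched earlier (assumed)

print(f"First-year inference energy: {inference_kwh:,.0f} kWh "
      f"(~{inference_kwh / training_kwh:.0f}x the training run)")
```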

Data Storage and Processing

AI systems require vast amounts of data for training and operation. Storing, processing, and moving this data consumes energy and requires infrastructure. Data preprocessing, feature engineering, and data pipeline operations all contribute to AI's environmental footprint.

Hardware Manufacturing and Disposal

The specialised hardware used for AI, particularly GPUs and TPUs, requires significant energy and resources to manufacture. These components have limited lifespans and contribute to electronic waste when retired. The supply chain for AI hardware, from rare earth mineral mining to chip fabrication, carries its own environmental costs.

Cooling Infrastructure

AI workloads generate substantial heat, requiring significant cooling infrastructure in data centres. In tropical climates like Southeast Asia, cooling requirements are particularly high, increasing both energy consumption and, in some cases, water usage.

Strategies for Sustainable AI

Efficient Model Design

Choose model architectures that balance performance with computational efficiency. Not every problem requires the largest possible model. Techniques such as model distillation, pruning, and quantisation can significantly reduce a model's computational requirements while maintaining acceptable performance.
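
As one concrete example, the sketch below applies PyTorch's post-training dynamic quantisation to a small stand-in model, converting its Linear layers to 8-bit integer weights. The model here is a placeholder; whether quantisation preserves acceptable accuracy has to be validated on your own workload.

```python
# Minimal sketch of post-training dynamic quantisation with PyTorch.
# The Sequential model is a stand-in for a real trained model (assumption).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)
model.eval()

# Convert Linear weights to 8-bit integers; activations are quantised on the fly,
# reducing memory and compute at some cost in numerical precision.
quantised = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
print(quantised(x).shape)  # same interface, lighter footprint
```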

Optimise Training Processes

Reduce training energy consumption through techniques such as transfer learning (starting from pre-trained models rather than training from scratch), early stopping (halting training when performance improvement plateaus), and hyperparameter optimisation (finding efficient training configurations).
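
Early stopping is straightforward to add to any training loop. The framework-agnostic sketch below halts training once validation loss stops improving for a set number of epochs; the loss values are invented purely to show the mechanism.

```python
class EarlyStopping:
    """Stop training once validation loss stops improving, avoiding wasted epochs."""

    def __init__(self, patience: int = 3, min_delta: float = 0.0):
        self.patience = patience    # epochs to tolerate without improvement
        self.min_delta = min_delta  # smallest change that counts as improvement
        self.best = float("inf")
        self.bad_epochs = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# Illustrative loop with made-up validation losses.
stopper = EarlyStopping(patience=2)
for epoch, val_loss in enumerate([0.90, 0.72, 0.61, 0.62, 0.63, 0.64]):
    print(f"epoch {epoch}: val_loss={val_loss}")
    if stopper.should_stop(val_loss):
        print("Stopping early: further epochs would consume energy for no gain.")
        break
```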

Green Infrastructure

Choose cloud providers and data centres that use renewable energy. Several major cloud providers offer regions powered primarily by renewable sources. When building or selecting data centres in Southeast Asia, consider locations and providers with strong renewable energy commitments.
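
Region choice alone can change the emissions of an identical workload severalfold. The small comparison below uses illustrative grid carbon intensities (assumed figures, not published values for any specific provider or region) to show the effect.

```python
# Same workload, different grid carbon intensity (all figures assumed).
workload_kwh = 50_000  # annual energy for an AI workload (assumed)

grid_intensity_kg_per_kwh = {
    "fossil-heavy region": 0.70,
    "renewables-heavy region": 0.10,
}

for region, intensity in grid_intensity_kg_per_kwh.items():
    print(f"{region}: {workload_kwh * intensity:,.0f} kg CO2e per year")
```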

Right-Size AI Deployments

Not every use case requires real-time AI inference from a large model. Consider whether batch processing, simpler models, or rule-based systems can achieve acceptable results with lower environmental cost. Match the AI solution to the problem rather than defaulting to the most powerful option.

Measure and Report

You cannot manage what you do not measure. Track the energy consumption and carbon emissions associated with your AI workloads. Several tools and frameworks are available for measuring AI carbon footprints, including CodeCarbon, ML CO2 Impact, and cloud provider sustainability dashboards.
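
As a starting point, the sketch below wraps a placeholder workload with CodeCarbon's EmissionsTracker, which estimates energy use and carbon emissions for the code it brackets. The train_model function is a stand-in for your actual training or inference job.

```python
# Minimal CodeCarbon usage sketch; train_model is a placeholder workload.
import time
from codecarbon import EmissionsTracker

def train_model():
    time.sleep(5)  # stand-in for a real training loop (assumption)

tracker = EmissionsTracker(project_name="ai-sustainability-demo")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent

print(f"Estimated emissions for this run: {emissions_kg:.6f} kg CO2e")
```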

Carbon Offsetting and Compensation

While reducing AI's environmental impact is the primary goal, carbon offsetting can complement reduction efforts. Invest in verified carbon offset projects, preferably in the regions where your AI systems operate. However, offsets should supplement, not replace, genuine reduction efforts.

AI Sustainability in Southeast Asia

The region faces unique sustainability challenges related to AI. Tropical climates increase data centre cooling requirements, raising energy consumption. Several ASEAN countries rely heavily on fossil fuels for electricity generation, meaning that AI workloads in these markets have a higher carbon intensity than in markets with cleaner energy grids.

Singapore, which hosts a significant concentration of data centres in the region, has implemented a temporary moratorium on new data centre construction (since partially relaxed) due to sustainability concerns. The Singapore government requires new data centres to meet green certification standards and is investing in district cooling systems to improve efficiency.

Malaysia's data centre expansion in Johor has raised environmental concerns related to energy and water consumption. Indonesia and Thailand are also grappling with the environmental implications of data centre growth driven by AI demand.

For organisations operating AI workloads in Southeast Asia, sustainability is becoming both a regulatory requirement and a business consideration. Energy costs in the region can be significant, so efficiency improvements deliver both environmental and financial benefits.

Why It Matters for Business

AI Sustainability is an increasingly material business concern that affects your operating costs, regulatory compliance, brand reputation, and ability to meet corporate sustainability commitments. As AI workloads grow, so do their energy costs and carbon emissions. Organisations that fail to manage AI's environmental impact face rising operational expenses, regulatory scrutiny, and reputational risk with environmentally conscious customers and investors.

For CEOs, AI sustainability aligns with broader corporate responsibility and ESG commitments. Investors and stakeholders are scrutinising the environmental impact of digital operations, and AI is a growing component. Demonstrating sustainable AI practices strengthens your sustainability narrative and reduces the risk of greenwashing accusations. For CTOs, sustainability translates to efficiency. Optimising AI systems for energy consumption typically also reduces infrastructure costs and improves performance.

In Southeast Asia, where data centre expansion is accelerating and climate vulnerability is high, sustainable AI is particularly relevant. Governments in Singapore, Malaysia, and Indonesia are introducing sustainability requirements for data centres and digital infrastructure. Organisations that build sustainability into their AI operations now will comply more easily with these requirements and benefit from lower energy costs in a region where electricity prices are a significant operational consideration.

Key Considerations
  • Measure the energy consumption and carbon footprint of your AI workloads as a baseline for improvement.
  • Choose model architectures and training approaches that balance performance with computational efficiency, avoiding unnecessarily large models.
  • Select cloud providers and data centre locations with strong renewable energy commitments, particularly relevant in Southeast Asia where energy sources vary significantly by market.
  • Use techniques such as model distillation, pruning, transfer learning, and quantisation to reduce computational requirements without sacrificing essential performance.
  • Include environmental impact in AI project evaluation criteria alongside business value and technical performance.
  • Monitor regulatory developments related to data centre sustainability in your Southeast Asian markets, particularly in Singapore and Malaysia.
  • Report AI energy consumption and emissions as part of your broader corporate sustainability reporting.
  • Consider the full lifecycle environmental impact of AI hardware, including manufacturing, operation, and disposal.

Frequently Asked Questions

How much energy does training an AI model actually consume?

Energy consumption varies enormously depending on the model size and type. Training a small machine learning model on a laptop might use a trivial amount of energy. Training a large language model with billions of parameters can consume hundreds of megawatt-hours of electricity, equivalent to the annual energy consumption of dozens of households, and emit hundreds of tonnes of CO2. The key factors are model size, dataset size, hardware type, training duration, and the carbon intensity of the electricity source. Tools like CodeCarbon can estimate the energy consumption of specific training runs.

Is AI sustainability just about reducing carbon emissions?

Carbon emissions are the most discussed aspect, but AI sustainability encompasses broader environmental concerns: water consumption for data centre cooling, which is particularly significant in water-stressed regions; electronic waste from specialised AI hardware that becomes obsolete quickly; resource extraction for manufacturing GPUs and other components; and land use for data centre construction. A comprehensive approach addresses all of these dimensions, not just carbon.

What can smaller companies do to make their AI use more sustainable?

Smaller companies can take several practical steps. Use cloud providers with strong sustainability commitments rather than operating private infrastructure. Choose efficient model architectures and avoid training unnecessarily large models. Use pre-trained models and transfer learning to reduce training requirements. Batch inference workloads rather than running models continuously. Select cloud regions powered by renewable energy. These steps reduce both environmental impact and costs, making sustainability accessible regardless of budget size.

Need help implementing AI Sustainability?

Pertama Partners helps businesses across Southeast Asia adopt AI strategically. Let's discuss how AI sustainability fits into your AI roadmap.