Dear Reader,
At Davos this year, leaders spoke of the crucial part that technology has to play in climate adaptation, spotlighting innovations in AI-enabled early warning systems and agriculture applications.
But as we’ve highlighted in previous issues, while AI has many potential applications for climate action, these gains stand in tension with AI’s own water- and energy-intensive nature. Globally, data centres consume approximately 1–2% of electricity and are responsible for about 1% of GHG emissions. These numbers are expected to rise, with predictions that data centres will consume as much electricity as Japan by 2026. With the introduction of ‘leaner’ models like DeepSeek, this discourse is ever-evolving, prompting governments and investors alike to rethink the linkages between AI and energy demand, and highlighting the urgent need for reliable evidence on AI’s resource consumption.
Digital transformation across Asia, while promising economic development and job creation, is linked to rising emissions across the region. Reports warn that rapid AI development may be driving up reliance on fossil fuels to power data centres, slowing clean energy transitions. In the US, plans are in motion to reopen Three Mile Island, the site of the worst nuclear accident in US history, to power Microsoft’s data centres.
At the regional level, emerging data centre hubs in Malaysia, Indonesia, and India face potential water and energy shortages, intensified reliance on fossil fuels, and biodiversity loss. Singapore, which powers 60% of Asia’s data centre needs, imposed a moratorium on data centre expansion from 2019 to 2022 due to land use and grid constraints. During this pause, investment shifted to neighbouring ASEAN countries like Malaysia and Indonesia, which offer significantly lower operating, land, and labour costs, as well as government-backed incentives for foreign direct investment in data centres.
Even as Singapore’s and Malaysia’s governments lay blueprints for more energy-efficient and sustainable data centre operations, and try to prioritise domestic needs, residents in Malaysia’s Johor Bahru are increasingly worried about demands on the energy grid, high real estate costs, and water scarcity.
Curated Reads
In this section, we consider the many innovative approaches to Green AI, from data centre efficiency to hardware improvements and the nuances of making AI resource-efficient. As this is an emerging and fast-developing area of research, many of the readings we came across are pre-prints. And because model developers disclose little about AI’s carbon footprint and energy consumption, researchers often resort to simplistic extrapolations to arrive at resource-use estimates. So keep a close, critical eye on what you read.
Find definitions of some key terms in our climate glossary.
Green AI approaches
Green AI, a term coined by Schwartz and team in 2020, encapsulates approaches that monitor and mitigate carbon emissions and increase resource efficiency in the AI lifecycle.
In a collaborative study, researchers from Japan and Germany emphasise that training and retraining phases are major energy consumption hotspots, making it critical to analyse the computational costs of software used during these stages. Additionally, substantial environmental impacts occur during hardware manufacturing and operational phases.
To advance Green AI, researchers advocate for broader accessibility to knowledge and resources related to sustainable AI models. They recommend that AI programming frameworks not only support parallelism but also include tools such as pruning, low-rank decomposition, and knowledge distillation — methods collectively referred to as "AI4GreenAI." While optimistic about the potential, they acknowledge that realising this vision will demand significant further research and development.
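To make one of these methods concrete, here is a minimal, illustrative sketch of magnitude-based pruning, the simplest of the compression techniques named above. Real frameworks operate on tensors (e.g. PyTorch ships `torch.nn.utils.prune`); this toy version uses plain Python lists, and the weights are made up for illustration.

```python
# Magnitude pruning: zero out the smallest-magnitude weights so the
# network needs fewer effective multiplications at inference time.

def prune_by_magnitude(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest magnitude."""
    n_prune = int(len(weights) * sparsity)
    # Sort indices by absolute value; the smallest entries get pruned.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

weights = [0.8, -0.05, 0.3, 0.01, -0.6, 0.02]
pruned = prune_by_magnitude(weights, sparsity=0.5)
print(pruned)  # → [0.8, 0.0, 0.3, 0.0, -0.6, 0.0]
```

The energy argument is that a sufficiently sparse model can skip the zeroed weights entirely, trading a small accuracy loss for less compute.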
Greening AI infrastructure
Data centres play a vital role in powering AI — they are the physical infrastructure that houses data storage, networked computing resources, and cooling systems.
In Indonesia, researchers propose sustainability standards through the concept of Greenship Data Centre, a certification rating tool specifically for digital infrastructure in Indonesia. The paper highlights the importance of digital twin technologies to simulate the performance and efficiency of data centres, helping operators identify inefficiencies and optimise energy usage, ultimately reducing a data centre’s carbon footprint. While it notes that adopting digital twins poses data security risks and high upfront costs, it argues that the long-term benefits of sustainable data centres outweigh these challenges, echoing a wider sentiment within the region of not missing out on technological advancements.
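The efficiency that such certifications and digital twins chase is usually summarised by Power Usage Effectiveness (PUE), the industry-standard metric (not named in the paper summary above, but the common yardstick): total facility energy divided by the energy used by the IT equipment alone. A sketch with made-up readings:

```python
# PUE = total facility energy / IT equipment energy.
# 1.0 is the theoretical ideal; everything above it is overhead,
# mostly cooling. The figures below are illustrative only.

def pue(total_facility_kwh, it_equipment_kwh):
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly readings for one facility.
print(pue(total_facility_kwh=1_500_000, it_equipment_kwh=1_000_000))  # → 1.5
```

A PUE of 1.5 means that for every kWh of compute, another half kWh goes to cooling and other overheads — which is why cooling efficiency dominates so much of the data centre sustainability conversation.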
Hardware innovations in computing chips, such as 3D chip stacking, Neural Processing Units (NPUs), and Google’s cost-efficient Tensor Processing Units (TPUs), also promise energy and cost efficiency, faster performance, and customisation for AI tasks and edge computing. Challenges include high acquisition costs and the expense of cooling these chips to their ideal operating temperatures.
There is a lot more nuance to this conversation
Recent literature highlights that a narrow energy-efficiency lens to AI lifecycles is inadequate to address issues around human-induced resource extraction. A systematic review of Green AI acknowledges the promising directions for energy-efficient solutions for AI computing such as structure simplification for deep neural networks, quantising decision tree outputs, and data-centric AI. It also found that Green AI research predominantly relies on image datasets due to their accessibility, but diversifying data types is crucial for a comprehensive understanding of the space.
But more importantly, it points out that the singular focus on energy efficiency as a core metric in this research omits the monitoring of CO2 emissions and other environmental indicators that would examine the harms of computing more holistically.
The risks of greenwashing
In the race for digital transformation across Asia, the badge of ‘sustainability’ risks creating longer-term environmental inequality. For example, while current efforts towards carbon-aware computing are promising, this article argues that time-based load distribution may only displace energy use, not reduce it, and approaches that allocate tasks using geographical load balancing to clean energy locations may put undue strain on precarious grids, creating energy equity issues. Or it may increase the water consumption of data centres in water-stressed regions.
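To see why time-based load shifting displaces rather than reduces energy use, here is a minimal sketch of carbon-aware scheduling, assuming a hypothetical hourly grid-carbon-intensity forecast (in gCO2/kWh); both the forecast values and the function name are illustrative, not from any specific scheduler.

```python
# A deferrable job (e.g. a training run) is scheduled into the hour
# with the lowest forecast grid carbon intensity. Note the article's
# caveat: this changes *when* energy is drawn, not *how much*.

def greenest_hour(forecast):
    """Return the hour with the lowest forecast carbon intensity (gCO2/kWh)."""
    return min(forecast, key=forecast.get)

# Hypothetical forecast: cleaner at midday when solar output peaks.
forecast = {"09:00": 420, "12:00": 180, "15:00": 230, "21:00": 510}
print(greenest_hour(forecast))  # → 12:00
```

The equity concern follows directly: if every operator routes load to the same clean-energy hours and regions, the total demand is unchanged and simply concentrates on grids (and water supplies) that may be least able to absorb it.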
Bigger models = better performance?
The rapid adoption of LLMs for everyday tasks has also generated a lot of interest in understanding their environmental impacts. On one hand, larger training datasets have been equated with better performance; on the other, bigger models require more training time and more GPUs, leading to higher energy consumption and emissions. In a survey of 95 ML models across time and different tasks in NLP and computer vision, the authors found that the carbon intensity of the energy grid used, along with the training time of the models, were the most influential factors in determining emissions. But do the higher emissions of large models buy better performance? The authors found no clear correlation between higher CO2 emissions and better model performance.
The authors argue that the current bigger-is-better paradigm in AI is shaping research agendas towards economically and environmentally unsustainable ends, while concentrating power in the few hands that can afford massive computing power. They advocate for a “small is beautiful” approach to models, which balances resource efficiency with model performance, and emphasise that ML research should be transparent about computational costs, energy use, and memory requirements, using tools like CodeCarbon to track these metrics.
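The survey’s two dominant factors — grid carbon intensity and training time — can be seen in back-of-envelope form: emissions are roughly the energy drawn multiplied by the grid’s carbon intensity. Tools like CodeCarbon measure the energy term automatically from hardware counters; the sketch below uses entirely made-up numbers for illustration.

```python
# Rough training-emissions estimate, reflecting the survey's two most
# influential factors: how long (and on how many GPUs) a model trains,
# and how dirty the grid powering it is. All figures are illustrative.

def training_emissions_kg(gpu_count, hours, watts_per_gpu, grid_gco2_per_kwh):
    energy_kwh = gpu_count * hours * watts_per_gpu / 1000  # W·h → kWh
    return energy_kwh * grid_gco2_per_kwh / 1000           # g → kg CO2

# The same hypothetical job on a coal-heavy vs. a hydro-heavy grid.
job = dict(gpu_count=64, hours=100, watts_per_gpu=300)
print(training_emissions_kg(**job, grid_gco2_per_kwh=700))  # coal-heavy → 1344.0
print(training_emissions_kg(**job, grid_gco2_per_kwh=30))   # hydro-heavy → 57.6
```

The ~20x gap between the two grids, with identical compute, is exactly why the survey singles out grid carbon intensity as the most influential factor.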
The rebound effect of efficiency gains
We are seeing innovations that increase energy efficiency emerge across hardware and software. However, the primary drivers for investment in energy efficiency appear to be lowering operating costs and increasing energy autonomy rather than a commitment to mitigating environmental costs. So we have to ask… will the increased efficiency of hardware and models actually mitigate the environmental harms of AI compute, or just ramp up production? The rebound effect, in which increased efficiency leads to increased consumption (also known as the Jevons paradox), is a well-known issue in the energy sector, particularly in developing contexts. This author argues that the danger of this in AI development is very real. Cutting-edge, efficient hardware is expensive, and as model efficiency increases, we could see a resurgence of the older, power-hungry hardware that drove the energy crises of early crypto mining.
Okay - where do we go from here?
Improving AI’s energy efficiency is an important goal, especially as AI is set to be adopted more widely for climate action. However, there is a need to re-evaluate priorities and mechanisms for measuring and making AI ‘greener.’ In an article about the Climate and Sustainability Implications of Generative AI, the authors argue that we need to move beyond the narrow focus on energy and carbon efficiency in measuring AI’s impacts and adopt a lifecycle approach that examines comprehensive environmental and social impacts across stakeholder groups. They suggest a benefit-cost analysis framework for more sustainable AI at the compute, application, and system levels as a starting point for collaborations between the Gen AI industry and broader stakeholder groups, such as CSOs, economists, social scientists, and supply chain groups.
Planter Box
How do we tackle the ecological polycrisis we are currently facing? Michael Kwet suggests that time is not on our side — and that Digital Degrowth is the solution. He argues that under capitalism, our focus on limitless growth will never create the conditions for a just and equitable existence for all within planetary boundaries, and that the drive for more is leading us to environmental and social collapse.
Taking a closer look at digital technology as a societal and economic construct, he urges us to see tech not only as shaping our day-to-day lives but also as a driver of global labour inequities. US tech supremacy controls social media, search engines, semiconductors, cloud computing systems, operating systems, business networking, office productivity software, and more. Meanwhile, the real (and exploitative) labour driving tech development is borne by the Global South: from the physical risks of mineral extraction to the underpaid work of data annotation and the mental health costs of social media content moderation.
In his book “Digital Degrowth: Technology in the Age of Survival”, he lays bare this unequal exchange between the Global North and the Global South: who owns the digital economy, and how that ownership connects to environmental and human costs.
Around the Web
Carbon Tracker: Track and predict energy consumption and carbon footprint of training deep learning models.
CO2 Inference: Explore insights on energy efficiency and carbon emissions of AI models.
Green Software Patterns: An online open-source database of software patterns. Also, refer to an environmental Impact Framework for software.
Green Software for Practitioners: An introductory course by the Linux Foundation for software practitioners to build, maintain, and run greener applications.
Inside the miracle of modern chip manufacturing: Read about the design and innovation powering your tech devices.
Part 2 of the environmental costs of the AI lifecycle will be out next month. We examine Material AI with a focus on extraction of critical and rare earth minerals, AI’s e-waste problem, and the circular economy.
Tuning in…
In the latest episode of Code Green, listen to John Cotton from the Southeast Asia Energy Transition Partnership and Priya Donti from Climate Change AI as they delve into the complexities of energy transitions and machine learning in Asia. A sneak peek here!
Credits
Illustrations: Nayantara Surendranath | Art Direction: Tammanna Aurora | Layout Design: Shivranjana Rathore