As artificial intelligence advances at breakneck pace, the world’s biggest tech companies face growing pressure to address the environmental and climate impacts of AI, which requires large amounts of electricity and water to run.
Tech giants such as Amazon, Google and Microsoft have pledged to tackle the climate crisis, yet green experts say the sector is not doing enough to mitigate the rising consumption of resources.
The issue was evident in Google’s latest Environmental Report, for example, which shows that the company consumed 5.6 billion gallons of water in 2022 - up 1.3 billion gallons from 2021 and 2.2 billion from 2020. The 2022 usage equates to about 10 days’ worth of water for the entire city of London.
Google’s ‘Scope 2’ greenhouse gas emissions - those from purchased electricity and heat - rose 37 per cent from 2021 to 2022.
Here’s how the growth of AI and tech firms’ resource use are raising concerns about water shortages and global warming:
How much water are tech giants using as AI evolves?
The development of AI by the tech sector has driven a huge rise in water use. Training involves feeding vast amounts of data into large language models (LLMs), a computationally intensive process that needs powerful hardware - typically housed in data centres that rely on water for cooling.
Microsoft consumed nearly 1.7 billion gallons of water from its operations in 2022, an increase of about 34 per cent from 2021, the company wrote in its latest sustainability report.
Research from the University of California, Riverside, published in April, found that just training GPT-3, the language model used to power OpenAI’s ChatGPT, in Microsoft’s US data centres consumed 700,000 litres (154,000 gallons) of clean freshwater.
“ChatGPT needs to ‘drink’ a 500ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where ChatGPT is deployed”, the researchers wrote.
“All these numbers are likely to increase by multiple times for the newly-launched GPT-4 that has a significantly larger model size,” they said.
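The per-query figure implied by the researchers’ estimate can be checked with simple arithmetic - the 500ml bottle and the 20-50 question range are the numbers quoted above; everything else is derived from them:

```python
# Implied water use per ChatGPT query, using the figures quoted above:
# a 500ml bottle covers roughly 20-50 questions and answers.
BOTTLE_ML = 500
QUERIES_LOW, QUERIES_HIGH = 20, 50

ml_per_query_high = BOTTLE_ML / QUERIES_LOW   # fewest queries -> most water each
ml_per_query_low = BOTTLE_ML / QUERIES_HIGH

print(f"{ml_per_query_low:.0f}-{ml_per_query_high:.0f} ml per query")  # 10-25 ml per query
```

In other words, every individual question costs on the order of a tablespoon or two of water - small on its own, but significant at the scale of millions of daily users.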
The tech industry’s intensive water use comes as global demand soars and supplies dwindle.
The United Nations has predicted that the need for water will exceed supply by 40 per cent by 2030, and estimated that the number of people in cities facing water scarcity will rise from 930 million in 2016 to between 1.7 and 2.4 billion people in 2050.
Google plans to build a data centre in Uruguay’s capital Montevideo, fuelling fears about water consumption given that the country is facing its worst drought in 74 years.
A spokesperson for the search giant told Context that the “data centre project is still in the exploratory phase, and our technical team is actively working with the support of national and local authorities to get this right”.
By 2030, the company aims to replenish 120 per cent of the freshwater it consumes by investing in projects that restore or conserve resources in watersheds - the areas that feed rivers and lakes. As of 2022, it was replenishing only 6 per cent, according to its report.
How is AI requiring ever-more energy?
Developing AI has come at greater computing expense, putting a spotlight on energy consumption and the ensuing emissions.
OpenAI, which partnered with Microsoft on ChatGPT and other models, estimates that the amount of compute used in the largest AI training runs has soared more than 300,000-fold since 2012.
Amazon, Microsoft, Google, and Meta more than doubled their combined energy use between 2017 and 2021, rising to around 72 terawatt-hours (TWh) in 2021, according to the International Energy Agency (IEA). That is equivalent to approximately one quarter of the electricity consumed in the United Kingdom in 2022.
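The growth rate implied by the IEA figures above can be worked out directly. The 2017 baseline here (half of 72 TWh) follows from the “more than doubled” claim, so the result is a lower bound rather than a measured figure:

```python
# Rough growth implied by the IEA figures quoted above: combined use
# more than doubled between 2017 and 2021, reaching ~72 TWh.
twh_2021 = 72.0
twh_2017 = twh_2021 / 2        # "more than doubled" -> at most ~36 TWh in 2017
years = 2021 - 2017

# Compound annual growth rate needed to double over four years (lower bound)
cagr = (twh_2021 / twh_2017) ** (1 / years) - 1
print(f"~{cagr:.0%} per year")  # ~19% per year

# Implied UK total, from the "one quarter" comparison above
uk_total_twh = twh_2021 / 0.25
print(f"implied UK total = {uk_total_twh:.0f} TWh")  # 288 TWh
```

Doubling over four years means the four companies’ consumption grew by at least roughly a fifth every year - far faster than overall electricity demand.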
The information and communication technology (ICT) sector emits between 2 per cent and 4 per cent of all the carbon emissions produced each year, according to 2020 research from Lancaster University.
“That’s not insignificant, but it’s also not apocalyptic … it’s a ‘medium-sized problem’,” Anne Pasek, a technology and climate researcher at Trent University in Canada, wrote in a recently published online magazine about data centres.
“But whatever way you look at it, if the sector wants to keep pace with the wider climate commitments of the Paris Agreement, it will need to reverse course and reduce emissions,” she added. “That requires changing norms and habits — probably both for consumers and industry players.”
What are the possible solutions?
Companies could scale back their use of AI once they have finished experimenting with new tools such as ChatGPT, said Ayse Coskun, an engineering professor at Boston University.
They could then determine which tasks require complex models, and where simpler ones would suffice, she said.
“People have started to think about that: ‘Do I need to really throw a large hammer at this little nail, when maybe I can just use a screwdriver?’,” Coskun told Context.
Yet more radical approaches might be necessary to ensure companies develop in line with climate goals, experts say.
“We need to go from viewing energy efficiency (and) lower carbon footprint impact as an ‘added value’ to making them a first order constraint for any computer system, especially for large-scale data centres”, Coskun added.
“We need accountability, more transparent reporting of carbon impacts, and more innovation to optimise energy as a whole system.”
This story was published with permission from Thomson Reuters Foundation, the charitable arm of Thomson Reuters, which covers humanitarian news, climate change, resilience, women’s rights, trafficking and property rights. Visit https://www.context.news/.