Tackling AI’s Climate Change Problem

The AI industry could soon be one of the largest contributors to carbon emissions, if current trends continue.

In an era defined by both the promise of technological innovation and the threat of climate change, artificial intelligence has emerged as both a valuable tool and a difficult challenge. As we use AI to tackle tough problems, we must also grapple with its hidden environmental costs and consider solutions that will allow us to harness its potential while mitigating its climate impact.

The success of OpenAI’s ChatGPT language model, which is backed by Microsoft, has sparked a technology arms race, with tech giants making enormous investments in building their own natural language processing systems. But the quest for more intelligent machines is quickly running into a web of sustainability challenges. AI has a fast-growing carbon footprint, stemming from its voracious appetite for energy and the carbon costs of manufacturing the hardware it uses. Since 2012, the most extensive AI training runs have been using exponentially more computing power, doubling every 3.4 months, on average.1

AI’s Environmental Costs

The environmental impact of information technology is often overlooked, even though data centers and transmission networks account for 1% to 1.5% of global electricity use. They also account for 0.6% of global carbon emissions, which the International Energy Agency says must be cut in half to achieve a net-zero emissions scenario by 2050.2 A single average data center consumes as much energy each year as it would take to heat 50,000 homes. Electronic waste is the fastest-growing waste stream in the world: A staggering 57 million tons are generated each year, about the same weight as the Great Wall of China.3

Several factors contribute to the carbon footprint of AI systems throughout their life cycles:

Large and complex models: Large language models (LLMs) require tens of thousands of cutting-edge high-performance chips for training and for responding to queries, leading to high energy consumption and carbon emissions. The more complex the model, the longer each task takes and the more energy it consumes.4 LLMs like ChatGPT are among the most complex and computationally expensive AI models. The capabilities of OpenAI’s GPT-3 LLM are made possible by its 175 billion parameters, which made it one of the largest models when it was launched. Its training alone is estimated to have used 1.3 gigawatt-hours of energy (equivalent to 120 average U.S. households’ yearly consumption) and generated 552 tons of carbon emissions (equivalent to the yearly emissions of 120 U.S. cars).5 OpenAI’s latest model, GPT-4, is rumored to be 10 times larger.6
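As a rough sanity check on those equivalences, the arithmetic below uses assumed averages of about 10,600 kilowatt-hours of electricity per U.S. household per year and 4.6 metric tons of CO2 per U.S. passenger car per year; both are illustrative figures, not numbers taken from the study cited above.

```python
# Back-of-envelope check of the equivalences cited above. The household and
# car figures are rough U.S. averages assumed for illustration only.
GPT3_TRAINING_ENERGY_KWH = 1_300_000   # ~1.3 GWh, per the estimate cited above
GPT3_TRAINING_EMISSIONS_T = 552        # metric tons of CO2, per the same estimate

KWH_PER_US_HOUSEHOLD_YEAR = 10_600     # assumed average annual household electricity use
TONS_CO2_PER_US_CAR_YEAR = 4.6         # assumed average annual passenger-car emissions

households = GPT3_TRAINING_ENERGY_KWH / KWH_PER_US_HOUSEHOLD_YEAR
cars = GPT3_TRAINING_EMISSIONS_T / TONS_CO2_PER_US_CAR_YEAR

print(f"~{households:.0f} household-years of electricity")  # ~123
print(f"~{cars:.0f} car-years of emissions")                # ~120
```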

Data storage and processing: The data sets required to train and run AI models are large and complex, leading to high energy consumption and carbon emissions. Data storage and processor operation consume most of the energy in data centers. Furthermore, around 40% of the electricity used in data centers goes to powering the large air-conditioning systems needed to keep servers cool and operating correctly. Falcon 180B, a recently launched open-access LLM, has 180 billion parameters (similar to GPT-3’s count) and was trained on a 3.5-trillion-token data set (compared with GPT-3’s 499 billion tokens).7 Training this model on such a large data set is estimated to have generated 1,870 tons of carbon emissions, equivalent to heating 350 U.S. households for a year, assuming a typical U.S. energy mix.8

Energy sources: The carbon intensity of the energy sources used to power AI systems determines their carbon footprint. Data centers that can draw on renewable energy sources can have lower carbon footprints than those that cannot, even if their energy consumption is similar.

Water consumption: The environmental impacts of AI and other information technologies go beyond carbon emissions. Data centers use large amounts of water in cooling towers and HVAC systems to prevent servers and other vital equipment from overheating. The intense computing and data requirements of AI models only increase data centers’ water consumption. Microsoft revealed in its most recent environmental report that its global water use increased 34% from 2021 to 2022 (to approximately 1.7 billion gallons, or more than 2,500 Olympic-size swimming pools). Google reported a 20% increase in water use during the same period, an increase that outside experts have linked to its AI development.9 This is a concerning trend for companies that have set ambitious environmental, social, and governance (ESG) targets that include being carbon-negative and water-positive by 2030.

Hardware: The production and disposal of AI hardware contribute to carbon emissions and the growing e-waste problem.10 The global volume of electronic waste is predicted to reach 120 million tons annually by 2050, double what it is today. The material value of that e-waste, only 20% of which gets formally recycled, is approximately $62.5 billion.11 Recycling these resources and capturing more of that value could support a more robust, sustainable economy by reducing the need to mine new raw materials for hardware and by keeping discarded devices out of landfills.

This doesn’t mean we should stop the development and use of AI models to protect the climate. Despite these significant environmental costs, AI is also proving to be a vital tool in promoting sustainability and addressing climate change. AI is being used to maximize the utilization of renewable energy sources like wind and solar electricity and to develop intelligent grids that balance energy supply and demand.12 AI-powered solutions are helping farmers increase agricultural yields while applying fewer pesticides and fertilizers, resulting in more environmentally friendly farming methods.13 And AI is being used to optimize logistics and reduce waste in supply chains, to monitor and enforce environmental regulations, and to optimize data center operations through machine learning algorithms that dynamically adjust temperature settings, workload distribution, and server utilization.14

Despite significant environmental costs, AI is proving to be a vital tool in promoting sustainability and addressing climate change.

AI’s contributions to solving the climate crisis can outweigh its negative climate impacts, but only if the AI industry adopts practices that emphasize ESG sustainability, makes sustainability central to its AI ethics guidelines, and actively seeks opportunities to reduce the environmental footprint of AI technologies. Users of AI must also understand the factors that contribute to the environmental impacts of these tools, both to guide their own use of them and to add sustainability to the list of criteria they use to evaluate AI vendors.

Transparency is critical: Reliable measurements of new models’ energy use and carbon emissions must be published to raise awareness and to encourage AI developers to compete on model sustainability. Much of what we know about the carbon emissions and energy use of AI models comes from estimates calculated by third parties rather than figures reported by the models’ developers, and this must change. Tools available today, such as the Machine Learning Emissions Calculator, can help AI engineers estimate the carbon emissions of AI models based on variables such as hardware, training hours, cloud provider, and location. Researchers at Google suggest exploring four technical best practices, which they refer to as the 4Ms (model, machine, mechanization, and map optimization), that each reduce energy use and carbon emissions. They claim that following these best practices can reduce machine learning training energy by up to 100x and CO2 emissions by up to 1,000x.15
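To make the inputs of such an estimate concrete, here is a minimal sketch of the calculation these calculators perform: accelerator power draw times training hours times data center overhead (PUE), converted to emissions using the grid’s carbon intensity. The function name and every value below are illustrative assumptions rather than figures from any specific tool or model.

```python
# A simplified sketch of the kind of estimate such calculators produce:
# energy = accelerator power draw x hours x PUE; emissions = energy x grid
# carbon intensity. All names and default values here are illustrative.
def estimate_training_emissions_kg(
    gpu_count: int,
    avg_gpu_power_watts: float,      # average draw per accelerator (assumed)
    hours: float,                    # wall-clock training time
    pue: float = 1.1,                # data center overhead factor (assumed)
    grid_gco2_per_kwh: float = 400,  # grid carbon intensity (assumed)
) -> float:
    energy_kwh = gpu_count * avg_gpu_power_watts * hours * pue / 1000
    return energy_kwh * grid_gco2_per_kwh / 1000

# Hypothetical example: 1,000 GPUs averaging 300 W for 30 days.
print(f"{estimate_training_emissions_kg(1000, 300, 24 * 30):,.0f} kg CO2e")
```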

Relocate, Rightsize, and Re-Architect

While the 4Ms address AI energy use and carbon emissions from a technical standpoint, best practices for sustainable AI can be more broadly expressed as the three R’s: relocate, rightsize, and re-architect.

Relocate: Not all energy is created equal. We can mitigate the carbon emissions associated with AI’s energy consumption by transitioning to renewable energy sources such as solar or wind power. In the past 10 years, the cost of power from solar and wind has decreased by 89% and 70%, respectively, and it is now less expensive than alternatives using fossil fuels like coal and gas. Although the price of wind and solar has dropped dramatically, the biggest constraint remains access to round-the-clock renewable energy. Placing computing workloads in Quebec, Canada, where electricity comes almost entirely from renewable sources and the average carbon intensity is 32 grams of CO2 per kilowatt-hour, can result in a sixteenfold reduction in carbon emissions compared with the U.S. average of 519 grams per kilowatt-hour.16 Some cloud vendors use carbon offsets to substantiate their net-zero claims, but this is not the same as running on carbon-free energy; it simply pushes the problem onto someone else.
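The arithmetic behind that sixteenfold figure is straightforward, as the comparison below shows; the 100,000-kilowatt-hour workload is a made-up number, while the carbon intensities are the ones cited above.

```python
# Emissions for the same hypothetical workload on two grids, using the
# carbon intensities cited above (grams of CO2 per kilowatt-hour).
workload_kwh = 100_000  # made-up annual compute energy
carbon_intensity = {"Quebec": 32, "U.S. average": 519}

for region, g_per_kwh in carbon_intensity.items():
    tons = workload_kwh * g_per_kwh / 1_000_000
    print(f"{region}: {tons:.1f} metric tons of CO2")

# 519 / 32 is roughly 16, hence the sixteenfold reduction cited above.
```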

Moving from on-premises to cloud-based computing can reduce energy use and emissions by 1.4x to 2x if workloads are well architected.17 Cloud-based data centers are custom-designed warehouses built for energy efficiency, and the flexibility of the cloud allows considerable freedom in choosing a workload’s location. Compare the PUE (power usage effectiveness) values of on-premises and cloud-based data centers: The lower the value, the more efficient the facility. One can also choose a data center that runs primarily on renewable energy sources.
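PUE is total facility energy divided by IT equipment energy, so for the same IT load, a lower PUE means less total energy. The sketch below assumes an on-premises PUE of 1.7 and a cloud PUE of 1.1, values that are broadly typical but not tied to any specific facility; the resulting gap of roughly 1.5x accounts for part of the 1.4x-to-2x savings cited above.

```python
# For the same IT load, total facility energy = IT energy x PUE.
# The load and both PUE values are assumptions for illustration.
it_load_kwh = 1_000_000  # hypothetical annual IT equipment energy

for site, pue in {"on-premises": 1.7, "cloud": 1.1}.items():
    print(f"{site}: {it_load_kwh * pue:,.0f} kWh total facility energy")

# 1.7 / 1.1 is roughly 1.5x less total energy for the same IT load.
```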

Rightsize: Companies often use more computing and storage resources than they need. They can reduce their carbon footprints by rightsizing their AI models and applications and by using adequate archiving procedures. Performance and energy efficiency can be increased by 2x to 5x by using processors and systems designed for machine learning training instead of general-purpose servers that are not optimized for AI workloads.18 Optimization involves striking the right balance among scope, model size, model quality, and efficient, sustainable resource use. Graphics processing unit manufacturers offer ways to limit how much power a GPU is allowed to draw, which can reduce energy consumption in exchange for slower performance, a trade-off that might be acceptable in many circumstances.19 Another strategy to consider is time shifting: performing demanding workloads, such as training runs, at times of day when the grid’s carbon intensity tends to be lower.
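Carbon-aware time shifting can be as simple as scheduling a deferrable job into the lowest-carbon window of an hourly forecast, as the sketch below illustrates. The forecast values are invented; in practice they would come from the grid operator or a carbon-intensity data provider.

```python
# A minimal sketch of carbon-aware time shifting: given an hourly forecast of
# grid carbon intensity (gCO2/kWh), start a deferrable job in the window with
# the lowest average intensity. The forecast values below are invented.
forecast = [520, 480, 450, 410, 380, 350, 330, 360,   # hours 0-7
            420, 470, 510, 530, 540, 520, 490, 460,   # hours 8-15
            430, 400, 370, 340, 320, 330, 380, 450]   # hours 16-23

def best_start_hour(hourly_intensity: list[int], job_hours: int) -> int:
    windows = [
        (sum(hourly_intensity[h:h + job_hours]) / job_hours, h)
        for h in range(len(hourly_intensity) - job_hours + 1)
    ]
    return min(windows)[1]  # hour at which the average intensity is lowest

print(best_start_hour(forecast, job_hours=4))  # prints 18 for these values
```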

Re-architect: Building a well-functioning AI model requires a robust software and hardware architecture designed for scaling and fine-tuning the model while maintaining low-latency response times. Choosing an effective machine learning model architecture, such as a sparse model, can improve machine learning quality while decreasing computation by 3x to 10x.20 Once an AI model has reached production, managing technical debt from a performance, security, and end-user experience perspective is crucial. Mismanaging or ignoring technical debt in favor of functional improvements can cause it to accumulate quickly and pose serious technology risks, including sluggish performance, poor-quality outputs, unanticipated downtime, data loss, and even security breaches. Not every application or AI model can be redesigned, but when the chance arises, it’s critical to investigate more efficient machine learning model architectures that deliver better quality while reducing computation.

Other AI Sustainability Practices

In addition to the three R’s, AI leaders must pay attention to the following as avenues to improve sustainability.

Data management: Digital data production is accelerating quickly. In 2022, the world generated an estimated 97 zettabytes, or 97 trillion gigabytes, of data. That figure might almost double, to 181 zettabytes, by 2025.21 Most of this data is generated for one-time use and never touched again, yet it is saved on servers that take up space and consume a lot of electricity. Responsible data management practices that reduce the amount of needlessly saved “dark data” are therefore essential for sustainable AI development and deployment. Larger models do not necessarily equate to better models, and their performance can deteriorate over time. Energy consumption and environmental impact can be minimized by relocating data storage and processing to data centers that use more energy-efficient cooling and are powered by renewable energy, and by implementing procedures for data compression, deduplication, and the archiving of dormant data.
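As one concrete example of the data hygiene described above, the sketch below finds exact duplicate files by hashing their contents so that redundant copies can be archived or deleted; the directory path is hypothetical.

```python
# A minimal sketch of one data-hygiene step: finding exact duplicate files by
# content hash so redundant copies can be archived or deleted.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

duplicates = find_duplicates("/data/training-sets")  # hypothetical directory
print(f"{len(duplicates)} sets of duplicate files found")
```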

Education and awareness: AI leaders can promote environmentally responsible AI practices by educating employees, partners, customers, and the public about the environmental impacts of AI. These impacts are not widely understood, and the easy availability of tools like ChatGPT can make using them seem equivalent to querying a search engine, even though a single ChatGPT query can generate 100 times more carbon than a regular Google search.22 The more that organizations and decision makers are attuned to the environmental consequences of AI, the more they will seek out solutions with lower environmental impacts and pressure AI providers to adopt more sustainable practices.

A single ChatGPT query can generate 100 times more carbon than a regular Google search.

Compliance: Leaders in AI will need to keep an eye on emerging regulations and best practices around energy efficiency and e-waste management, and on their implications for technology investments. The U.S.-based nonprofit trade association SustainableIT.org has released the first-ever standards tailored to IT’s impact on business sustainability. Both general and AI-focused legislation is emerging, such as the European Union’s Corporate Sustainability Directive and its Artificial Intelligence Act, which is expected to be finalized and go into effect in 2024. In the United States, the Securities and Exchange Commission’s Carbon Disclosure Rule is pending, and California recently passed two laws that will require companies to file annual public reports disclosing their direct, indirect, and supply chain greenhouse gas emissions and to have those reports verified by an independent, experienced third-party provider.


The widespread adoption of generative AI models comes with an urgent need for all players in the industry, including managers and users of AI, to take greater responsibility for the environmental and social impacts of this promising technology.

Following the sustainable AI practices outlined in this article can help build a more sustainable AI ecosystem. By finding ways to minimize the energy and natural resources consumed by our AI development and deployment processes and drawing more attention to sustainability issues in discussions about AI, we can harness the power of this technology while minimizing its negative impact on our planet and society. 


References

1. “AI and Compute,” OpenAI, May 16, 2018, https://openai.com.

2. “Data Centers and Data Transmission Networks,” International Energy Agency, accessed Oct. 16, 2023, www.iea.org; and D. Patterson, J. Gonzalez, Q. Le, et al., “Carbon Emissions and Large Neural Network Training,” arXiv, April 23, 2021, https://arxiv.org.

3. O. Rosane, “This Year’s E-Waste to Outweigh Great Wall of China,” World Economic Forum, Oct. 18, 2021, www.weforum.org.

4. R. Cho, “AI’s Growing Carbon Footprint,” State of the Planet, June 9, 2023, https://news.climate.columbia.edu.

5. Patterson et al., “Carbon Emissions and Large Neural Network Training.”

6. M. Schreiner, “GPT-4 Architecture, Datasets, Costs and More Leaked,” The Decoder, July 11, 2023, https://the-decoder.com.

7. P. Schmid, O. Sanseviero, P. Cuenca, et al., “Spread Your Wings: Falcon 180B Is Here,” Hugging Face, Sept. 6, 2023, https://huggingface.co.

8. This figure is based on the author’s calculations, given reasonable assumptions regarding GPU energy demands and energy mix.

9. M. O’Brien and H. Fingerhut, “Artificial Intelligence Technology Behind ChatGPT Was Built in Iowa — With a Lot of Water,” Associated Press, Sept. 9, 2023, https://apnews.com.

10. P. Dhar, “The Carbon Impact of Artificial Intelligence,” Nature Machine Intelligence 2, no. 8 (August 2020): 423-425.

11. “A New Circular Vision for Electronics: Time for a Global Reboot,” PDF file (Geneva: World Economic Forum, January 2019), www.weforum.org.

12. E. Mehlum, D. Hischier, and M. Caine, “This Is How AI Will Accelerate the Energy Transition,” World Economic Forum, Sept. 1, 2021, www.weforum.org.

13. M. Javaid, A. Haleem, I. Haleem Khan, et al., “Understanding the Potential Applications of Artificial Intelligence in Agriculture Sector,” Advanced Agrochem 2, no. 1 (March 2023): 15-30.

14. Cho, “AI’s Growing Carbon Footprint”; and N. Sundberg, “Sustainable IT Playbook for Technology Leaders” (Birmingham, U.K.: Packt Publishing, 2022).

15. D. Patterson, J. Gonzalez, U. Hölzle, et al., “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink,” Computer 55, no. 7 (July 2022): 18-28.

16. Sundberg, “Sustainable IT Playbook for Technology Leaders.”

17. Patterson et al., “The Carbon Footprint.”

18. Ibid.

19. K. Foy, “AI Models Are Devouring Energy. Tools to Reduce Consumption Are Here, if Data Centers Will Adopt,” MIT Lincoln Laboratory, Sept. 22, 2023, www.ll.mit.edu.

20. Patterson et al., “The Carbon Footprint.”

21. T. Jackson and I.R. Hodgkinson, “What Is ‘Dark Data’ and How Is It Adding to All of Our Carbon Footprints?” World Economic Forum, Oct. 5, 2022, www.weforum.org.

22. M. van Rijmenam, “Building a Greener Future: The Importance of Sustainable AI,” The Digital Speaker, Feb. 23, 2023, www.thedigitalspeaker.com.
