7 January 2025, updated 16 January 2025
AI is here to stay. And AI is famously compute-intensive, which means it’s energy-intensive, which means its carbon emissions are a worry. I think the emissions are likely to come from initial training of the model, running the model (inference), data centre operations (mostly energy use), and lifecycle emissions associated with hardware production and disposal.
What’s being done to measure and reduce the emissions?
Stories and sources to read
Michael Liebreich’s Generative AI — The Power and the Glory - Dec24 is a great read:
- “According to EPRI, a single ChatGPT query requires around 2.9 watt-hours, compared to just 0.3 watt-hours for a Google search, driving a potential order of magnitude more power demand. Even inference data centers will need to be 100MW or above. […] A single rack of 72 Blackwell GPUs, along with its balance of system, will draw up to 120kW — as much as 100 US or 300 European homes.”
- “For the US, I expect data-center capacity will somewhat more than double by 2030, adding around 30GW, and the rest of the world will add no more than 15GW”
- and “This need not be bad for the climate. After all, if AI helps bring forward the electrification of heating, transport and industry by a single year, that would more than offset any negative climate impact from its own relatively limited power demand”
- “While all four hyperscalers say they remain committed to their net zero targets, the AI boom has made achieving those targets much harder. Their power use has more than doubled since 2020. Google has seen its carbon emissions increase by 48% since 2019 and Microsoft by 29% since 2020.”
- Nuclear: “The tech sector’s optimism bias is perfectly illustrated by the Institute for Progress (IFP), a non-partisan think tank focused on innovation policy, which claims that light-water SMRs can be built in six years at a first-of-a-kind (FOAK) power cost of $109/MWh, and an nth-of-a-kind (NOAK) power cost of $66/MWh […] I would be highly skeptical of any claim for a FOAK SMR under $180/MWh or a NOAK under $120/MWh before subsidies”
- hybrid supply: “I expect the tech titans to learn the same lesson as utilities have learned: relying on a purely fossil-based power supply will turn out more expensive than one which hybridizes cheap renewables and batteries with a little gas. It turns out there is a reason why 91% of all new power capacity added worldwide in 2023 was wind and solar, with just 6% gas or coal, and 3% nuclear.”
- utility/tech partnership: “It makes no sense for a data center and the local utility to invest separately in emergency backup resources. The data-center owner may want local backup to be sure it can ride through grid outages; the utility may want central backup to ride through periods of low wind or solar provision, or may need to add capacity to meet increasing demand. In 2016, Microsoft struck a deal with Black Hills Energy (formerly Cheyenne Light, Fuel and Power Co.) which gave it lower cost power in exchange for using its backup generators as a power resource for the grid as needed, eliminating the need for Black Hills Energy to build an additional plant. This sort of deal needs to become the norm, not the exception.”
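To get a feel for what the per-query figures above imply at scale, here is a quick back-of-envelope sketch. The per-query watt-hour numbers are EPRI’s as quoted by Liebreich; the 1 billion queries/day volume is an illustrative assumption, not a sourced figure.

```python
# Back-of-envelope: per-query energy figures quoted from the Liebreich piece.
CHATGPT_WH_PER_QUERY = 2.9   # EPRI estimate, watt-hours per ChatGPT query
GOOGLE_WH_PER_QUERY = 0.3    # EPRI estimate, watt-hours per Google search

def average_power_mw(queries_per_day: float, wh_per_query: float) -> float:
    """Continuous average power (MW) implied by a daily query volume."""
    wh_per_day = queries_per_day * wh_per_query
    return wh_per_day / 24 / 1e6  # Wh/day -> average watts -> megawatts

# Hypothetical volume: 1 billion ChatGPT-style queries per day.
print(round(average_power_mw(1e9, CHATGPT_WH_PER_QUERY), 1))  # ~120.8 MW
print(round(average_power_mw(1e9, GOOGLE_WH_PER_QUERY), 1))   # ~12.5 MW
```

That gap is the roughly 10x “order of magnitude more power demand” in the quote, and the ChatGPT figure is comfortably above Liebreich’s 100MW inference data centre threshold.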
Brian Potter’s energy cheat sheet - Dec24 looks like useful background reading.
AI’s emissions are about to skyrocket even further | MIT Technology Review - Dec24
- “For the 12 months ending August 2024, data centers were responsible for 105 million metric tons of CO2, accounting for 2.18% of national emissions (for comparison, domestic commercial airlines are responsible for about 131 million metric tons). About 4.59% of all the energy used in the US goes toward data centers, a figure that’s doubled since 2018.”
- Primarily about energy use in data centres; links to Environmental Burden of United States Data Centers, which looks useful.
- NB: not all data centre use is AI, and data centres in the US are often located in parts of the country with dirtier electricity generation.
Carbon Emissions in the Tailpipe of Generative AI - Jun24
- “We argue that the field must reframe the scope of machine learning research and development to include carbon and other resource considerations across the lifecycle and supply chain, rather than setting these aside or allowing them to remain on the field’s margins.”
Notes from my own emissions measurement - Feb24:
- “I can’t find any emission numbers published by OpenAI for ChatGPT :( . Many have made estimates, eg this, which estimates 24,860kgCO2e per day for 13m users with 5 queries/day in Feb 2023, or this which estimated electricity use at 77,160kWh per day. Both of those numbers are for the whole system - they don’t reflect your organisation’s use of an LLM. If ChatGPT has 13m daily users, then my estimated use is 24.86t/13m users * working days/5 because I use it about once a week. (For an LLM that does measure carbon, see Hugging Face BLOOM.)”
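The per-user share in my note above can be made explicit. All inputs are the Feb 2023 third-party estimates quoted there (24,860kgCO2e/day, 13m daily users at 5 queries/day); this is a whole-system average, not a marginal per-query figure.

```python
# Rough per-query share of the Feb 2023 system-wide ChatGPT estimate.
TOTAL_KG_PER_DAY = 24_860           # estimated emissions, kg CO2e/day
DAILY_USERS = 13_000_000
QUERIES_PER_USER_PER_DAY = 5

grams_per_query = TOTAL_KG_PER_DAY * 1000 / (DAILY_USERS * QUERIES_PER_USER_PER_DAY)
print(round(grams_per_query, 2))  # ~0.38 g CO2e per query

# At roughly one query a week (my usage), my share is ~0.38 g CO2e/week.
```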
AI’s Growing Carbon Footprint — State of the Planet - Jun23
AI’s carbon footprint is bigger than you think | MIT Technology Review - Dec23
A Computer Scientist Breaks Down Generative AI’s Hefty Carbon Footprint | Scientific American - May23
Nature: The carbon impact of artificial intelligence - Aug20
Various things to look at here:
Google
Microsoft
OpenAI
- OpenAI - doesn’t disclose emissions but if you ask ChatGPT it will say “A single [text generation] query to ChatGPT might emit 1–5 grams of CO₂, depending on data center efficiency and energy sources. For perspective, sending an email with a large attachment generates about 50 grams of CO₂”
- and “As of now, OpenAI has not publicly released a detailed carbon emissions report or a specific carbon reduction plan. […] training GPT-3 was estimated to emit approximately 552 metric tons of CO₂, equivalent to the annual emissions of 123 gasoline-powered passenger vehicles.” I’d love to see the detail on those numbers.
- And “Each image request can consume 10–50 times more energy than a typical text query, depending on the image resolution and model architecture […] Emissions for generating a short video (a few seconds) could be hundreds of times higher than a text response […] Generating a minute of high-quality audio may produce emissions roughly 10–20 times greater than a text query”
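Multiplying those claims together gives rough per-modality ranges. Every input here is ChatGPT’s own self-reported, unverified estimate (1–5g per text query, and the multipliers quoted above), so treat the output as illustrative only.

```python
# Scale ChatGPT's self-reported per-text-query range by its claimed
# modality multipliers. All figures are unverified self-estimates.
TEXT_G_LOW, TEXT_G_HIGH = 1, 5   # g CO2e per text query (claimed)
MULTIPLIERS = {
    "image query": (10, 50),         # x a text query (claimed)
    "audio, per minute": (10, 20),   # x a text query (claimed)
}

for modality, (lo, hi) in MULTIPLIERS.items():
    print(f"{modality}: {TEXT_G_LOW * lo}-{TEXT_G_HIGH * hi} g CO2e")
# image query: 10-250 g CO2e
# audio, per minute: 10-100 g CO2e
```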
Other big tech
- Meta, Apple, Amazon, Nvidia etc - to find
Approaches to documenting existing emissions of LLMs
Other/questions
- Will carbon-aware software let LLMs etc be trained during periods of lower-emission energy?
- If the AI action moves to inference, does that change the story? If the inference is at the edge, will we see consumer hardware cos update their Product Environmental Reports to include the emissions of expected AI workload?
- Will AI drive adoption of renewables? If you squint, big tech are pushing renewable energy and doing nuclear energy deals because they have rapidly increasing data centre energy use, some of which is AI. But you do have to squint, because eg Google missed its emissions targets in 2024 thanks to AI/data centre investment and use.
- Water use is strongly correlated with energy use in AI, eg here
- More energy-efficient hardware/chips might reduce future emissions, eg Vaire and “reversible computing”.