
Do the AIs know their carbon emissions? No.

8 May 2025

Can the big AIs tell you what their carbon emissions are?

On the whole, not really: no-one discloses emissions, so while some AIs will try to make an estimate, the ranges are big.

I asked some LLMs these questions:

  • Hi [LLM name], what model are you?
  • Has [organisation that built the LLM] worked out a measure of carbon emissions per token or chat? If so, are you able to report back to me something like “This chat cost X tokens, which means emissions of Y kgCO2e”? (A sketch of what such a report might look like follows this list.)
  • Can you tell me what the carbon emissions of [LLM name] are? Consider initial training of your model, running the model, data centre and energy use, and lifecycle emissions associated with hardware production and disposal.
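
For the avoidance of doubt, here’s a minimal Python sketch of the report that second question asks for. The emissions factor is my placeholder assumption - no AI company publishes one; the ~0.09 gCO2e per token figure is a third-party GPT-4 inference estimate that comes up later in this post.

```python
# A minimal sketch of the report asked for above. The emissions factor is an
# assumption: no AI company publishes one, so this borrows the third-party
# estimate of ~0.09 gCO2e per token for GPT-4 inference quoted later on.
ASSUMED_G_CO2E_PER_TOKEN = 0.09  # assumed, not an official figure

def chat_emissions_report(tokens: int) -> str:
    grams = tokens * ASSUMED_G_CO2E_PER_TOKEN
    return (f"This chat cost {tokens} tokens, which means emissions of "
            f"{grams / 1000:.5f} kgCO2e ({grams:.1f} gCO2e).")

print(chat_emissions_report(500))
# -> This chat cost 500 tokens, which means emissions of 0.04500 kgCO2e (45.0 gCO2e).
```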

The headlines:

  • No AI company has publicly disclosed emissions, so any emissions numbers are derived from third-party estimates rather than the AI companies’ actual measurements. (Perhaps disclosure is an opportunity for a bold AI co?)
  • ChatGPT and Grok will try to estimate a full-ish answer on emissions from publicly available third-party data. The others offer some citations and guesstimates, or say “You’re on your own, pal.”
  • It looks as if the emissions of AI are low at the personal level, albeit growing rapidly at the infrastructural level. More reading on that here and here.
  • Google’s paper says that 70+% of a TPU’s lifetime emissions come from operational usage (aka inference). As AI use grows, both training and inference (using AI) energy use will grow, but inference more quickly, so we can expect that proportion of the lifecycle mix to grow. Conclusion: carbon-free electricity may be the most important factor in controlling AI emissions? (A rough sketch after this list shows why grid intensity dominates.)
  • So emissions may be among the smaller problems for AI. Bigger problems: trust, ethical considerations like bias and IP theft, mis/dis-information, AI entrenching power structures that aren’t fair, and whether AI eats your brain or job.
  • Caveats: I have no idea if I’m comparing the right models or not - the names of these LLMs are so confusing. I might have got different answers with different prompts. AIs are improving quickly, so some of this will be out of date already. Chat AI is obviously a subset of AI.
  • More: working notes on emissions and AI.
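
On that carbon-free electricity point: operational emissions are just energy use multiplied by grid carbon intensity, so the grid mix swamps everything else. A rough sketch, using the per-query energy range quoted later for ChatGPT and the two grid-intensity figures DeepSeek cites - all third-party estimates, not measurements:

```python
# Operational emissions = energy per query x grid carbon intensity.
# Inputs are third-party estimates quoted later in this post, not measurements.
kwh_per_query = (0.0017, 0.0026)   # estimated energy per ChatGPT query
fossil_grid = 475                  # gCO2e/kWh, quoted global average
clean_grid = 12                    # gCO2e/kWh, quoted for renewable-heavy cloud regions

for kwh in kwh_per_query:
    print(f"{kwh} kWh/query: {kwh * fossil_grid:.2f} g on a fossil-heavy grid, "
          f"{kwh * clean_grid:.3f} g on a clean one")
# Same query, roughly 40x fewer grams on carbon-free(ish) electricity.
```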

Anyway, here’s the detail for the AI emissions trufans:

OpenAI ChatGPT

  • “You’re currently chatting with ChatGPT using the GPT-4-turbo model. It’s a variant of GPT-4 that’s optimized to be faster and more efficient.” on 8 May 2025 (so not the newest 4o).
  • Emissions per chat: “estimates suggest that each ChatGPT query emits approximately 2.5 to 5 grams of CO₂ equivalent (gCO₂e), with a commonly cited figure being 4.32 gCO₂e per query […] The energy required to process a single ChatGPT query is estimated to be between 0.0017 and 0.0026 kWh”
  • Its emissions: “OpenAI has not publicly released an official measure of carbon emissions per token or per chat.” And (interestingly, here ChatGPT does that thing where it gives you two answers to choose between, which is a bit confusing tbh; I’m using the best bits of both here:)
    • Model training: “While specific figures for GPT-4 are not publicly disclosed, estimates suggest it required around 1,750 MWh for training, indicating a higher energy demand due to its increased complexity” and “552 metric tons of CO₂, equivalent to the annual emissions of 123 gasoline-powered cars or about 300 round-trip flights between New York and San Francisco”
    • Inference/usage: “Assuming ChatGPT handles around 10 million queries daily [and 4.32 grams of CO₂ per query], the total daily emissions could be approximately 43.2 metric tons of CO₂.” (This arithmetic, and the GPU figure in the next bullet, are cross-checked in the sketch after this list.)
    • Data centre and energy: “around 30,000 Nvidia GPUs, each emitting about 1.44 kg of CO₂ per day, leading to an estimated 43,200 kg of CO₂ daily for the entire operation […] a typical ChatGPT session (20-50 questions) can use up to 0.5 liters of water” and “Globally, data centers consumed about 460 terawatt-hours (TWh) of electricity in 2022” (nb that’s all data centres: broader than AI)
    • Hardware lifecycle emissions: “AI is projected to add between 1.2 million to 5 million metric tons of e-waste by 2030, accounting for up to 12% of global e-waste”
  • Elsewhere: I couldn’t find anything published by OpenAI on carbon emissions. In 2021, they hadn’t calculated CO2 per request.
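
The two estimate routes ChatGPT quoted - per-query and per-GPU - happen to land on the same daily total, which is easy to check. A sketch; every input is a third-party estimate quoted above, not an OpenAI number:

```python
# Cross-checking the third-party estimates ChatGPT quoted above.
# (Estimated inputs, not OpenAI's own numbers.)
queries_per_day = 10_000_000              # assumed daily query volume
g_per_query = 4.32                        # commonly cited gCO2e per query
print(queries_per_day * g_per_query / 1e6, "tonnes CO2e/day")  # 43.2

gpus = 30_000                             # estimated GPU fleet
kg_per_gpu_per_day = 1.44                 # estimated kgCO2e per GPU per day
print(gpus * kg_per_gpu_per_day / 1000, "tonnes CO2e/day")     # 43.2
# Both routes land on the same ~43 tonnes/day, probably because the GPU
# figure was derived from the per-query one (or vice versa).
```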

Google Gemini

  • “I am a Gemini Deep Research agent. I’m powered by Gemini 2.0 Flash Thinking”, on 8 May 2025.
    • I had to ask it several times. Eventually it would quest across the internet, researching 250+ websites.
    • “[In] one illustrative case study that examined the deployment of the GPT-4 model on Google Cloud infrastructure located in Korea, researchers estimated that the carbon emissions for the inference stage of processing were approximately 0.09 grams of CO2 per token”
    • Its emissions: no single answer, but it generated a wordy report and some tables of energy use, CO2e per query, watt-hours per query etc - see the transcript, though they’re gathered from many places online so may not allow useful comparisons.
    • It did point me at a useful Google paper, which claims to be “the first published study of a cradle-to-grave analysis of the carbon footprint of AI hardware, including the first publication of manufacturing emissions of an AI accelerator” - Designing sustainable AI: A deep dive into TPU efficiency and lifecycle emissions, Feb 2025 (more on this a few bullet points below)
  • Gemini 2.5 Pro Experimental, on 14 Apr 2025.
    • Doesn’t know its own emissions: “Google does not currently offer real-time, per-chat carbon footprint reporting directly within this interface”, and goes on to explain why it’s hard to calculate.
    • “Google has stated internally that training accounts for roughly 40% of their generative AI energy consumption”
    • “Google’s data centers are highly efficient (average Power Usage Effectiveness - PUE - of 1.10 in 2023, compared to an industry average around 1.58)”
    • Google’s “detailed Life Cycle Assessment (LCA) study specifically for its TPU hardware […] found that operational electricity use currently accounts for the vast majority (over 70%) of a TPU’s lifetime emissions.”
  • Elsewhere:

Microsoft Copilot

  • Doesn’t know what model it is, or its own emissions: “I don’t have details on the specific architecture or training data” (I asked for “quick response” rather than “Think deeper”, on 6 May 2025)
  • Emissions per chat: “I don’t have access to a direct emissions calculation for our conversation, but if you’re interested in learning more about Microsoft’s sustainability efforts, you can check out their initiatives here.”
  • Its emissions: “I don’t have exact carbon emissions data for my operations”
  • If you ask Copilot to “Think deeper” you get more text but no answers: “a detailed, publicly available breakdown that sums up training, running the model, data center energy use, and hardware lifecycle impacts into one ‘X tokens = Y kgCO₂e’ metric does not exist.”
  • Elsewhere:

Anthropic Claude

  • Claude 3.7 Sonnet, on 7 May 2025 - a slightly better answer than last time I asked Sonnet, on 14 Apr 2025.
  • Emissions per chat: “I couldn’t find specific information about Anthropic having worked out a measure of carbon emissions per token or chat for Claude. […] Meta’s 65B parameter LLaMa model (which might be roughly half the compute of a Claude model) used approximately 449 MWh of power during training, equivalent to the annual power consumption of 42 U.S. households.”
  • Its emissions: “DitchCarbon, which evaluates companies’ sustainability performance, gave Anthropic a score of just 18/100 […] Most companies require significantly more energy for serving a model (inference) than for training. For major AI companies, it’s often a 90/10 split,” with 90% of energy used for inference.
  • Elsewhere:
    • If you go digging, you find this in a 2024(?) pdf about the Opus/Sonnet/Haiku family of LLMs: “We offset our emissions (including from our cloud computing usage) and work with cloud providers that prioritize renewable energy and carbon neutrality. Anthropic works to fully offset our operational carbon emissions each year, partnering with external experts to conduct a rigorous analysis of our company-wide carbon footprint. Once measured, we invest in verified carbon credits to fully offset our annual footprint. Our credits directly fund emissions reduction projects. Our goal is to maintain net zero climate impact on an annual basis through such initiatives and offsets.”
    • And if you ask Anthropic support, they suggest using the API to get token counts, and looking at the website in case any emissions metrics are published - roughly the approach sketched after this section. (This is a bit disappointing: Anthropic are pretty good on transparency - they publish loads of stuff on societal and economic impact, and appear to be thoughtful on governance and risks - but why so little on climate impact?)
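
Here’s roughly what that support suggestion amounts to - a sketch using the Anthropic Python SDK, taking token counts from the response’s usage field and applying an emissions factor of your own choosing. The factor below is an assumption (Anthropic publishes none); I’ve borrowed the third-party ~0.09 gCO2e/token GPT-4 estimate from earlier:

```python
# Sketch of Anthropic support's suggestion: real token counts from the API,
# your own (assumed) emissions factor on top. Anthropic publishes no factor,
# so this borrows the third-party ~0.09 gCO2e/token GPT-4 estimate above.
import anthropic

ASSUMED_G_CO2E_PER_TOKEN = 0.09  # assumption, not an Anthropic figure

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
message = client.messages.create(
    model="claude-3-7-sonnet-20250219",
    max_tokens=512,
    messages=[{"role": "user", "content": "Hi Claude, what model are you?"}],
)
tokens = message.usage.input_tokens + message.usage.output_tokens
grams = tokens * ASSUMED_G_CO2E_PER_TOKEN
print(f"This chat cost {tokens} tokens, which means emissions of "
      f"~{grams / 1000:.5f} kgCO2e (assumed factor).")
```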

X.ai Grok

  • “I’m Grok 3, built by xAI” on 6 May 2025
  • Emissions per chat: “Unlike some estimates for models like ChatGPT, which suggest 1.5–9.5 grams of CO2 per query or ~0.09 grams per token based on inference energy use, xAI doesn’t provide detailed emissions data tied to token counts or individual chats. I can give a rough estimate based on industry parallels. For a model like Grok 3, assuming similar energy use to GPT-3-sized models (~0.5 watt-hours per query, 0.459 kg CO2/kWh in a typical data center), a single chat with ~200 tokens might emit around 0.19 grams of CO2. This chat, let’s say 100 tokens for simplicity, would then emit roughly 0.095 grams of CO2 (or 0.000095 kgCO2e)” (Grok’s arithmetic is reproduced in the sketch after this section)
  • Its emissions: “Training emissions likely range from 50,000–250,000 metric tons of CO2e, with a midpoint of ~150,000 metric tons, heavily dependent on training duration and energy mix […] Operational emissions are ~130–2,000 metric tons of CO2e/year, depending on usage, with a likely range of 500–1,500 metric tons for 2025 […] Data center emissions for Grok 3 are ~200,000–350,000 metric tons of CO2e/year, with a midpoint of ~260,000 metric tons, assuming partial allocation […] Lifecycle hardware emissions are ~30,000–40,000 metric tons of CO2e/year, with a one-time manufacturing spike of 150,000–200,000 metric tons […] Total: ~1,630,000 metric tons CO2e over 5 years. […] This chat cost ~825 tokens, resulting in emissions of ~0.00078 kg CO2e (0.78 grams CO2e).” - loads more detail in the transcript.
  • Elsewhere: Can’t find a sustainability report.
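
Grok’s back-of-envelope is easy to reproduce. A sketch; every input is one of Grok’s stated assumptions, not an xAI measurement:

```python
# Reproducing Grok's back-of-envelope from the quote above.
# All inputs are Grok's stated assumptions, not xAI measurements.
wh_per_query = 0.5            # assumed energy per query (GPT-3-sized model)
kg_co2_per_kwh = 0.459        # assumed typical data-centre grid intensity
g_per_query = (wh_per_query / 1000) * kg_co2_per_kwh * 1000
print(f"~{g_per_query:.2f} gCO2e per query")             # ~0.23 g

g_per_token = 0.19 / 200      # per-token rate implied by Grok's 200-token chat
for tokens in (100, 825):
    print(f"{tokens} tokens -> ~{tokens * g_per_token:.3f} gCO2e")
# 100 -> ~0.095 g and 825 -> ~0.784 g, matching Grok's own figures (though its
# per-query and per-token numbers don't quite line up, which is the point:
# these are guesses, not measurements).
```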

Facebook Llama

  • “I’m powered by Llama 4.”, on 6 May 2025
  • Doesn’t know its own emissions: “I don’t have real-time access to detailed emissions data for our conversations.”
  • Emissions per chat: “A study by the University of Massachusetts Amherst estimated that training a single large language model can emit up to 284,000 kg CO2e.” (nb that’s a training figure, not per chat. Claude has more of an opinion about Llama: “Meta’s 65B parameter LLaMa model (which might be roughly half the compute of a Claude model) used approximately 449 MWh of power during training, equivalent to the annual power consumption of 42 U.S. households.”)
  • Elsewhere: 2024 sustainability report for Meta.

Deepseek

  • “Hi! I’m DeepSeek-V3, an advanced AI language model developed by DeepSeek. My knowledge is up to date until July 2024”, on 7 May 2025. (The UI says “DeepThink (R1)”.)
  • Emissions per chat: “A hypothetical estimate for our chat could look like: Tokens processed: ~500 tokens. Estimated emissions: ~0.0008 kgCO₂e (if similar to GPT-3 efficiency)”
  • Its emissions: “DeepSeek has not publicly disclosed detailed carbon emission figures covering training, inference (running the model), data center operations, or hardware lifecycle impacts […] Training GPT-3 (175B parameters) emitted ~552 metric tons CO₂e (Strubell et al., 2019). Model size (DeepSeek-V3’s parameter count is undisclosed but likely comparable to GPT-3.5/4) […] Per query: ~0.001–0.01 kgCO₂e/1k tokens (based on ML CO₂ impact studies). Smaller models (e.g., 10B params) may be at the lower end. […] Google’s 2023 report: ~10–12 gCO₂e/kWh for their cloud regions (renewable-heavy). Global average: ~475 gCO₂e/kWh (fossil-dependent grids). […] Production: Manufacturing GPUs/TPUs is energy-intensive (~200–300 kgCO₂e per A100 GPU).”
  • Elsewhere: Can’t find a sustainability report.

Mistral Mixtral

  • “I am not aware of the details of my architecture.” on 7 May 2025
  • Doesn’t know its own emissions: “I was unable to find specific information about Mistral’s carbon emissions per token or chat.”
  • “The carbon footprint of AI models is a significant concern. For instance, ChatGPT produces more than 260,930 kilograms of CO2 each month, which is equivalent to 260 transatlantic flights from New York City to London […] While AI models like ChatGPT have a high carbon footprint, they can still be more efficient than human activities in some cases. For example, AI writing produces 130-1500 times less CO2 per page than a human author”
  • Elsewhere: Can’t find a sustainability report.

Phew, ok, hit publish.
