17 June 2025
Last week’s AI emissions news:
Very bad: xAI data centre methane emissions - regulatory loopholes let xAI run methane-burning generators to power their data centres. And they leak unburned methane, which traps far more heat than carbon dioxide (roughly 80 times more over a 20-year horizon). Also here.
Needs disclosure: Altman hints at some OpenAI inference numbers - “the average query uses about 0.34 watt-hours [and] about 0.000085 gallons of water” - but the internet is pretty skeptical (just one example). For a sense of what those per-query figures would mean at scale, see the back-of-envelope sketch after this list.
Good: Nvidia releases some information on its H100 server emissions - but bear in mind the product carbon footprint (PCF) is cradle-to-gate, so only part of the story: “This PCF summary intentionally excludes use-phase and end-of-life emissions due to the variability in those emissions based on customer usage”.
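A rough sketch of the scaling, purely for illustration: OpenAI hasn’t disclosed a query volume, so the one billion queries per day below is an assumption, not a reported number - only the per-query energy and water figures come from Altman’s post.

```python
# Back-of-envelope scaling of Altman's per-query figures.
ENERGY_WH_PER_QUERY = 0.34        # watt-hours per query, as quoted
WATER_GAL_PER_QUERY = 0.000085    # gallons of water per query, as quoted
QUERIES_PER_DAY = 1_000_000_000   # ASSUMPTION: illustrative volume, not disclosed

daily_energy_mwh = ENERGY_WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
daily_water_gal = WATER_GAL_PER_QUERY * QUERIES_PER_DAY

print(f"Energy: {daily_energy_mwh:,.0f} MWh/day")    # 340 MWh/day
print(f"Water:  {daily_water_gal:,.0f} gallons/day") # 85,000 gallons/day
```

Even under that made-up volume, the totals only tell you about inference - nothing about training, embodied hardware emissions, or where the electricity comes from, which is exactly the disclosure gap.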
This would all be a lot easier if AI platforms published their detailed lifecycle emissions numbers in the open. I think that the first platform brave enough to do this will get reputational points for transparency.