AI's environmental footprint is not invisible. Training GPT-3 consumed 1,287 megawatt-hours and produced 552 tonnes of CO2. A single ChatGPT query uses nearly ten times the electricity of a Google search. Data centers already account for 4.4% of all US electricity, a share that could triple by 2028. A November 2025 paper in Nature Sustainability projects that US AI deployments alone will consume between 731 and 1,125 million cubic meters of water annually through 2030, while emitting 24 to 44 million tonnes of CO2 equivalent per year. VU Amsterdam puts 2025 AI water use on par with all bottled water consumed globally in a single year.

The hardware story is worse, and almost nobody is telling it. Manufacturing one high-end GPU produces roughly 200 kg of CO2. Data centers deploy these in the thousands. AI-specific chips become obsolete in two to three years, half the lifespan of standard server hardware. Research in Nature Computational Science estimates generative AI will contribute between 1.2 and 5 million metric tonnes of e-waste by 2030. The world already generated 62 million metric tonnes in 2022, and only 22% was properly recycled. AI hardware, dense with rare earth elements and flagged for data security concerns, is harder to process than standard electronics. The infrastructure to handle it does not exist yet.

This piece is worth reading in full because it does not stop at carbon accounting. It traces the footprint from rare earth mining through manufacturing, deployment, water draw, and e-waste, building a complete ledger that most coverage ignores. The numbers compound in ways this summary cannot fully capture, and the geographic specifics of where water-hungry data centers are being built make the accountability question concrete, not abstract.

[READ ORIGINAL →]