World on Fire
A semi-regular reminder that there is a physical cost to AI
Last week, I spent the better part of a day running the same set of legal documents through five AI models, rerunning things that kept breaking, and making something like forty-five API calls when I'd meant to make fifteen. At some point I stopped and thought, not for the first time over the years: how much water did I just use?
The most-cited figure comes from researchers at UC Riverside: around 500 milliliters of water per 20 to 50 short queries. Mine weren't short. Scaling up by compute time puts my session somewhere between half a liter and a few liters of water, plus some amount of carbon I genuinely can't calculate, because Anthropic doesn't publish model-specific environmental data, Google doesn't for Gemini, and OpenAI doesn't in any form granular enough to work from. The opacity is a choice. The companies running the infrastructure have decided that you don't need to know what your queries cost, because knowing would create a problem for them. Unfortunately for them, people are tired of not knowing.
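Here's the napkin math, for anyone who wants to check my work. The only sourced input is the UC Riverside figure; the multiplier for long legal documents is purely my guess, which is exactly the problem:

```python
# Back-of-envelope water estimate for one API session.
# Sourced input: UC Riverside's ~500 mL per 20-50 short queries.
# Everything else (especially the long-query multiplier) is my assumption.

ML_PER_BATCH = 500           # ~500 mL of water...
BATCH_RANGE = (20, 50)       # ...per 20-50 short queries
LONG_QUERY_MULTIPLIER = 3    # guess: one long legal document ~ 3 short queries

def session_water_liters(calls: int, multiplier: float = LONG_QUERY_MULTIPLIER):
    """Return a (low, high) water estimate in liters for a session of calls."""
    low = calls * multiplier * ML_PER_BATCH / BATCH_RANGE[1] / 1000
    high = calls * multiplier * ML_PER_BATCH / BATCH_RANGE[0] / 1000
    return low, high

low, high = session_water_liters(45)       # my ~45 calls
print(f"~{low:.1f} to {high:.1f} liters")  # ~1.4 to 3.4 liters
```

With a multiplier of 1 the low end drops to about half a liter, which is where my "half a liter to a few liters" range comes from. The real answer could be off in either direction, and nobody at any of these companies will tell me.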
The People
Last month, four companies approached Seattle City Light about building five large data centers in the city, with a combined maximum power demand of 369 megawatts, roughly a third of what Seattle uses on an average day. Seattle City Light is a public utility that's already told its customers to expect annual rate increases of 7 to 10 percent for the foreseeable future, partly to fund grid upgrades and partly because the region's hydropower supply is already maxed out, so utilities are supplementing on the open market, which largely means natural gas. City council members received 54,000 messages in a matter of days. Two developers pulled out, then three council members announced a one-year moratorium on new data center siting, backed by Mayor Wilson. People understood the audacity of what was being proposed: a massive new load on a grid they depend on, serving infrastructure they wouldn't benefit from, at a cost that would show up on their bill.
Wisconsin, where I'm from, is several chapters deeper into the same story. The state has six large data center projects currently underway or recently approved. In Port Washington, a city of about 13,000, a $15 billion campus is being built on 672 acres of what was farmland, to be operated by Oracle on behalf of OpenAI. Trucks have been carting away dirt around the clock to level the site. The mayor who approved it is now the target of a recall campaign.
In Beaver Dam, Wisconsin, Meta is building a hyperscale facility expected to consume six to eight times the power of the entire city. Democratic lawmakers have proposed a statewide construction moratorium, and residents from multiple cities have rallied at the Capitol. The Wisconsin Public Service Commission ruled this week, in the first decision of its kind in the Midwest, that large data centers must cover the full cost of any new generation infrastructure built to serve them. During the Meta-Alliant Energy hearing, the PSC chair said she didn't understand why achieving basic transparency had needed to be so difficult.
The data center buildout in Washington state is projected to require somewhere between two and four times Seattle's current electricity use by 2029, yet the state ranks dead last in the country for producing new renewable energy infrastructure. A county in eastern Washington recently approved a temporary natural gas plant specifically to power a new data center. In case we’ve forgotten, there is a huge physical cost to AI, and the beast is only getting hungrier. The cost that can’t be calculated at the individual level (my little experiments, your everyday queries) aggregates, physically and geographically, into places like Port Washington and Beaver Dam and Seattle, where people are being asked to absorb it without having been consulted and without receiving much in return.
The Money
The five biggest hyperscalers, Amazon, Microsoft, Google, Meta, and Oracle, are on track to spend over $600 billion on infrastructure in 2026, a 36% increase from 2025, with roughly 75% of that going to AI. That number is staggering enough on its own. The really weird (dumb) thing is that the primary customers for all this infrastructure are companies losing money at a similarly unprecedented scale. OpenAI projects spending $121 billion on compute in 2028 alone, and in that same year projects losses of $85 billion, roughly three-quarters of its anticipated revenue. The company doesn't expect to break even until after 2030, yet we're all extending it the grace of a trust fund kid who just hasn't figured life out yet.
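Sit with what those two projections imply. Here's a sketch of the arithmetic, using only the figures above; everything else follows from them:

```python
# What OpenAI's own 2028 projections imply, using only the two figures above.
compute_spend = 121e9    # projected 2028 compute spend
projected_loss = 85e9    # projected 2028 loss, stated as ~3/4 of revenue

implied_revenue = projected_loss / 0.75                  # ~ $113B
implied_total_costs = implied_revenue + projected_loss   # ~ $198B

print(f"implied revenue:     ${implied_revenue / 1e9:.0f}B")
print(f"implied total costs: ${implied_total_costs / 1e9:.0f}B")
# The compute bill alone ($121B) exceeds the implied revenue (~$113B).
```

The compute bill alone is bigger than all the revenue the company expects to take in that year.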
OpenAI has committed, per Sam Altman's own public statements, to as much as $1.4 trillion in infrastructure spending over the next eight years. The hyperscalers have raised at least $200 billion in AI-related debt in 2025 alone, almost certainly a massive undercount since many deals are private, and JPMorgan projects $300 billion in AI and data center debt deals annually for the next five years. Bank of America found that hyperscalers would need to spend 94% of their operating cash flow to fund their AI buildouts, which is why they're turning to debt markets at a volume and pace that led analysts at Morgan Stanley to project up to $1.5 trillion in new tech sector debt issuance in the coming years.
To understand what's actually happening here, it helps to look at where the money goes. As Ed Zitron has documented exhaustively in his newsletter: AI startups are losing money. The money they raise flows to the model companies, the Anthropics and OpenAIs, who are also losing money. That money flows in turn to the hyperscalers, Amazon, Google, Microsoft, who rent out the infrastructure and are, so far, generating revenue from doing so. But the hyperscalers are themselves borrowing money at historic rates to build the data centers that the unprofitable model companies rent to serve the unprofitable startups building on top of them. The rotten circularity of this isn't exactly hidden, but it rarely gets said plainly: an enormous proportion of AI "demand" is AI companies paying each other with capital raised on the promise of future demand that has not yet materialized. By design, this all has nothing to do with you and is none of your business.
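If that loop is hard to hold in your head, here's a toy version. Every dollar figure below is invented purely for illustration; only the shape of the flow mirrors the real thing:

```python
# Toy model of the circular flow described above. All figures are
# invented for illustration; only the structure mirrors reality.

vc_raised_by_startups = 10.0   # capital raised on future-demand promises
startups_pay_model_cos = 8.0   # API spend, booked as model-company "revenue"
model_cos_pay_clouds = 12.0    # compute spend, booked as hyperscaler "revenue"
debt_raised_by_clouds = 20.0   # borrowed to build the data centers
end_customer_revenue = 2.0     # what outside customers actually pay, by comparison

revenue_booked_inside_loop = startups_pay_model_cos + model_cos_pay_clouds
capital_propping_up_loop = vc_raised_by_startups + debt_raised_by_clouds

print(f"revenue booked inside the loop: ${revenue_booked_inside_loop}B")
print(f"of which actual end demand:     ${end_customer_revenue}B")
print(f"capital propping the loop up:   ${capital_propping_up_loop}B")
```

Swap in whatever numbers you like; as long as the capital dwarfs the end-customer line, the "revenue" at each layer is mostly other people's fundraising passing through.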
Has any technology in human history been scaled at this pace and at this cost on this thin a foundation of demonstrated demand? The railroad booms of the 19th century involved massive overbuilding and a wave of bankruptcies, but railroads moved physical goods and created immediate, legible economic value; the infrastructure, even after the companies went bust, remained useful. The dot-com buildout of the late 90s wasted enormous capital, but the fiber optic cables that were laid in the ground eventually carried the internet. The telecom bubble is perhaps the closest analogy: vast infrastructure built in anticipation of demand that didn't arrive on schedule, leaving highly leveraged companies stranded when debt markets closed. But even there, the product was clearly useful, the business model was understood, and the technology had an established track record. Generative AI has none of those things working cleanly in its favor. The product's value at the scale being bet on remains hotly contested. The business models are loss-making at every level of the stack. The companies building the infrastructure are doing so on junk-rated debt against demand projections that require the entire industry to continue to grow in synchrony, none of the major bets to fail, and costs to fall in ways they have not yet fallen.
Waterfall
What makes this extraordinary from an environmental standpoint is that the physical costs aren't hypothetical. The farmland in Port Washington is already gone. The gas plant in eastern Washington is being built. The water is being consumed in Iowa and Virginia and Arizona, in amounts that nobody will calculate for you, because why would they. These are irreversible costs being incurred right now in service of financial projections that analysts at the very institutions funding them describe with phrases like "you have to turn over all avenues to make this work." The communities absorbing those costs didn't get a vote on the projections, and they won't get a refund if OpenAI misses its post-2030 breakeven. The 54,000 people who wrote to the Seattle City Council weren't objecting to AI in the abstract; they were objecting to being asked to pay, out of their own power bills and without being consulted, for an industrial buildout predicated on economics that wouldn't survive a high school accounting class. As they should!
The opacity about environmental cost is, in this light, not incidental to the financial opacity. They're the same instinct. Don't let people see what this costs, per query or per annum, because if they could see it clearly, they'd ask whether it made sense, and that question is one the industry can’t answer without sweating. So, I ask: make it make sense.



In related news:
1 - I recently read that fully half of Google and Amazon's reported profits for last quarter are actually just the increased valuation of their shares in Anthropic. WHAT. https://fortune.com/2026/04/30/google-amazon-ai-profits-anthropic-stake-bubble-earnings-2026/
2 - Here in Aotearoa, Datagrid has proposed a massive data center that's expected to use the equivalent of 6 percent of the *entire country's* energy usage. There's a good amount of renewable energy here, to be fair, but still: holy mackerel. https://www.rnz.co.nz/news/science-and-technology/594416/nz-s-ai-data-centre-boom-who-benefits-from-the-build-out
I'd also like to recommend a book that deals with exactly these costs: Sustainable Content by Alisa Bonsignore. Highly recommended.