If AI Is the Deal, the Surcharges Are the Catch. (Time to Read the Fine Print)
by Kent O. Bhupathi
Michael Burry has a knack for walking into the party right when the music is loudest and asking where the fire exits are.
In 2005, that meant shorting the U.S. housing market while everyone else was busy securitizing granite countertops. Today, it means betting against the AI trade while the rest of us are delighting in auto-drafted emails and AI-polished pitch decks. On paper, he is “short AI.” In practice, his long game is quieter and more elemental: water rights, water-rich farmland, water utilities.
He is betting on scarcity against a story that pretends resources are infinite.
I am, uncomfortably, part of the audience for that bet. I am a heavy AI user. I hand questions, image prompts, and half-formed medical anxieties to these systems without thinking twice about what it costs to run them. I have spent plenty of time engaging with the labor side of AI’s impact. I have spent almost no time thinking about the electricity needed to power my queries or the water boiled off to keep the servers cool.
Burry’s latest move finally forced me to look. If the man who made his name spotting hidden risks is now lining up against the AI trade while quietly hoarding water, it may be time to admit that the AI story is as much about substations, transmission lines, cooling towers, and aquifers as it is about code and chips.
The core argument of this piece is simple: modern AI is built on top of scarce utilities, especially power and water. Its true social and economic costs may be rising faster than investors, users, and even regulators appreciate. That gap between story and physics is where bubble risk lives.
AI Is Not “Weightless”
We talk about AI as if it lives in the clouds, floating somewhere above the material world. In reality, it lives right here on Earth, in data centers: warehouses full of servers that burn electricity and shed heat all day.
U.S. data centers consumed about 183 terawatt-hours of electricity in 2024, a bit more than 4% of total U.S. demand, or enough to power roughly 17.4 million U.S. households for a full year. That number is projected to more than double to 426 TWh by 2030. A single AI-focused hyperscale data center already uses as much electricity as 100,000 U.S. households, and the largest facilities under construction could draw twenty times that, rivaling the load of 2 million homes.
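Those household equivalences are easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming an average U.S. household uses about 10,500 kWh per year; that per-household figure is my assumption, not a number from the sources below:

```python
# Back-of-envelope check of the household equivalences above.
# ASSUMPTION: an average U.S. household uses ~10,500 kWh per year.
HOUSEHOLD_KWH_PER_YEAR = 10_500

dc_kwh_2024 = 183 * 1e9                        # 183 TWh = 183 billion kWh
households = dc_kwh_2024 / HOUSEHOLD_KWH_PER_YEAR
print(f"~{households / 1e6:.1f} million household-years")    # ~17.4 million

# A campus matching 100,000 households, expressed as continuous load:
avg_mw = 100_000 * HOUSEHOLD_KWH_PER_YEAR / 8_760 / 1_000    # hours/yr, kW -> MW
print(f"~{avg_mw:.0f} MW around the clock")                  # ~120 MW
```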
And the load is not evenly spread. In 2023, data centers accounted for about 26% of all electricity use in Virginia, with double-digit shares in North Dakota, Nebraska, Iowa, and Oregon. That kind of concentration does not just show up in a sustainability report; it shows up in grid-planning meetings and rate cases.
Water tells a similar story. A typical U.S. data center uses around 300,000 gallons of water per day for cooling, roughly the daily needs of a thousand households. Hyperscale AI campuses can guzzle 5 million gallons per day, about as much as a town of 50,000 people. Nationally, U.S. data centers directly consumed about 17 billion gallons of water in 2023 just for cooling, with hyperscale and colocation centers making up 84% of that total.
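The water comparisons check out the same way, if you assume roughly 300 gallons per day per household and 100 gallons per day per person, typical U.S. planning figures rather than numbers from the studies themselves:

```python
# Sanity check of the water comparisons above.
# ASSUMPTIONS: ~300 gal/day per household, ~100 gal/day per person.
print(300_000 / 300)       # typical data center -> ~1,000 households
print(5_000_000 / 100)     # hyperscale campus   -> ~50,000 people

# National direct cooling use in 2023, expressed as a daily rate:
print(f"~{17e9 / 365 / 1e6:.0f} million gallons per day")    # ~47
```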
In other words, this “cloud” is very thirsty… and it shows no sign of being quenched.
When Your Chatbot Raises My Power Bill
To a grid operator, AI is not a vibe. It is a sudden wall of load.
Because spare capacity in many regions has largely disappeared, every new gigawatt of data-center demand now has to be matched by new generation and transmission capacity. The result is that the cost of serving AI does not stay neatly on tech-company balance sheets.
A recent analysis in the PJM power region, which covers parts of the Mid-Atlantic, found that data-center growth will add an estimated $9.3 billion in capacity costs for 2025–26. That translates into $16 to $18 more per month on residential electric bills in some states. A Carnegie Mellon study projects that by 2030, data centers and crypto mining together could push average U.S. electricity bills 8% higher, with increases exceeding 25% in high-demand areas such as Virginia.
From an economics perspective, this is a textbook externality:
1. Private firms decide to build AI capacity.
2. Utilities scramble to supply them.
3. The incremental cost of new plants and lines gets smeared across every ratepayer’s bill unless regulators deliberately ring-fence it.
It’s a 1-2-3 combo! And the energy use is not just about one-off model training.
Training a single large transformer model can emit more than 626,000 pounds of CO₂, roughly the lifetime emissions of five cars. OpenAI’s GPT-3, for example, now old enough to have been studied closely, required about 1,287 megawatt-hours of electricity to train its 175 billion parameters, yielding an estimated 502 metric tons of CO₂. But the bigger story is what happens after training. Google has estimated that about 60% of AI’s overall energy use comes from inference, not training.
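Those two GPT-3 numbers imply a carbon intensity worth checking before moving on to inference; the division below simply uses the source figures as given:

```python
# Implied emissions intensity of the GPT-3 training estimate above.
kg_co2 = 502 * 1_000     # 502 metric tons -> kg
kwh = 1_287 * 1_000      # 1,287 MWh -> kWh
print(f"~{kg_co2 / kwh:.2f} kg CO2 per kWh")   # ~0.39, near the U.S. grid average
```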
Every autocomplete suggestion and AI-assisted search result keeps the meter running…
If large-scale AI search multiplies energy used per query by up to five times, as some estimates suggest, then “AI everywhere” quietly becomes “higher load everywhere.” It is a recurring subscription charged to the grid.
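To make that concrete, here is an illustrative sketch of how a per-query multiplier scales into grid load; the ~0.3 Wh baseline for a conventional search and the nine-billion-queries-a-day volume are my assumptions for illustration, not figures from the sources below:

```python
# Illustrative only. ASSUMPTIONS (mine): a conventional search draws
# ~0.3 Wh, AI-assisted search ~5x that, ~9 billion searches per day.
extra_wh_per_query = (5 - 1) * 0.3             # the added energy per query
queries_per_day = 9e9

extra_gwh_per_day = extra_wh_per_query * queries_per_day / 1e9
print(f"~{extra_gwh_per_day:.1f} GWh of extra demand per day")   # ~10.8
print(f"~{extra_gwh_per_day * 365 / 1e3:.1f} TWh per year")      # ~3.9
```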
The Hidden Thirst of AI
Electricity gets more attention because it shows up in emissions inventories. Water is quieter and, in many places, more politically volatile.
Cooling AI data centers relies heavily on fresh water, both on-site and at the power plants that generate their electricity. A Department of Energy–backed study projects that hyperscale AI data centers alone will use 16 to 33 billion gallons of water annually by 2028. Another analysis foresees water used for data-center cooling rising by 870% in coming years as more AI facilities come online. That figure actually keeps me up at night!
And most of that water is consumed, not returned. So, let that sink in…
An OECD-published analysis estimates that each kilowatt-hour of energy used for AI can require 1.8 to 12 liters of water for cooling and power, depending on location and efficiency. Meeting global AI demand could require 4.2 to 6.6 billion cubic meters of water withdrawal in 2027. If roughly half of AI activity occurs in the United States, that implies 0.5 to 0.7% of total U.S. annual water withdrawals devoted just to AI operations within a few years. The percentage may sound small, but the footprint is highly concentrated in a limited number of basins where “one more” industrial user is not an abstract discussion.
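That 0.5 to 0.7% share is checkable arithmetic. A quick verification, taking total U.S. water withdrawals at roughly 322 billion gallons per day, the USGS 2015 estimate, which is my outside figure rather than one from the OECD analysis:

```python
# Checking the 0.5-0.7% claim above.
# ASSUMPTION: total U.S. withdrawals ~322 billion gallons/day (USGS, 2015).
GAL_PER_M3 = 264.172
us_m3_per_year = 322e9 / GAL_PER_M3 * 365        # ~4.45e11 m3 per year

for global_ai_m3 in (4.2e9, 6.6e9):              # 2027 projection, low/high
    us_share = 0.5 * global_ai_m3                # ~half of AI activity in the U.S.
    print(f"{us_share / us_m3_per_year:.2%}")    # ~0.47% and ~0.74%
```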
Concrete examples are already on the table. A UC Riverside study found that a roughly two-week training run for GPT-3 at Microsoft’s U.S. data centers consumed about 700,000 liters of fresh water (around 185,000 gallons) for cooling. The same training conducted in a hotter region or less efficient facility could have tripled that usage. On the usage side, every 20 to 50 user queries to ChatGPT consume about 500 milliliters of water, once you combine cooling at the data center with water use at the power plant.
Individually that is trivial. At the scale of millions of daily users, it adds up. These “water costs” are invisible to the person typing the question, but not to the local utility.
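For a sense of how quickly “trivial” compounds, a small sketch; the hundred-million-queries-a-day figure is an illustrative assumption of mine, not a usage statistic from the studies:

```python
# Per-query water implied by "500 mL per 20-50 queries," then scaled up.
ml_low, ml_high = 500 / 50, 500 / 20        # 10-25 mL per query

# ASSUMPTION for illustration: 100 million queries per day.
queries_per_day = 100e6
low_l = queries_per_day * ml_low / 1_000    # mL -> liters
high_l = queries_per_day * ml_high / 1_000
print(f"~{low_l:,.0f} to {high_l:,.0f} liters per day")   # ~1.0M to 2.5M
```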
They are not invisible to communities, either. Many data-center clusters have sprung up in arid or drought-prone regions such as Arizona and central Oregon. In one Georgia county, officials reportedly raised water rates by 33%, largely to cover the added demand from a new data center. The people in that story do not get free AI credits as a consolation prize. They get a higher bill.
AI as a New Power Pressure Point
If you zoom out to climate, AI’s operational emissions are mostly power-sector emissions. That makes policy design both simpler and more urgent.
In the United States, about 60% of the electricity used by data centers still comes from fossil fuels, with roughly 43% from natural gas and 15% from coal in 2024. Globally, data centers account for an estimated 2.5 to 3.7% of greenhouse gas emissions, a larger share than the entire aviation industry. One Brookings analysis warns that if current trajectories continue, AI-related computing could consume as much as 21% of global electricity by 2030, making it much harder to cut emissions in other sectors.
Efficiency improvements in chips and cooling are real. But as an OECD report put it, “energy efficiencies are outpaced by growth in AI computing.” The demand curve is steeper than the efficiency curve. In less economist-y terms: that is a problem.
Some tech firms have responded by pursuing dedicated clean power. One prominent example: plans to restart the Three Mile Island nuclear plant in Pennsylvania to power Microsoft’s AI data centers, a move that underlines just how large and concentrated these loads have become.
Most facilities, however, are still plugged into the existing grid mix. Unless policy catches up, the AI boom risks locking in higher emissions right when other parts of the economy are trying to decarbonize.
Speculation or Structural Cost?
So, is AI in a bubble? It depends on what you think markets are pricing.
If you believe that valuations fully incorporate the cost of building new generation and transmission, the price of water in stressed basins, and the political risk of asking ratepayers to subsidize Silicon Valley’s experiments, then perhaps not.
But much of the public narrative around AI treats electricity and water like flat, reliable background conditions. It assumes that power will remain cheap and available, that water conflicts will be manageable, and that regulators will continue to socialize infrastructure costs. That is a very specific macro bet!
Burry’s positioning tells a different story. Short AI-exposed stocks. Long water rights, utilities, and water-rich land. He is not betting against technology so much as he is betting that scarcity wins in the end.
From a portfolio perspective, AI now carries a kind of resource beta. Firms and regions that are aggressively exposed to AI workloads are also exposed to local grid constraints, climate policy, drought, and community pushback. Those risks are not yet front and center in earnings calls. They should be.
Policy Levers to Properly Price “Intelligence”
If AI’s environmental and fiscal costs concentrate in power systems and water basins, policy should concentrate there too.
Start with electricity! AI’s operational emissions are, at present, basically power-sector emissions. So the main lever is to ensure the marginal power serving data centers is low-carbon, via a carbon price, a clean-electricity standard, or subsidies that achieve the same end. With U.S. data-center demand set to more than double by 2030, cleaning up the grids where AI actually locates matters more than anything else for its climate footprint.
Next, apply cost causality. Rapid load growth encourages utilities to lean on existing coal and gas unless developers share the costs of new generation and grid upgrades. That is why federal guidance, a 2025 executive order on federal land, and emerging state laws all push the same rule: if a data center adds major demand, it helps pay for the capacity and the wires, much as a real-estate developer pays for the access roads its project requires. This protects ratepayers and nudges AI toward places where clean power can be built quickly.
Finally, make water stress and local air pollution part of siting. These impacts are geographically specific and politically sharp. Using water-stress indicators, prioritizing non-potable cooling water, and pricing withdrawals to reflect scarcity can steer projects away from overdrawn basins. Water policy also shapes the emissions mix, since several low-carbon options are water-intensive. The Georgia county that raised water rates by 33% to cover data center demand is what happens when siting ignores that reality.
Facing the Fine Print
Writing this, I had to sit with my own discomfort. I like AI. I use it. It makes my work faster and often better. And so it is tempting to treat the environmental and infrastructural footprint as someone else’s problem… preferably far away.
But Burry’s trades are a reminder that “someone else’s problem” is often just “tomorrow’s problem in a different asset class.” In this economist’s opinion, he is not betting against innovation. He is betting that, in the long run, physics and cash flows beat narratives.
AI will not collapse because it uses electricity and water. But it will stumble, or at least become far more expensive, if we keep pretending those inputs are free.
Ergo, the real question is not whether we can afford to slow AI down. It is whether we can afford to keep underpricing the power and water that make it possible.
If we do not confront those externalities now, we risk communities paying for infrastructure they did not ask for, investors nursing write-downs on stranded assets, and the rest of us still hitting “generate,” unaware of how much we have already spent to keep the illusion of weightless intelligence alive.
Follow-up Thoughts:
For regulators and policymakers, please do not treat AI as a futuristic abstraction. It is already a real, fast-growing industrial load sitting on today’s grids and water systems. The immediate job is to make it legible and price it properly. That means requiring standard, public reporting of electricity and water use for large data centers, then baking those figures into resource plans and drought assessments. It also means applying cost causality so that developers pay the incremental cost of the generation, transmission, and water upgrades they trigger, rather than spreading those bills across households and small firms. Finally, if governments are offering tax breaks to attract AI investment, those incentives should be tied to additional clean power, credible grid expansion plans, and water-stress safeguards. These steps do not slow innovation; they simply ensure AI grows inside the same adult economy as everything else, where inputs are scarce and trade-offs are explicit.
For reporters and investors, the story needs a utility lens, not just a tech lens. Three questions should become routine whenever a new AI campus is announced or an earnings call celebrates “AI scale”:
1. Where does the power come from, and is the firm adding new clean capacity or leaning on fossil-heavy grids?
2. Who is paying for the extra substations, transmission lines, and water infrastructure? The developer or the public?
3. How fragile is the business model if carbon prices rise, clean-energy standards tighten, drought limits withdrawals, or heatwaves constrain both rivers and power plants?
None of this is anti-AI. It is basic due diligence about whether the economics of scale still hold once the full utility bill is brought into view. And I am just not seeing it…
Ask those questions consistently, and the narrative will shift. That is the difference between a durable technology boom and a very expensive game of musical chairs.
Sources:
Bistline, John, Kimberly A. Clausing, Neil R. Mehrotra, James H. Stock, and Catherine Wolfram. “Climate Tax Policy Reform Options in 2025.” Brookings Institution, February 27, 2024. https://www.brookings.edu/articles/climate-tax-policy-reform-options-in-2025/.
Cho, Renée. “AI’s Growing Carbon Footprint.” Columbia Climate School, June 9, 2023. https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/.
Daly, Matthew, and Tom Krisher. “EPA Issues New Auto Rules Aimed at Cutting Carbon Emissions, Boosting Electric Vehicles and Hybrids.” Associated Press News, updated March 21, 2024. https://apnews.com/article/epa-electric-vehicles-emissions-limits-climate-biden-e6d581324af51294048df24269b5d20a.
Danelski, David. “AI Programs Consume Large Volumes of Scarce Water.” UC Riverside News, April 28, 2023. https://news.ucr.edu/articles/2023/04/28/ai-programs-consume-large-volumes-scarce-water.
Harnish, Molly. “Taxing Away the Problem.” Econ Focus (Federal Reserve Bank of Richmond), Second/Third Quarter 2019. https://www.richmondfed.org/publications/research/econ_focus/2019/q2-3/feature1_sidebar.
Kane, Joseph W. “AI, Data Centers, and Water.” Brookings Institution, November 20, 2025. https://www.brookings.edu/articles/ai-data-centers-and-water/.
Lee, Nicol Turner, and Darrell M. West. “The Future of Data Centers.” Brookings Institution, November 5, 2025. https://www.brookings.edu/articles/the-future-of-data-centers/.
Leppert, Rebecca. “What We Know about Energy Use at U.S. Data Centers amid the AI Boom.” Pew Research Center, October 24, 2025. https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/.
Li, Yun. “Michael Burry Launches Newsletter to Lay Out His AI Bubble Views after Deregistering Hedge Fund.” CNBC, November 24, 2025. https://www.cnbc.com/2025/11/24/michael-burry-launches-newsletter-to-lay-out-his-ai-bubble-views-after-deregistering-hedge-fund.html.
Lopez, Linette. “The Genius from ‘The Big Short’ Explains His Biggest Trade Right Now.” Business Insider, December 29, 2015. https://www.businessinsider.com/michael-burry-water-trade-2015-12.
Metcalf, Gilbert E. “On the Economics of a Carbon Tax for the United States.” Brookings Papers on Economic Activity, Spring 2019. https://www.brookings.edu/wp-content/uploads/2019/03/Metcalf_web.pdf.
National Academies of Sciences, Engineering, and Medicine. Accelerating Decarbonization in the United States, October 2023. https://nap.nationalacademies.org/resource/25931/interactive/.
Ren, Shaolei. “How Much Water Does AI Consume? The Public Deserves to Know.” OECD.AI, November 30, 2023. https://oecd.ai/en/wonk/how-much-water-does-ai-consume.
Stock, James H. “Driving Deep Decarbonization.” Finance & Development (International Monetary Fund), September 2021. https://www.imf.org/en/Publications/fandd/issues/2021/09/how-to-drive-deep-decarbonization-stock.
Thomas, Vinod. “Mainstream Economic Policy Must Factor Climate Change into Its Growth Calculus.” Brookings Institution, April 26, 2022. https://www.brookings.edu/articles/mainstream-economic-policy-must-factor-climate-change-into-its-growth-calculus/.
Zewe, Adam. “Explained: Generative AI’s Environmental Impact.” MIT News (Massachusetts Institute of Technology), January 17, 2025. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117.

