The Wall
How the U.S. Power Grid Became the AI Industry’s Most Expensive Problem
Google pulled its campus proposal for Franklin Township, New Jersey, on September 16, 2025, minutes before the local planning council was scheduled to vote. The company said nothing publicly. Residents had packed the meeting room to oppose it. The site had been chosen for its land, its tax incentives, its proximity to fiber corridors. What it could not offer was power. The regional utility had already told Google what it tells everyone in the PJM Interconnection zone: the queue runs five to seven years.
That queue holds 2,700 gigawatts of pending requests. The entire U.S. grid produces roughly 1,100. The backlog is the physical record of an industry that spent two years promising infrastructure it could not build.
In 2025 alone, 48 major data center projects valued at $156 billion were blocked, canceled, or stalled. Those projects represent 4.7 gigawatts of lost computing capacity, enough to power 3.5 million homes. Big Tech’s collective infrastructure spend this year is projected at a record $630 billion. The utilities, the townships, the water boards, and the environmental agencies that were expected to absorb the expansion are voting no.
Between 2021 and 2024, the constraint was semiconductor supply. Nvidia H100s had a six-month lead time. A100s sold on gray markets for three times list price. The hyperscalers planned around chip availability, land costs, and labor pools, announced campuses in Virginia, Texas, Iowa, and Arizona, and told shareholders the construction timelines were aggressive but achievable.
The grid was not part of the model.
Transmission infrastructure in the United States is governed by a patchwork of regional operators, state utility commissions, and federal agencies designed to serve slowly growing demand. The U.S. grid added an average of roughly 20 gigawatts of new generation capacity per year through the 2010s. AI data center demand alone is projected to require 80 gigawatts of additional capacity by 2030. The math was always visible. The industry treated it as someone else’s problem until the utilities started refusing connection requests.
Virginia, which hosts roughly 40 percent of the world’s hyperscale capacity in a corridor stretching from Loudoun County south through Prince William, saw at least 15 project cancellations and indefinite deferrals in the first quarter of 2026. Dominion Energy, the primary utility serving the region, has been explicit: its transmission upgrade backlog extends past 2030, and it cannot accept new large-load customers in most of its service territory without regulatory relief that has not arrived. The company projects peak data center demand in Virginia could rise to 13.3 gigawatts by 2038, nearly five times the 2.8 gigawatts recorded in 2022. Getting there requires an unprecedented level of new generation construction and a doubling of power imports from the rest of PJM.
The interconnection queue in Virginia, Texas, and Arizona operates on the same timeline. A company that broke ground in 2024 expecting to go live in 2026 is looking at 2031. The capital is deployed and the revenue is not coming.
Power is the first wall. Water is the second. A single large-scale AI training facility can consume more than one million gallons per day for cooling. Hyperscale facilities running multiple clusters push that figure considerably higher. In Phoenix, where the Salt River Project has been managing supply stress for a decade, a Meta facility projecting consumption of four million gallons daily drew a formal opposition coalition by mid-2024 and has been in regulatory limbo since.
Meta pulled a separate proposal, a $1 billion campus in Michigan, after months of public opposition focused on water usage and the company’s refusal to disclose operational consumption figures. The campus had been framed as economic development: 900 construction jobs, permanent technical employment, local tax base expansion. Residents asked for the water numbers. Meta declined. The project died.
Google’s withdrawal from Franklin Township, the application pulled at the last moment rather than argued, is its own data point about the company’s confidence in the regulatory environment. The residents who organized around noise concerns and infrastructure strain never heard the company’s case.
New Orleans imposed a one-year data center moratorium in early 2026, citing strain on Mississippi River water capacity. South Dakota and New York are drafting similar measures. Virginia’s 2026 General Assembly session was consumed by 61 bills targeting the industry, a volume that reflects how quickly the political ground has shifted. Fifteen of those bills reached Governor Abigail Spanberger’s desk, with a deadline of April 13. One requires localities to conduct sound studies and assess broader community impacts before approving new projects. Another bars new diesel generator permits after July 2026 unless operators use cleaner Tier IV equipment, a provision aimed directly at the more than 4,000 diesel generators that Loudoun County data centers already hold permits for: a combined backup capacity exceeding 11 gigawatts, more than Dominion’s entire natural gas fleet.

The state budget has stalled over a separate fight: Senate President L. Louise Lucas is refusing to pass a budget that preserves the data center sector’s $1.6 billion annual sales tax exemption. A special legislative session is scheduled for April 23. The communities near these facilities are not ideologically opposed to technology. They are doing arithmetic, and the arithmetic is not working in the industry’s favor.
The most consequential cost transfer from the AI infrastructure build-out has not happened in a township meeting or a water board hearing. It has happened inside the PJM capacity market, where the grid operator that serves 65 million Americans across 13 states and Washington, D.C., holds annual auctions to ensure enough generation is available to keep the lights on.
In the 2024/25 delivery year, capacity cleared at $28.92 per megawatt-day. In the 2025/26 auction, it cleared at $269.92, an 833 percent jump, the largest single-year increase in the market’s 27-year history, driven 63 percent by data center demand growth. That one auction added $9.3 billion in costs that utilities are now recovering from residential and commercial customers in higher rates. In Washington, D.C., Pepco residential customers saw their bills rise by $21 a month starting June 2025; roughly half of that increase is directly attributable to data center-driven capacity costs, according to the D.C. Office of Consumers’ Counsel.
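The jump is stark enough to be worth checking. A back-of-envelope sketch of the figures quoted above, illustrative only; PJM’s actual clearing mechanics are far more involved than a single division:

```python
# Illustrative arithmetic only: reproduces the headline figures quoted
# in the text, not PJM's actual capacity-auction clearing process.

prior_price = 28.92    # 2024/25 clearing price, $/MW-day
new_price = 269.92     # 2025/26 clearing price, $/MW-day

jump_pct = (new_price - prior_price) / prior_price * 100
print(f"Year-over-year increase: {jump_pct:.0f} percent")

# Portion of the increase attributed to data center demand growth
data_center_share = 0.63
print(f"Data-center-driven share: {jump_pct * data_center_share:.0f} points")
```

Run as written, the first line confirms the 833 percent figure; the second simply applies the 63 percent attribution to the same increase.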
The 2026/27 auction cleared at $329.17 per megawatt-day, hitting the FERC-approved price cap for the second consecutive year. The December 2025 auction for the 2027/28 delivery year hit a new price cap of $333.44 and, for the first time in the market’s history, PJM failed to secure enough capacity to meet its 20 percent reliability reserve target, falling short by 6,625 megawatts. Without the price cap, the independent market monitor calculates that clearing prices would have exceeded $500 per megawatt-day, adding another $9.9 billion to ratepayer costs.
Across PJM’s last three capacity auctions, data center forecasts account for $21.3 billion of the $47.2 billion in total capacity costs. The NRDC projects cumulative costs through 2033 of between $100 billion and $163 billion if the current trajectory holds, translating to roughly $70 a month for a typical PJM household. A Carnegie Mellon University study estimates data center and cryptocurrency demand could raise the average U.S. electricity bill by 8 percent nationally by 2030, with increases potentially exceeding 25 percent in the highest-demand markets, such as Northern Virginia.
Virginia’s State Corporation Commission responded in November 2025 by approving Dominion’s first base rate increase since 1992: $11.24 a month for a typical residential customer in 2026 and $2.36 more in 2027. The commission also created a new rate class for customers using more than 25 megawatts, requiring them to sign 14-year contracts and pay for at least 85 percent of their contracted demand regardless of whether the data center is ever built. The provision is designed to stop utilities from building out infrastructure for facilities that never materialize and then recovering the cost from ratepayers. Senator Lucas’s SB 253, still pending, would go further, shifting distribution and PJM capacity auction costs directly onto large-load customers and away from residential bills, a move the SCC estimates would cut residential rates by 3.4 percent while raising data center rates by 15.8 percent. Dominion supports the legislation.
None of this has yet changed the price on anyone’s bill. The auctions are already settled. The costs are already flowing.
The capital expenditure is real. Microsoft committed $80 billion for fiscal year 2025. Amazon Web Services is projecting $105 billion in infrastructure investment this year. Alphabet is spending north of $75 billion. The announcements come at investor days with charts showing exponential demand curves, and the demand for AI inference, for model training, for the data processing underpinning every enterprise software product being re-platformed around large language models, is genuine.
What does not match the announcements is the revenue. OpenAI closed 2025 with a $5 billion net loss on $3.5 billion in revenue. Anthropic burned through $2 billion in its last reported quarter while generating a fraction of that back. The hyperscalers are subsidizing frontier model development against inference revenue that has not arrived.
DeepSeek, the Chinese AI laboratory that released its R1 model in January 2025 at a reported training cost of approximately $6 million, did something to the demand assumptions that has not been fully absorbed. If frontier-level performance can be achieved at a fraction of the compute previously thought necessary, the business case for 500-megawatt hyperscale campuses requires specific answers about which workloads demand that scale and when they will generate returns. The $630 billion in projected 2026 spending has not stopped to answer.
The industry’s response to grid constraints is to leave the grid. Microsoft’s two-gigawatt nuclear commitment with Constellation Energy, running through 2040 and structured as the largest corporate nuclear agreement in history, is the clearest signal of where this is heading. The arrangement bypasses the PJM queue entirely and provides dedicated baseload power to Microsoft data centers without contributing to the shared grid on which residential customers depend. The original Three Mile Island restart deal, worth roughly $1 billion over 20 years, has expanded: Microsoft has committed to six gigawatts of nuclear-sourced power by 2030.
Amazon agreed to purchase output from the Susquehanna Steam Electric Station from Talen Energy, a deal federal regulators have scrutinized because it would pull power off the shared grid and dedicate it to a single private buyer. Google has signed letters of intent with three small modular reactor developers, though no SMR has produced commercial power in the United States and construction timelines for any of them run past 2028.
Oracle is deploying gas turbines at its Texas campuses. That decision costs 20 percent more upfront than grid connection and produces carbon emissions the company publicly committed to eliminating. The commitment and the turbines coexist in the same annual report.
The companies are responding to grid failure by privatizing their energy supply: pulling power from shared grids, competing for the output of reactors that also serve residential customers, building generation assets that reproduce, at private scale, what governments built for the public at public expense. The cost transfer is structural.
Ireland hosts a disproportionate share of European hyperscale capacity. Data centers now consume roughly one-fifth of the country’s total electricity, and that share is rising faster than generation can be added. In 2024, EirGrid warned of rolling blackout risk during peak winter demand. Microsoft and Google each announced additional Dublin-area campuses the same year.
In Chile, a Google data center near Santiago drew sustained protest from indigenous communities in the Aconcagua Valley, where water drawn from Andean snowmelt already supports agriculture on strained aquifers. The facility’s cooling requirements, mapped against existing agricultural and residential demand, show a net deficit in dry-season months. The protests have continued. Construction has continued. The question of who absorbs that deficit has not been formally answered.
The IEA now projects global data center electricity consumption will reach 1,100 terawatt-hours in 2026, equivalent to Japan’s entire national electricity consumption. The figure represents an 18 percent upward revision from estimates published just months earlier.
The costs of the AI infrastructure build-out are not being absorbed by the companies building it. They land on utility ratepayers whose bills rise to fund grid upgrades the utilities did not plan for. They land on townships that find their water capacity claimed by a facility that pays taxes and employs sixty permanent staff. They land on grids built for a demand profile that no longer describes the world they are serving. They land inside a capacity auction that 65 million people have never heard of, where prices rose 833 percent in a single year and the bill arrived in June.
The companies build where they can. Where they cannot, they announce and then quietly withdraw. What remains is the substation that was never upgraded, the aquifer already stressed before the permit was filed, and the township that will be asked again next year by a different company with the same proposal and the same omissions in its environmental filing.