The Pragmatist’s Guide To Navigating AI Narratives
How to Read the AI Supercycle When Every Narrative Is Simultaneously True
The Day The Language Changed
On November 19, 2025, NVIDIA released its third-quarter earnings. The numbers delivered: $57 billion in revenue, up 62% year-over-year. Jensen Huang said Blackwell sales were “off the charts.” Analysts parsed the data center segment: $51.2 billion, 90% of total revenue. The AI supercycle is still in full force.
But something subtle caught my eye in the press release. NVIDIA announced partnerships in new quantitative terms:
“A strategic partnership with OpenAI to deploy at least 10 gigawatts of NVIDIA systems.”
“Anthropic will initially adopt 1 gigawatt of compute capacity.”
Not GPU counts. Not dollars. Gigawatts.
A gigawatt powers roughly 700,000 American homes. OpenAI’s 10 gigawatts equals the electrical load of Houston.
The language changed because the constraint changed. The bottleneck on AI development is no longer chip manufacturing. It’s electrical capacity, and if you’ve been following the headlines you know that narrative all too well.
Three months earlier, in August, Jensen Huang had explained this to analysts, though few noticed at the time: “Out of a gigawatt AI factory, which can go anywhere from $50 billion-$60 billion, we represent about 35% plus or minus of that.”
NVIDIA captures $18-21 billion. The other $32-39 billion? Power plants. Substations. Cooling towers. Grid transmission. Buildings.
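Huang's percentages pencil out with simple arithmetic. A minimal sketch, using his stated 35% share and $50-60 billion facility range (the split function is illustrative, not NVIDIA's actual bill of materials):

```python
# Back-of-envelope split of a 1 GW AI factory's cost, using Huang's
# public figures. Illustrative only: the 35% share is his "plus or
# minus" estimate, not a disclosed bill of materials.
def facility_split(total_cost_b, nvidia_share=0.35):
    """Return (NVIDIA's slice, everything else) in billions of dollars."""
    nvidia = total_cost_b * nvidia_share
    return nvidia, total_cost_b - nvidia

for total in (50, 60):
    chips, rest = facility_split(total)
    print(f"${total}B facility: ~${chips:.1f}B to NVIDIA, "
          f"~${rest:.1f}B to power, buildings, cooling")
```

At both ends of the range, roughly two dollars of every three go to everything that is not NVIDIA.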
Three Ways To Read What Happens Next
What follows is a timeline of announced projects, capital flows, and site selections from 2024-2027. The same facts support three different interpretations, not because the data is ambiguous, but because the infrastructure operates at a scale where the three readings collapse into one another.
The Optimistic Case: Manufacturing Renaissance
The AI boom is rebuilding American manufacturing. Companies need massive electrical capacity immediately, and that capacity already exists in one kind of place: former industrial sites in the Rust Belt, the old car plants and shuttered steel mills where the grid was built to handle 24/7 heavy industrial loads.
When factories closed, the electrical infrastructure remained. Now AI companies are repurposing it, bringing capital and activity to precisely the regions devastated by decades of financialization disguised as globalization. The “AI infrastructure” framing unlocks political support and regulatory fast-tracking that “bring back manufacturing” could never achieve.
The renaissance is real because the $1.2 trillion in announced investments represents actual construction, not just press releases. Toyota in Kentucky, TSMC in Arizona, Foxconn in Texas—these are physical facilities breaking ground.
The Cynical Case: Subsidy Arbitrage
It’s narrative arbitrage, actually. Projects that would be rejected as “data centers” get approved as “strategic AI infrastructure.” Sites that couldn’t attract investment as “manufacturing” become venture-backable as “AI factories.”
Look at what’s actually being “manufactured”—servers, semiconductor equipment, AI infrastructure components. Not consumer goods. Not the kind of manufacturing that built the middle class. And notice the framing: “collaborative robots to overcome labor shortages” means automation from day one. Jobs are not coming back.
In Ohio, data center tax breaks amount to over $2 million per permanent full-time job. That’s not job creation. That’s public subsidy flowing to private profit. Constellation Energy’s CEO warns: “I think the load is being overstated.” When the CEO of a major power provider says demand projections look inflated, pay attention.
The Geopolitical Case: Infrastructure Determines Destiny
The United States is losing the AI race because China adds more electricity demand annually than Germany’s total consumption. One Chinese province alone matches India’s entire electricity supply.
American AI researchers visiting China report that energy availability is treated as “a solved problem.” Chinese energy planning happens in anticipation of demand, not in reaction to it.
The U.S. is choosing data center locations based on where electrical infrastructure already exists because building new infrastructure takes seven years. That’s not strategy. That’s triage. The gigawatt language reveals a scramble dressed up as a plan.
China built power first, then AI. America is trying to build AI, then scrambling for power. In infrastructure races, the side that builds foundations first usually wins.
With those three narratives established, let's walk the timeline and see how each one plays out.
Timeline: 2024-2027
2024: The Constraint Emerges
U.S. AI data centers consumed 4 gigawatts of power. Natural gas turbines sold out through the end of the decade. Grid interconnection queues backed up. Companies started realizing new power infrastructure wouldn’t arrive fast enough.
Q1 2025: Site Selection Begins
January: Trump administration launches. Stargate project announced.
February: Former GM Lordstown plant—6.2 million square feet, shuttered in 2019—targeted for Foxconn/OpenAI/SoftBank AI equipment manufacturing.
Lordstown sits in Youngstown, Ohio. Trump visited in 2017 and told workers “don't sell your house” because jobs were coming back. GM closed the plant two years later. Now it's being converted for AI infrastructure. Former autoworkers are not going to become silicon wafer producers.
Optimistic signal: Industrial site getting second life. Investment flowing to devastated region.
Cynical signal: Why Lordstown? Not for talent or universities. For the electrical substation sized to run a plant that stamped out 400,000 cars annually. The power infrastructure is what's valuable, not the people.
Geopolitical signal: Speed matters. Lordstown is expected to be operational by 2026. That's only possible because the electrical infrastructure already exists; a greenfield site would take until 2030. By then, China would be moving on to quantum AI while we're still dicking around with data center permits.
Q2 2025: Export Controls and Site Selection
April: U.S. government requires export licenses for NVIDIA H20 chips to China. NVIDIA takes $4.5 billion inventory charge.
May: NVIDIA loses an additional $2.5 billion in blocked H20 shipments.
June: Amazon announces $20 billion Pennsylvania investment at two sites: Salem Township, adjacent to Susquehanna nuclear plant with direct power connection, and Falls Township at former US Steel mill.
The export controls cut off China’s market for AI chips—$7 billion in immediate losses for NVIDIA. One month later, Amazon announces $20 billion in domestic AI infrastructure. The timing reveals the strategic pivot: lost international markets accelerate domestic buildout.
But the site selection tells a different story. Salem Township matters because of direct nuclear power connection. Falls Township matters because it’s a former steel mill with grid infrastructure built for continuous heavy industrial loads.
Optimistic signal: Export controls forcing domestic investment. Former industrial sites getting $20 billion in capital. Real construction creating regional economic activity where it’s most needed.
Cynical signal: Amazon choosing sites purely for electrical access, not economic development. Tax breaks to attract projects probably exceed local benefit. The “reshoring” narrative covers straightforward infrastructure arbitrage.
Geopolitical signal: Export controls are the weapon, but they cut both ways. U.S. blocks China from advanced AI chips, forcing $7B in immediate losses on American companies. China responds by accelerating domestic chip development and treating energy availability as “a solved problem” while U.S. scrambles for sites with adequate power. The export ban assumes American AI advantage is durable. But if the real constraint is electrical infrastructure, not chip design, the ban may be a strategic blunder.
Q3 2025: The Language Pivot
August 27, NVIDIA Q2 FY2026 earnings call: Jensen Huang tells analysts: “Out of a gigawatt AI factory, which can go anywhere from $50 billion-$60 billion, we represent about 35% plus or minus of that.”
“It takes six chips, six different types of chips, just to build an AI and a Rubin AI supercomputer. Just to scale that out to a gigawatt, you have hundreds of thousands of GPU compute nodes.”
Huang is telling investors that NVIDIA is now an infrastructure company. And he’s revealing that 65% of a gigawatt facility’s cost isn’t chips. It’s power and buildings.
Power is now front and center as the constraint. The question is whether this creates opportunity (optimistic), extraction (cynical), or disadvantage (geopolitical).
September 2025: The Mega-Announcements
September 22: OpenAI and NVIDIA announce partnership for at least 10 gigawatts. NVIDIA will invest up to $100 billion progressively as each gigawatt deploys.
Huang tells CNBC this equals 4-5 million GPUs—roughly NVIDIA’s entire 2025 production, “twice as much as last year.”
One partnership equals one year of total production.
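Those two public numbers imply an all-in power budget per GPU, a useful sanity check on the deal's scale. A sketch; the per-GPU figure is an inference from the announcement, not a disclosed spec:

```python
# Implied all-in power per GPU from the OpenAI deal's public numbers:
# 10 GW across Huang's "4-5 million GPUs." The result covers cooling,
# networking, and facility overhead, not just the chip's own draw
# (an inference from the announcement, not a disclosed figure).
TOTAL_W = 10e9  # 10 gigawatts in watts

for gpus in (4e6, 5e6):
    per_gpu_kw = TOTAL_W / gpus / 1e3
    print(f"{gpus/1e6:.0f}M GPUs: ~{per_gpu_kw:.1f} kW per GPU all-in")
```

Two to two-and-a-half kilowatts per GPU, all-in, is consistent with modern liquid-cooled rack densities, which is some evidence the gigawatt figures are engineering numbers rather than marketing ones.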
September 23: OpenAI, SoftBank, and Oracle unveil five new Stargate sites: Shackelford County TX, Doña Ana County NM, Lordstown OH, Milam County TX, and one undisclosed Midwest location.
Look at the geography. Not one California site. Not one Massachusetts site. The AI revolution is being built in the industrial heartland.
Sam Altman explains site selection: “You need a lot of energy availability. You want a favorable regulatory environment that allows construction, fast permits. You need the talent... you need the land.”
Energy first. Then regulations. Then talent (imported, not local).
Optimistic signal: $100 billion flowing to Ohio, Texas, New Mexico. These regions need the investment.
Cynical signal: Industry insiders report “similar projects that look exactly to have the same footprint being requested in different regions across the country,” meaning companies are shopping the same project to multiple utilities to extract better deals.
Geopolitical signal: Speed trumps quality. These sites weren’t chosen because they’re optimal. They were chosen because they’re possible.
October 2025: The Manufacturing Question
October 28: NVIDIA announces manufacturing partnerships: Belden, Caterpillar, Foxconn, Lucid Motors, Toyota, TSMC, Wistron. $1.2 trillion in U.S. production capacity investments announced, led by electronics, pharma, semiconductors.
Optimistic interpretation: This is real. $1.2 trillion isn’t accounting fiction. Toyota doesn’t announce Kentucky investments for press release purposes. TSMC doesn’t break ground in Arizona for optics.
These are actual facilities being built or modernized. The AI framing gives political cover to do what “bring back manufacturing” couldn’t do.
Cynical interpretation: Look at what's being manufactured: AI servers, semiconductor equipment, infrastructure components. Not consumer goods. Not products Americans buy. The same cloud oligarchy is weaponizing its balance sheet to ensure no new competition emerges. Moreover, a factory that runs on robots doesn't create the jobs that built the middle class.
Geopolitical interpretation: While America announces partnerships, China executes coordinated buildout. Energy planning is technocratic, long-term, and happens before investment rather than after.
November 2025: The Earnings That Confirmed It
Which brings us back to the present. NVIDIA Q3 2025: $57 billion revenue, $51.2 billion from data centers.
Partnerships announced in gigawatts:
OpenAI: 10 gigawatts
Anthropic: 1 gigawatt
The language has fully shifted.
Q4 guidance: $65 billion revenue expected.
Buried in analyst materials: AI inference demand will reach 400% of training workloads by 2027, up from under 50% in 2022.
Training can happen anywhere with power. But inference, the actual use of AI, must happen near users and applications. Autonomous driving, industrial automation, AR/VR all require sub-10ms latency.
You cannot serve a self-driving car in Detroit from a data center in Texas. Physics prevents it.
If inference scales to 400% of training by 2027, you don’t need ten gigawatt training centers. You need hundreds of distributed inference facilities near populations and industry. Which means electrical infrastructure everywhere. Which means old industrial sites become the only option at scale.
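The physics claim holds up to a back-of-envelope check. A sketch, assuming light in fiber travels at roughly two-thirds of c and taking the city distances as rough straight-line figures:

```python
# Best-case speed-of-light latency for the Detroit-from-Texas claim.
# Distances are rough straight-line assumptions; real fiber routes
# are longer, and this ignores routing, queuing, and compute time.
C_FIBER_KM_PER_MS = 200.0  # light in fiber: ~200,000 km/s = 200 km/ms

def round_trip_ms(distance_km):
    """Best-case fiber round trip in milliseconds."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

for site, km in [("Texas data center", 1600), ("in-metro edge site", 50)]:
    print(f"Detroit <-> {site} (~{km} km): "
          f"{round_trip_ms(km):.1f} ms round trip")
```

At ~1,600 km, the round trip alone costs ~16 ms before a single GPU cycle runs, already past a sub-10ms budget; an in-metro site spends half a millisecond. Distance, not bandwidth, is the binding constraint.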
Optimistic interpretation: Distributed inference validates the industrial site strategy. AI naturally flows to where manufacturing and population are, creating integrated industrial ecosystems.
Cynical interpretation: “Distributed inference” is making virtue of necessity. Can’t build gigawatt centers fast enough, so claim distribution was the plan all along.
Geopolitical interpretation: Distributed architecture plays to American strengths (existing industrial infrastructure, geographic spread) versus Chinese strengths (centralized planning, concentrated buildout). But only if execution matches strategy.
2025 Year End: The Numbers
Demand projections:
68 GW global AI data center demand by 2027
327 GW by 2030, versus total global data center capacity of just 88 GW in 2022
U.S. alone: 4 GW in 2024 to 123 GW by 2035
Supply reality:
Goldman Sachs estimates $720 billion in grid spending needed through 2030
Grid connections taking 4-7 years in key regions
Data center builds take just 1-2 years; the grid connection, not construction, is the gating step
68 GW needed by 2027. That’s two years away. Grid connections take 4-7 years. Projects announced today won’t connect to new power until 2029-2032.
The only way to meet 2027 targets is existing infrastructure. Hence Lordstown, Salem Township, Falls Township.
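The timeline arithmetic is worth spelling out. A sketch using the lead times above, assuming projects announced in 2025:

```python
# Timeline arithmetic behind "the only way to meet 2027 targets is
# existing infrastructure." Lead times come from the figures above;
# the 2025 announcement year is an assumption.
ANNOUNCED = 2025
TARGET = 2027
GRID_LEAD_YEARS = (4, 7)   # new grid interconnection, key regions
BUILD_LEAD_YEARS = (1, 2)  # data center construction itself

new_grid_online = (ANNOUNCED + GRID_LEAD_YEARS[0],
                   ANNOUNCED + GRID_LEAD_YEARS[1])
existing_grid_online = (ANNOUNCED + BUILD_LEAD_YEARS[0],
                        ANNOUNCED + BUILD_LEAD_YEARS[1])

print(f"New-grid projects online: {new_grid_online[0]}-{new_grid_online[1]}")
print(f"Existing-substation sites online: "
      f"{existing_grid_online[0]}-{existing_grid_online[1]}")
print(f"Can a new grid connection hit {TARGET}? {new_grid_online[0] <= TARGET}")
```

The math is unforgiving: only sites that can reuse an existing substation land inside the 2027 window.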
Optimistic: Necessity drives innovation. Can’t wait for new grid? Use old industrial sites. Can’t get permits? Call it “AI infrastructure” and regulators prioritize it. The constraint creates the solution.
Cynical: The math reveals the game. Demand projections justify investment. Investment justifies subsidies. When actual utilization comes in below projection, who’s left holding stranded assets? Taxpayers.
Geopolitical: The math reveals American disadvantage. China built infrastructure first, enabling rapid deployment now. America is scrounging for existing capacity while China executes planned buildout. Two years from now shows who was right.
2026-2027: The Test Period
This is where narrative meets reality.
Capacity utilization rates: Do announced facilities operate at projected density? Or do megawatt ratings exceed actual deployment?
Power prices: Wholesale electricity already up 267% in some markets near data centers. If prices spike further, it validates scarcity. If they stabilize, it suggests adequate supply or overstated demand.
Employment vs automation: Do these facilities create meaningful jobs? Or is the focus on “collaborative robots” code for minimal human employment?
Inference deployment: Does the 400% inference growth materialize? More important: where does it happen, centralized cloud or distributed edge?
Chinese buildout pace: Does the U.S. infrastructure gap close, widen, or stay constant?
The Pragmatist’s Position
When a chip company starts announcing deals in electrons instead of dollars, something has fundamentally shifted. NVIDIA became an infrastructure company the moment partnerships were measured in gigawatts—because at that scale, the product being sold is access to electrical capacity, not computing capability.
Gigawatt-scale AI infrastructure operates at a threshold where economic development, financial engineering, and geopolitical competition become functionally indistinguishable until someone actually turns the power on.
Which reveals the fourth narrative hidden underneath the other three: the electrical constraint is the only coordinating force.
Not institutions. Not strategy. Not markets. The bottleneck itself is performing the coordination function—forcing economic, financial, and strategic decisions into alignment because nothing else can.
When constraints govern instead of institutions, you get the appearance of strategy without strategic choice. Every decision looks deliberate but is actually the constraint expressing itself through whatever institutional channel encounters it first. The system produces results that satisfy the bottleneck without optimizing for any coherent goal.
This is what infrastructure looks like after sixty years of institutional and infrastructure decay. Not the absence of institutions, but their subordination to physical limitations that now perform the coordinating work leaders once did. The constraint doesn’t adjudicate between narratives. It just enforces its terms and lets the players rationalize the narrative afterward.
-Nathan Staffel


