Top CEOs Warn AI Boom Is Driving Critical Shortages of Power, Computing Capacity, and Chips
TL;DR
The AI boom is creating cascading shortages across power grids, semiconductor supply chains, and computing infrastructure, with data center electricity consumption projected to more than double by 2030. As CEOs like BlackRock's Larry Fink and Nebius's Arkady Volozh warn of structural scarcity, ordinary ratepayers are already absorbing billions in higher electricity costs, while startups and researchers face growing barriers to compute access — raising questions about who benefits from the AI buildout and whether the shortage warnings themselves serve corporate interests.
"We're short power, we're short compute, we're short chips." That was BlackRock CEO Larry Fink at the Milken Institute Global Conference in May 2026, distilling the AI industry's central problem into nine words . He is not alone. From Microsoft to Meta, from NVIDIA to upstart cloud providers like Nebius, the message from the executive suite is uniform: the physical world cannot keep up with the digital one.
The question is whether that's a warning or a sales pitch.
The Power Equation
Global data center electricity consumption reached an estimated 415 terawatt-hours (TWh) in 2024, roughly 1.5% of all electricity generated worldwide. In the United States, the figure was 183 TWh — more than 4% of national consumption. By 2030, the International Energy Agency and Gartner project global data center consumption will reach 980 TWh, more than doubling from current levels.
The growth rate for AI-specific computing is steeper still. Electricity consumption in accelerated servers — the GPU-dense machines that train and run AI models — is growing at roughly 30% per year, compared to 9% for conventional servers. Bain & Company estimates that total AI data center power demand hit 21 gigawatts (GW) in 2025, a nearly fivefold increase from 4.3 GW in 2020, and projects it could reach 68 GW by 2027 and 327 GW by 2030.
To put that in perspective, 327 GW is roughly equivalent to the entire installed electricity generation capacity of India. Meeting that demand would require 75 to 100 GW of new generating capacity globally by the early 2030s, according to Morgan Stanley. The constraint is not just building power plants — it is the years-long permitting and interconnection process. Bringing new power generation, transmission, and distribution online in the United States typically takes four years or longer.
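Those headline figures imply steep compound rates. As a rough sanity check, the sketch below recomputes the growth rates from the numbers quoted above; the `cagr` helper is just the standard compound-annual-growth formula, not drawn from any cited source.

```python
# Sanity-check the compound growth rates implied by the figures above.
# All inputs are taken from the article text.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Global data center electricity: 415 TWh (2024) -> 980 TWh (2030, projected)
power_growth = cagr(415, 980, 6)     # ~15% per year

# AI data center power demand (Bain): 4.3 GW (2020) -> 21 GW (2025)
ai_2020_2025 = cagr(4.3, 21, 5)      # ~37% per year, a ~4.9x increase overall

# Projected: 21 GW (2025) -> 327 GW (2030)
ai_2025_2030 = cagr(21, 327, 5)      # ~73% per year implied

print(f"Data center electricity CAGR 2024-2030: {power_growth:.1%}")
print(f"AI power CAGR 2020-2025: {ai_2020_2025:.1%} ({21 / 4.3:.1f}x overall)")
print(f"Implied AI power CAGR 2025-2030: {ai_2025_2030:.1%}")
```

The first rate lands near the 15% annual growth the IEA reports, which suggests the 2024 and 2030 figures are at least internally consistent; the implied 2025-2030 rate for AI-specific power is far steeper, which is the crux of the buildout problem.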
The Memory Wall
While headlines have focused on GPU shortages, the binding constraint on AI chip production has shifted. According to Epoch AI, the four largest AI chip designers — NVIDIA, Google, AMD, and Amazon — consumed over 90% of global CoWoS advanced packaging capacity and High Bandwidth Memory (HBM) supply by value in 2025, but only about 12% of advanced logic die production. The bottleneck is no longer the GPU chip itself. It is the specialized memory and packaging that surround it.
HBM — the high-speed DRAM stacked vertically alongside AI processors — is manufactured by just three companies: SK Hynix, Samsung, and Micron. All three have sold out their entire 2026 production. "We have already sold out our entire 2026 HBM supply," SK Hynix CFO Kim Jae-joon told analysts, adding that shortages may persist until late 2027. Samsung has raised HBM prices in its 2026 contracts by percentages in the high teens to low twenties.
The mismatch is stark: AI infrastructure demand for HBM is growing at over 80% per year, while total DRAM supply grows at roughly 16% annually. Data centers are projected to consume 70% of all memory chips produced globally in 2026. TSMC's CoWoS advanced packaging capacity — the technology that bonds HBM to GPU dies — remains sold out through 2026, and industry analysts say these structural bottlenecks will shape pricing, lead times, and availability well into 2027.
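Compounding those two growth rates shows how quickly the gap widens. The illustration below uses normalized index values (both sides start at 100), not actual chip volumes, so only the ratio is meaningful.

```python
# Illustrative projection of the HBM demand/supply mismatch described above:
# demand growing ~80%/yr vs total DRAM supply growing ~16%/yr.
# Index values are normalized to 100 in year 0; only the ratio matters.

DEMAND_GROWTH = 0.80   # AI infrastructure demand for HBM, per year
SUPPLY_GROWTH = 0.16   # total DRAM supply, per year

demand = supply = 100.0
for year in range(1, 4):
    demand *= 1 + DEMAND_GROWTH
    supply *= 1 + SUPPLY_GROWTH
    print(f"Year {year}: demand index {demand:.0f}, "
          f"supply index {supply:.0f}, ratio {demand / supply:.2f}")
```

After just three years of those rates, demand outruns supply by a factor of nearly four, which is why suppliers can sell out production a year or more in advance.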
The Trillion-Dollar Buildout
Hyperscaler capital expenditure on AI infrastructure has been escalating at a pace with few historical parallels. Combined capital spending by Microsoft, Amazon, Alphabet, and Meta is expected to exceed $710 billion in 2026 alone, with much of it directed at AI infrastructure. Across the industry, Goldman Sachs estimates global AI-related infrastructure spending could approach $1 trillion over the next several years, while hyperscalers are projected to invest $7 trillion globally in data center infrastructure through 2030.
Fink has gone further, arguing that the scarcity itself will birth a new financial market. "A new asset class will be buying futures of compute," he said, envisioning contracts that guarantee future access to AI processing capacity — analogous to how airlines lock in fuel prices through oil futures. BlackRock's thesis, backed by a $14 trillion asset management platform, is that compute will become a tradeable commodity.
Nebius Group, the AI cloud company led by Yandex co-founder Arkady Volozh, has positioned itself squarely in this supply gap. The company reported $529.8 million in fiscal 2025 revenue, a 479% year-over-year increase. A $2 billion strategic investment from NVIDIA in March 2026 gave Nebius "Preferred Provider" status, ensuring hardware priority during chip shortages. Meta followed with a $27 billion deal, including $12 billion in dedicated AI infrastructure capacity beginning in early 2027.
Who Pays the Bill
The infrastructure buildout is not cost-free for ordinary consumers. The clearest evidence comes from PJM Interconnection, the regional transmission organization that manages the electric grid for 65 million people across 13 states and Washington, D.C. In PJM's capacity auctions, the clearing price surged from $28.92 per megawatt-day for the 2024-2025 delivery year to $269.92 for 2025-2026, an 833% increase driven largely by data center demand, and then to $329.17 for 2026-2027.
Data centers were responsible for 63% of the price increase in the 2025-2026 auction, translating to $9.3 billion in costs absorbed by all ratepayers — residential, commercial, and industrial alike. In Washington, D.C., Pepco residential customers saw bills increase by an average of $21 per month starting in June 2025; consumer advocates estimate roughly half of that increase traces to data center-driven capacity costs. In Ohio, the average monthly increase was about $16.
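For readers unfamiliar with capacity-market units: a clearing price quoted in dollars per megawatt-day converts to total annual dollars by multiplying by the cleared capacity and the number of days in the delivery year. The sketch below uses the clearing prices quoted above, but the 135 GW of cleared capacity is an illustrative assumption, not a figure from PJM's auction results.

```python
# Convert a capacity clearing price ($/MW-day) into annual dollars.
# Clearing prices are from the article; the cleared-capacity figure
# is a hypothetical round number for illustration only.

def annual_capacity_cost(price_per_mw_day, cleared_mw, days=365):
    """Total yearly capacity payment at a given clearing price."""
    return price_per_mw_day * cleared_mw * days

CLEARED_MW = 135_000  # assumed ~135 GW of cleared capacity (illustrative)

before = annual_capacity_cost(28.92, CLEARED_MW)   # ~$1.4B per year
after = annual_capacity_cost(329.17, CLEARED_MW)   # ~$16.2B per year

print(f"At $28.92/MW-day:  ${before / 1e9:.1f}B per year")
print(f"At $329.17/MW-day: ${after / 1e9:.1f}B per year")
```

Even under this simplified assumption, the order of magnitude makes clear how a per-megawatt-day price jump cascades into billions of dollars spread across ratepayers' bills.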
Beyond utility rates, taxpayers subsidize data centers through tax exemptions. Texas is losing an estimated $1 billion in fiscal 2025 revenue to data center subsidies, while Virginia's data center sales-tax exemption cost $1.6 billion in the same period. A Good Jobs First analysis found that these subsidies represent some of the most expensive industry-specific tax breaks in any state.
The political backlash has been swift. On March 4, 2026, several major data center developers signed the White House's Ratepayer Protection Pledge, committing to cover the full cost of new electric generation needed for their facilities. Twenty-seven states have advanced legislation requiring data center developers to pay for grid upgrades they trigger, with California, Ohio, and Utah already enacting such laws.
Historical precedent suggests these costs may not settle equitably. During the railroad boom of the 1860s-1880s, federal land grants and state subsidies flowed to private companies whose infrastructure later became toll-charging monopolies. The electrification of American industry in the early 20th century was funded by a mix of ratepayer-financed utilities and municipal power systems. In each case, the public bore significant upfront costs for infrastructure whose profits accrued primarily to private shareholders.
The Concentration Problem
The AI chip supply chain is concentrated to a degree that would alarm any risk analyst. TSMC holds 72% of the global foundry market, with front-end capacity at 3nm and 2nm nodes effectively fully booked by Apple and a handful of major customers. High-value AI chips drive roughly half of TSMC's total revenue while representing less than 0.2% of unit volume. The company's continued concentration in Taiwan presents an obvious geopolitical risk.
Diversification efforts are underway but limited. Apple has entered early-stage discussions with Intel and Samsung about manufacturing some M-series processors outside TSMC. Samsung signed a $16.5 billion foundry contract with Tesla in July 2025 for next-generation AI chips using 2nm technology — the largest long-term foundry deal ever signed with a single client. Intel's foundry ambitions, centered on its 18A process node, have shown promising early results but face significant operating losses and difficulty attracting external customers.
The comparison to pre-2022 neon gas dependency on Russia is instructive. Before Russia's invasion of Ukraine, two Ukrainian companies supplied roughly half the world's semiconductor-grade neon. The disruption forced rapid — and expensive — supply chain restructuring. TSMC's dominance in advanced AI chip manufacturing represents a similar single-point-of-failure risk, but one that is orders of magnitude larger in economic consequence and far more difficult to replicate.
Who Gets Crowded Out
As hyperscalers absorb the lion's share of available compute, chips, and power, smaller players are being squeezed. One forecast projects that as OpenAI, Google, and Meta consolidate compute resources, the rest of the U.S. market's share of compute will fall from 18% to 9%. Running a single large-scale AI-driven drug discovery campaign now costs $2 million to $10 million in compute alone; materials science screening runs $500,000 to $3 million.
Cloud providers offer research credit programs — AWS, Google Cloud, and Microsoft Azure provide grants ranging from $10,000 to $1 million for academic applicants — and the U.S. National Science Foundation allocated $140 million to AI-for-science initiatives in fiscal 2025. But these sums are dwarfed by the hundreds of billions flowing to commercial AI infrastructure.
If power and compute constraints persist, the historical pattern of resource-constrained technology transitions offers guidance on who absorbs the pain. During the 2021 chip shortage, automakers — who accounted for a small fraction of total semiconductor revenue — were deprioritized by foundries in favor of higher-margin consumer electronics customers. Production lines idled, and an estimated $210 billion in revenue was lost across the auto industry. A similar dynamic could play out in AI: applications with the highest commercial returns — consumer products, advertising optimization, enterprise automation — would claim available resources first, while drug discovery, climate modeling, and public-sector research would be deprioritized.
The Steelman Case for Skepticism
There is a reasonable case that the shortage warnings from CEOs like Fink and Volozh are at least partly self-serving. BlackRock manages $14 trillion in assets and is actively building an AI infrastructure investment thesis; scarcity narratives increase the perceived value of those assets. Nebius's 479% revenue growth and its $2 billion NVIDIA investment are both predicated on sustained demand exceeding supply. If that scarcity perception fades, so does the valuation premium.
The broader evidence supports some skepticism. A study published by the National Bureau of Economic Research in February 2026, surveying 6,000 CEOs and executives, found that the vast majority see little operational impact from AI, with average usage amounting to about 1.5 hours per week. Deutsche Bank analysts warned that "AI redundancy washing will be a significant feature of 2026," noting that companies are attributing layoffs to AI that actually stem from other business pressures.
Against this, the physical constraints are real and measurable. HBM is sold out. CoWoS packaging capacity is booked. PJM capacity prices have increased tenfold. These are not narrative constructs — they are market prices and supply chain data. The scarcity may serve corporate interests, but it also exists independently of those interests.
The Energy Source Scramble
Meeting projected AI power demand requires not just more electricity, but the right kind delivered on the right timeline. In 2024, natural gas supplied 40% of data center electricity, with renewables providing 24%. The IEA projects that natural gas and coal together will meet over 40% of additional data center electricity demand through 2030, with renewables accounting for 40% of new capacity built to support the sector.
Nuclear power has attracted significant attention from tech companies. Microsoft, Google, and Amazon have collectively signed contracts for more than 10 GW of possible new nuclear capacity. Goldman Sachs Research projects that three nuclear plants could be brought online by 2030. But the economics are challenging: nuclear costs between $6,417 and $12,681 per kilowatt of capacity, compared to $1,290 per kilowatt for natural gas. Construction timelines for new nuclear facilities stretch to a decade or more.
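Those per-kilowatt figures make the cost gap concrete. The sketch below applies the article's capital-cost estimates to the roughly 10 GW of contracted nuclear capacity; it covers construction cost only, ignoring fuel, operations, and the very different capacity factors of nuclear and gas plants.

```python
# Rough capital-cost comparison using the per-kilowatt figures above,
# applied to the ~10 GW of nuclear capacity tech companies have contracted.
# Capital cost only; fuel, operations, and capacity factors are ignored.

NUCLEAR_LOW, NUCLEAR_HIGH = 6_417, 12_681  # $/kW of capacity (article figures)
GAS = 1_290                                # $/kW of capacity (article figure)
CAPACITY_KW = 10e6                         # 10 GW expressed in kW

nuclear_low = NUCLEAR_LOW * CAPACITY_KW    # ~$64B
nuclear_high = NUCLEAR_HIGH * CAPACITY_KW  # ~$127B
gas = GAS * CAPACITY_KW                    # ~$13B

print(f"10 GW nuclear: ${nuclear_low / 1e9:.0f}B to ${nuclear_high / 1e9:.0f}B")
print(f"10 GW gas:     ${gas / 1e9:.0f}B")
```

On capital cost alone, the same 10 GW costs five to ten times more as nuclear than as gas, which helps explain why gas is expected to carry the near-term load despite tech companies' nuclear commitments.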
Small modular reactors (SMRs) have been proposed as a faster alternative, but none have yet reached commercial operation at scale. The practical reality is that natural gas will carry the near-term load, renewables will grow steadily, and nuclear will remain a small fraction of the AI power mix through the end of the decade.
What Comes Next
The AI infrastructure shortage is not a single crisis but a cascade of interlocking constraints — each one amplifying the others. Chip shortages raise prices for compute. Compute shortages increase demand for power. Power shortages slow the construction of new data centers. And the capital required to break through these bottlenecks is being concentrated in fewer and fewer hands.
Whether this resolves through market forces, government intervention, or some combination depends on decisions being made now — in state legislatures weighing ratepayer protections, in boardrooms allocating capital budgets, and in foundry cleanrooms trying to push the limits of physics. The stakes extend well beyond the AI industry. The same power grids that serve data centers also serve hospitals, schools, and homes. The same semiconductor supply chains feed automotive, medical device, and defense applications. The allocation choices made in the next five years will shape not just who gets to build AI, but who gets left behind.
Sources (24)
- [1] Larry Fink Warns of Structural Compute Scarcity as AI Demand Outpaces Global Supply (blockonomi.com)
  BlackRock CEO Larry Fink stated 'We're short power, we're short compute, we're short chips,' and predicted compute futures as a new asset class.
- [2] Energy demand from AI – Energy and AI – Analysis (iea.org)
  Electricity consumption from data centres amounted to around 415 TWh in 2024, about 1.5% of global electricity consumption, growing at 15% per year.
- [3] What we know about energy use at U.S. data centers amid the AI boom (pewresearch.org)
  U.S. data centers consumed 183 TWh of electricity in 2024, accounting for more than 4% of total electricity consumption.
- [4] Gartner Says Electricity Demand for Data Centers to Grow 16% in 2025 and Double by 2030 (gartner.com)
  Worldwide data center electricity consumption projected to rise from 448 TWh in 2025 to 980 TWh by 2030.
- [5] How Can We Meet AI's Insatiable Demand for Compute Power? (bain.com)
  AI data center demand reached 21 GW in 2025, could reach 68 GW by 2027 and 327 GW by 2030. Power supply may be the most challenging constraint.
- [6] Energy Markets Race to Solve the AI Power Bottleneck (morganstanley.com)
  75-100 GW of new electricity generating capacity needed for digital demands by early 2030s. Hyperscalers could spend $1T+ in 2025-26.
- [7] Advanced packaging and HBM, not logic dies, were the bottlenecks on AI chip production in 2025 (epoch.ai)
  The four largest AI chip designers consumed over 90% of global CoWoS packaging capacity and HBM supply but only 12% of advanced logic production.
- [8] The AI Memory Supercycle (introl.com)
  HBM capacity sold out through 2026 across all major suppliers. SK Hynix CFO confirmed entire 2026 supply is sold. AI demand growing at 80% vs 16% supply growth.
- [9] A deeper look at the tightened chipmaking supply chain in 2026 (tomshardware.com)
  Nobody's scaling up, says analyst, as industry remains conservative on capacity despite surging demand.
- [10] Inside the AI Bottleneck: CoWoS, HBM, and 2-3nm Capacity Constraints Through 2027 (fusionww.com)
  Advanced packaging and HBM constraints are structural limits that will shape pricing, lead times, and availability well into 2027.
- [11] BlackRock's Larry Fink Says AI Is Creating a New Trillion Dollar Asset Class (finance.yahoo.com)
  Fink envisions compute futures as a tradeable commodity. Microsoft, Amazon, Alphabet, and Meta expected to spend $710B+ combined in 2026 CapEx.
- [12] Nebius Group (NBIS): The Rise of the AI Neocloud Powerhouse (financialcontent.com)
  Nebius reported $529.8M in FY2025 revenue (479% YoY growth). Received $2B NVIDIA investment and $27B Meta deal for AI infrastructure.
- [13] Projected data center growth spurs PJM capacity prices by factor of 10 (ieefa.org)
  PJM capacity prices surged from $28.92/MW-day to $329.17/MW-day. Data centers responsible for 63% of the price increase.
- [14] PJM $100B Rate Shock: Data Centers vs Ratepayers (introl.com)
  Data centers added $9.3B in capacity costs absorbed by 67 million ratepayers. D.C. residential bills up $21/month, Ohio up $16/month.
- [15] How Data Centers Are Endangering State Budgets (goodjobsfirst.org)
  Texas losing $1B in FY2025 to data center subsidies. Virginia's sales-tax exemption cost $1.6B.
- [16] State Data Center Legislation in 2026 Tackles Energy and Tax Issues (multistate.us)
  27 states advancing legislation requiring data center developers to cover energy costs. White House Ratepayer Protection Pledge signed March 4, 2026.
- [17] Samsung vs. TSMC vs. Intel: Who's Winning the Foundry Market? (patentpc.com)
  TSMC holds 72% foundry market share. Samsung signed $16.5B Tesla foundry deal. Intel's 18A shows promise but faces operating losses.
- [18] Apple considers Intel and Samsung to diversify chip manufacturing away from TSMC (9to5mac.com)
  Apple exploring early-stage discussions with Intel and Samsung for M-series chip manufacturing diversification.
- [19] Compute Forecast — AI 2027 (ai-2027.com)
  Rest of US compute share falls from 18% to 9% as OpenAI, Google, and Meta consolidate resources.
- [20] AI for scientific discovery costs in 2026: platform licensing, compute, and integration economics (sustainableatlas.org)
  Single AI drug discovery campaign costs $2-10M in compute. Materials science screening runs $500K-$3M. NSF allocated $140M for AI-for-science in FY2025.
- [21] Thousands of CEOs admit AI had no impact on employment or productivity (fortune.com)
  NBER study of 6,000 CEOs found most see little AI impact on operations, with usage averaging 1.5 hours per week.
- [22] AI shortages to define 2026 for markets (procurementpro.com)
  Deutsche Bank analysts warned 'AI redundancy washing will be a significant feature of 2026' as companies blame AI for cuts with other causes.
- [23] Data centre electricity use surged in 2025 (iea.org)
  Natural gas supplied 40% of data center electricity in 2024, renewables 24%. Gas and coal expected to meet over 40% of additional demand through 2030.
- [24] Is nuclear energy the answer to AI data centers' power consumption? (goldmansachs.com)
  Tech companies signed contracts for 10+ GW of new nuclear capacity. Nuclear costs $6,417-$12,681/kW vs $1,290/kW for gas. Three plants could come online by 2030.