AI's Real Bottleneck: Chips, Memory, and Power Are All Hitting Walls
SemiAnalysis CEO Dylan Patel breaks down the triple threat choking AI compute scaling.
The AI scaling dream is running headfirst into physics. In a wide-ranging interview on the Dwarkesh Podcast, SemiAnalysis CEO Dylan Patel laid out the three bottlenecks strangling AI compute growth: logic (the chips themselves), memory, and power.
Nvidia isn't sitting still. The company locked down TSMC's N3 (3-nanometer) allocation early, securing its manufacturing pipeline while competitors scramble for capacity. It's a classic Nvidia move — controlling supply before demand peaks.
Here's a wild stat: an H100 GPU is worth more today than it was three years ago. In an industry where hardware typically depreciates fast, that's almost unheard of. It speaks to just how constrained AI compute supply remains relative to explosive demand.
Patel's broader point is clear. Throwing money at AI isn't enough when the physical infrastructure can't keep pace. The semiconductor supply chain is now AI's most critical dependency.