Jeff Dean Opens Up on Google's AI Stack, TPUs, and Coding Agents
Google's Chief AI Scientist discusses Search evolution, sparse models, TPU co-design, and the push for efficient AI.
Google Chief AI Scientist Jeff Dean sat down for an extensive Q&A covering the full arc of his work — from rewriting Google's search infrastructure in the early 2000s to the company's current AI ambitions.
Dean discussed the revival of sparse trillion-parameter models, an approach that activates only a fraction of a massive network for each input, balancing raw capability with computational cost. It's a bet that brute-force scaling isn't the only path forward.
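For readers unfamiliar with the idea, the sparse activation Dean describes is often implemented as mixture-of-experts routing: a small gating network scores many "expert" sub-networks and only the top few actually run for a given token. The sketch below is purely illustrative, with made-up shapes and names; it is not Google's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total experts in the layer
TOP_K = 2         # experts actually activated per token
D_MODEL = 16      # token embedding size

# Each "expert" is just a small dense layer in this toy version.
expert_weights = rng.normal(size=(NUM_EXPERTS, D_MODEL, D_MODEL))
gate_weights = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def sparse_layer(token: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts and mix their outputs."""
    logits = token @ gate_weights          # score every expert
    top = np.argsort(logits)[-TOP_K:]      # indices of the k best experts
    # Softmax over only the selected experts' scores.
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()
    # Only TOP_K of NUM_EXPERTS matmuls execute -- that is the sparsity.
    return sum(w * (token @ expert_weights[e]) for w, e in zip(weights, top))

token = rng.normal(size=D_MODEL)
out = sparse_layer(token)
print(out.shape)  # (16,)
```

The payoff is that parameter count and per-token compute decouple: the layer holds eight experts' worth of weights, but each token pays for only two.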
He also dove into Google's TPU strategy, emphasizing the co-design approach where hardware and models evolve together rather than in isolation. That tight feedback loop between silicon and software remains a core Google advantage.
Coding agents got airtime too. Dean addressed where autonomous programming tools are headed and how Google is positioning itself in an increasingly crowded field.
The throughline: efficiency matters as much as capability now.