TMTB: Gavin Baker ILTB Podcast Key Quotes
Full link here… Lots of good stuff in this one, so I'm including a lot of the quotes.
1. Token costs, low-cost producers, and Google vs. Nvidia/Blackwell/GB300
“Google for sure has this temporary advantage right now from a pre-training perspective. I think it’s also important that they’ve been the lowest cost producer of tokens.”
“AI is the first time in my career as a tech investor that being the low-cost producer has ever mattered. Apple is not worth trillions because they’re the low-cost producer of phones. Microsoft is not worth trillions because they’re the low-cost producer of software. Nvidia is not worth trillions ‘cause they’re the low-cost producer of AI accelerators. It’s never mattered.”
“What Google has been doing as the low-cost producer is they have been, I would say, sucking the economic oxygen out of the AI ecosystem which is an extremely rational strategy for them… let’s make life really hard for our competitors.”
“The GB300 is a great chip. It is drop-in compatible in every way with those GB200 racks… You’re going to put those GB300s in and then the companies that use the GB300s, they are going to be the low-cost producer of tokens, particularly if you’re vertically integrated.”
“If you have a decisive cost advantage and you’re Google and you have search and all these other businesses, why not run AI at a negative 30% margin? It is by far the rational decision… You take the economic oxygen out of the environment… and then on the other side of that maybe have an extremely dominant share position.”
“All that calculus changes once Google is no longer the low-cost producer, which I think will be the case… and I do think it’s very interesting like the strategic and economic calculations between the players. I’ve never seen anything like it.”
2. Scaling laws, reasoning, Gemini 3, and the trajectory of AI progress
“I do think Gemini 3 was very important because it showed us that scaling laws for pre-training are intact. They stated that unequivocally and that’s important because no one on planet Earth knows how or why scaling laws for pre-training work.”