TMTB Morning wrap: Part 1
QQQs +45bps. Getting some thoughts out on MSFT and META early and will continue to jam on everything else and send out Part 2 a bit later…
In terms of impact on semis, investors should leave MSFT/META feeling a bit more positive on spend (at least for a day, ha!). We also got news of Softbank investing as much as $25B into OpenAI. Also helping semis today: NOW’s miss - while it doesn’t seem demand-driven - will potentially shift some flows back to semis vs. software, and CLS also had a pretty nice beat.
MSFT guided Q3 and Q4 capex to remain at similar levels to Q2 ($22.6B, or a >$90B run rate) and expects FY26 capex to grow, albeit at a slower pace than FY25 (~55%). Nadella described a shift in capex from infra to more “short-lived assets more correlated to revenue growth,” which reads well for semis like NVDA as that likely means GPUs and servers. Nadella didn’t seem concerned about AI scaling law constraints, saying MSFT has been seeing >10x improvements in every model generation, and mgmt feels training & inference cost efficiencies are a net positive that will accelerate the # of new AI apps written & queries consumed and drive exponentially more demand.
As expected, Zuck sounded bullish on the call, saying they will continue to invest “hundreds of billions of dollars” over the long-term (good for ASICs - MRVL/AVGO +4% early…side note: one thing favoring ASICs vs NVDA NT is that the White House export controls affect NVDA much more, given ASICs mainly sell to the big hyperscalers). On Deepseek, Zuck reiterated the benefits of open-source LLMs and noted that costs can be driven down, while acknowledging that Deepseek has made a number of technical advancements that META is still digesting to see how to implement them into Llama. He talked up 2025 as a pivotal year for them and expects Llama 4 to become the most advanced and most used model this year.