Anthropic has hit a $30 billion annualized revenue run rate, surpassing OpenAI on that metric, while simultaneously closing a multi-gigawatt compute deal with Google and Broadcom. That scale of infrastructure commitment is not a footnote. It signals where the capital is concentrating and how long the runway needs to be.
Both Anthropic and OpenAI are carrying enormous model-training costs while using accounting methods that exclude training expenses, which makes near-term profitability look closer than it is. The piece is worth reading in full for how those accounting choices obscure the real economics of frontier AI development.
On the product side: Google is commercializing Gemma 4 through an on-device dictation app, Meta is preparing a partly proprietary model release, and internal token-maxing practices at major labs are visibly reshaping engineering culture. Meta's shift from open to partly closed is the thread worth pulling.
[WATCH ON YOUTUBE →]