Fourteen posts map a day of scale, tools, policy, and culture. Epoch data shows the US at 68 percent of global AI compute, with China spending aggressively but constrained by export rules and only a limited opening for Nvidia's H20 and AMD's MI308, preserving the US advantage. Dwarkesh Patel revisits Richard Sutton, framing LLM pretraining as a useful prior that can bootstrap reinforcement learning and evolve toward continual learning.
On products, Perplexity announces Comet Plus with major publishers for Pro and Max, Grok Expert now inlines images, Airweave ships an open-source bi-temporal knowledge base for real-time agents, and Opencode with GLM 4.6 offers a low-cost coding TUI via Hugging Face.
Risk and data quality stay in focus as Chamath warns that SEO gaming can poison training sets, while a thread on TrueType shows how a font VM enables both exploits and playful projects like Fontemon and llama.ttf. Marc Andreessen points to sustained high rates alongside a cooler job market. OpenAI's $6.6B secondary sale prompts housing chatter. A fatal shooting tied to public drug use reignites policy debate. Text-to-video can now spin up full cartoon episodes, stirring copyright concerns. Crypto adds $130B, and Patrick Collison spotlights the Woolworth Building's history.