Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - 693
1:19:36
Building Real-World LLM Products with Fine-Tuning and More with Hamel Husain - 694
55:41
Language Understanding and LLMs with Christopher Manning - 686
31:51
MAMBA from Scratch: Neural Nets Better and Faster than Transformers
1:27
Pie petal plot in Python or R
1:03:40
State Space Models w/ Albert Gu & Karan Goel (Cartesia AI)
39:24
AI Snake Oil—A New Book by 2 Princeton University Computer Scientists
57:45
Visualizing transformers and attention | Talk for TNG Big Tech Day '24
26:10