REPA Representation Alignment for Generation: Training Diffusion Transformers Is Easier Than You ...
LLM Lecture: A Deep Dive into Transformers, Prompts, and Human Feedback (1:31:09)
MAMBA and State Space Models explained | SSM explained (22:27)
My PhD Journey in AI / ML (while doing YouTube on the side) (37:18)
Training large language models to reason in a continuous latent space – COCONUT Paper explained (9:52)
Transformer LLMs are Turing Complete after all !? (28:47)
The Breakthrough Behind Modern AI Image Generators | Diffusion Models Part 1 (24:23)
We had Image Gen copying LLM... and now the REVERSE?? [DiffusionLM] (9:01)