Week 7 - Lab 2 (Vision Transformers - ViT)
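The lab's own notebook is not reproduced here. As a rough illustration of the model this lab covers, below is a minimal Vision Transformer sketch; PyTorch and the hyperparameters (patch size 4 on 32×32 inputs, 6 encoder layers) are assumptions for the example, not the lab's actual code or settings.

```python
# Minimal Vision Transformer (ViT) sketch -- illustrative only, not the lab's code.
# The image is split into fixed-size patches, each patch is linearly embedded,
# a learnable [CLS] token and position embeddings are added, and a standard
# Transformer encoder produces the feature used for classification.
import torch
import torch.nn as nn


class ViT(nn.Module):
    def __init__(self, image_size=32, patch_size=4, in_channels=3,
                 embed_dim=192, depth=6, num_heads=3, num_classes=10):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2

        # Patch embedding: a conv with kernel = stride = patch_size is
        # equivalent to flattening each patch and applying a linear layer.
        self.patch_embed = nn.Conv2d(in_channels, embed_dim,
                                     kernel_size=patch_size, stride=patch_size)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, embed_dim))

        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, dim_feedforward=4 * embed_dim,
            batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        # x: (B, C, H, W) -> patch tokens: (B, num_patches, embed_dim)
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        encoded = self.encoder(tokens)
        return self.head(encoded[:, 0])  # classify from the [CLS] token


if __name__ == "__main__":
    model = ViT()
    logits = model(torch.randn(2, 3, 32, 32))
    print(logits.shape)  # torch.Size([2, 10])
```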