Mistral 8x7B Part 1- So What is a Mixture of Experts Model?

6:11
Mistral 8x7B Part 2- Mixtral Updates

17:14
Unlock Open Multimodality with Phi-4

28:01
Understanding Mixture of Experts

11:48
'My jaw is dropped': Canadian official's interview stuns Amanpour

13:49
Tülu 3 from AI2: Full open-source fine-tuning recipe for LLMs

11:10
Qwen QwQ 32B - The Best Local Reasoning Model?

19:58
DeepSeek is a Game Changer for AI - Computerphile

1:26:21