Paper Reading & Discussion: Finding Skill Neurons in Pre-trained Transformer-based Language Models
46:38
Paper Reading & Discussion: Quantifying Memorization Across Neural Language Models
37:55
Paper Reading & Discussion: Knowledge Neurons in Pretrained Transformers
27:14
Transformers (how LLMs work) explained visually | DL5
2:13:37
Paper Reading & Discussion: Pre-train, Prompt, and Predict: Survey on Prompting Methods in NLP (All)
31:10
Paper Reading & Discussion: Emergent and Predictable Memorization in Large Language Models
1:23:52
[Clustering And Dimensionality Reduction] Week 21 - DenMune: Density peak based clustering
57:45
Visualizing transformers and attention | Talk for TNG Big Tech Day '24
23:51