Paper Reading & Discussion: Knowledge Neurons in Pretrained Transformers
40:39
Paper Reading & Discussion: Finding Skill Neurons in Pre-trained Transformer-based Language Models
38:46
Trying to understand PiPPy and Pipeline Parallelism API in PyTorch
24:06
Paper Reading & Discussion: Pre-train, Prompt, and Predict: Survey on Prompting Methods in NLP (P4)
15:51
Adapters | Simple, Scalable Adaptation for Neural Machine Translation | PEFT Methods
36:14
Paper Reading & Discussion: LoRAMoE: Alleviate World Knowledge Forgetting in LLMs via MoE-Style Plugin
14:16
Adapters | Exploring VGLM Via Parameter-Efficient Transfer Learning | PEFT Methods
35:19
MUHAMMAD FADIL (21129249)_READING ASSIGNMENT: ENGLISH-LANGUAGE ARTICLE 2
40:46