Gradient descent - with a simple example (10:55)
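The first item covers gradient descent. A minimal sketch of the idea (the objective, learning rate, and iteration count here are illustrative assumptions, not taken from the video):

```python
# Minimal gradient descent on f(x) = (x - 3)^2, whose gradient is 2(x - 3).
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)   # step against the gradient direction
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges near the true minimum at x = 3
```

Each step moves opposite the gradient, so the iterate contracts toward the minimizer as long as the learning rate is small enough.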
Newton's method - explained | Newton-Raphson method | find roots and minimum value (20:00)
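The Newton-Raphson item covers both uses named in the title: finding a root of f, and finding a minimum by applying the same update to f'. A sketch with illustrative test functions (not from the video):

```python
# Newton-Raphson iteration: x <- x - f(x) / f'(x).
def newton(f, df, x0, steps=20):
    x = x0
    for _ in range(steps):
        x -= f(x) / df(x)
    return x

# Root finding: f(x) = x^2 - 2 has root sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)

# Minimization: minimize g(x) = (x - 3)^2 by finding the root of g'(x).
minimum = newton(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0)
print(round(root, 6), round(minimum, 6))
```

Because the quadratic's derivative is linear, the minimization case converges in a single Newton step.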
Stochastic gradient descent (SGD) vs mini-batch GD | iterations vs epochs - Explained (31:35)
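The SGD item distinguishes iterations (one mini-batch update) from epochs (one full pass over the data). A sketch on noise-free 1-D least squares; the data, batch size, and learning rate are illustrative assumptions:

```python
import random

# Mini-batch SGD on 1-D least squares: minimize mean (w*x_i - y_i)^2.
# Data generated with true weight 2.0.
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [2.0 * x for x in xs]

w, lr, batch = 0.0, 0.5, 10
for epoch in range(20):                      # one epoch = one full pass over the data
    idx = list(range(len(xs)))
    random.shuffle(idx)                      # reshuffle each epoch
    for start in range(0, len(idx), batch):  # each mini-batch update = one iteration
        b = idx[start:start + batch]
        grad = sum(2 * (w * xs[i] - ys[i]) * xs[i] for i in b) / len(b)
        w -= lr * grad
print(round(w, 4))  # recovers the true weight 2.0
```

Batch size 1 would be plain SGD; batch size equal to the dataset would be full-batch gradient descent, with one iteration per epoch.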
Bayesian statistics - the basics (15:38)
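The Bayesian basics item can be illustrated with the standard conjugate coin-flip example: a Beta prior on the heads probability updated by observed counts. The prior and counts here are illustrative assumptions:

```python
# Conjugate Bayesian update for a coin: Beta(a, b) prior on P(heads),
# observing flips multiplies in a binomial likelihood, giving
# a Beta(a + heads, b + tails) posterior.
def beta_update(a, b, heads, tails):
    return a + heads, b + tails

a, b = beta_update(1, 1, heads=7, tails=3)   # uniform Beta(1, 1) prior, 10 flips
posterior_mean = a / (a + b)                 # 8 / 12
print(round(posterior_mean, 4))  # 0.6667
```

The posterior mean sits between the prior mean (0.5) and the observed frequency (0.7), shrinking toward the data as more flips arrive.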
Optimization in Machine Learning - Second order methods - Gauss-Newton (15:03)
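The Gauss-Newton item concerns nonlinear least squares, where the Hessian is approximated by J^T J. A one-parameter sketch fitting y = exp(a*x); the data and starting guess are illustrative assumptions:

```python
import math

# Gauss-Newton for one-parameter nonlinear least squares: fit y = exp(a*x).
# Noise-free data generated with true a = 1.0.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(x) for x in xs]

a = 0.5
for _ in range(15):
    r = [y - math.exp(a * x) for x, y in zip(xs, ys)]  # residuals
    J = [x * math.exp(a * x) for x in xs]              # Jacobian of the model in a
    # Gauss-Newton step: delta = (J^T J)^{-1} J^T r  (scalars here)
    a += sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
print(round(a, 6))  # recovers a = 1.0
```

Because the residuals vanish at the solution, J^T J matches the true Hessian there and convergence is effectively quadratic near a = 1.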
Multinomial logistic regression | softmax regression | explained (34:08)
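The softmax regression item rests on the softmax function, which maps K real scores to a probability distribution over K classes. A sketch with illustrative scores:

```python
import math

# Softmax: probabilities proportional to exp(score).
# Subtracting the max score first keeps exp() numerically stable.
def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])  # sums to 1; largest score gets largest probability
```

In multinomial logistic regression the scores are linear functions of the input, one per class, and the model predicts the class with the highest softmax probability.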
Recurrent neural network (RNN) - explained super simple (12:39)
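The RNN item centers on one recurrence: the hidden state at each step is a function of the current input and the previous hidden state, h_t = tanh(Wx*x_t + Wh*h_{t-1} + b). A scalar sketch with illustrative weights and input sequence:

```python
import math

# One step of a vanilla RNN with scalar input and scalar hidden state:
# h_t = tanh(wx * x_t + wh * h_{t-1} + b).
def rnn_step(x, h, wx=0.5, wh=0.8, b=0.0):
    return math.tanh(wx * x + wh * h + b)

h = 0.0
for x in [1.0, 0.5, -1.0]:   # unroll the recurrence over a 3-step sequence
    h = rnn_step(x, h)
print(round(h, 4))
```

The same weights are reused at every step, which is what lets an RNN process sequences of arbitrary length with a fixed number of parameters.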