Deep Learning DIY by Marc Lelarge https://twitter.com/marc_lelarge
- notebook: https://dataflowr.github.io/website/modules/2b-automatic-differentiation/
0:00 Recap
0:40 A simple example (more in the practicals)
3:44 PyTorch tensor: requires_grad field
6:44 PyTorch backward function
9:05 The chain rule on our example
16:00 Linear regression
18:00 Gradient descent with NumPy...
27:30 ... with PyTorch tensors
31:30 Using autograd
34:35 Using a neural network (linear layer)
39:50 Using a PyTorch optimizer
44:00 Backprop algorithm: how automatic differentiation works.
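The 44:00 segment explains how automatic differentiation works via the backward pass and the chain rule. As a companion, here is a minimal pure-Python sketch of reverse-mode autodiff; the `Var` class and its API are illustrative only, not PyTorch's actual implementation:

```python
# Minimal reverse-mode automatic differentiation sketch (pure Python).
# Each Var records its value and, for non-leaf nodes, its parents together
# with the local gradient of the operation that produced it.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0               # accumulated d(output)/d(self)
        self._parents = parents       # sequence of (parent_var, local_gradient)

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def backward(self, upstream=1.0):
        # Chain rule: multiply the upstream gradient by each local gradient
        # and accumulate into the parents. This simple recursion is fine for
        # the small expression trees here; a real autograd engine (like
        # PyTorch's) processes nodes in reverse topological order.
        self.grad += upstream
        for parent, local in self._parents:
            parent.backward(upstream * local)

# Example: f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
x, y = Var(3.0), Var(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # → 5.0 3.0
```

Calling `f.backward()` here plays the role of `loss.backward()` in the notebook: it seeds the output gradient with 1 and pushes gradients back to the leaves, which is the same mechanism the chain-rule walkthrough at 9:05 traces by hand.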
- PyTorch tensors (part 1): https://youtu.be/BmAS8IH7n3c
- full course: https://www.dataflowr.com/