L6.2 Understanding Automatic Differentiation via Computation Graphs

16.0K views
Feb 16, 2021
22:48

Sebastian's books: https://sebastianraschka.com/books/

As previously mentioned, PyTorch can compute gradients automatically for us. To do that, it tracks computations via a computation graph, and when it is time to compute the gradient, it moves backward along the computation graph. Computation graphs are also a helpful concept for learning how differentiation (computing partial derivatives and gradients) works, which is what we do in this video.

Slides: https://sebastianraschka.com/pdf/lecture-notes/stat453ss21/L06_pytorch_slides.pdf

-------

This video is part of my Introduction to Deep Learning course.

Next video: https://youtu.be/VvUz0Q9e09g

The complete playlist: https://www.youtube.com/playlist?list=PLTKMiZHVd_2KJtIXOW0zFhFfBaJJilH51

A handy overview page with links to the materials: https://sebastianraschka.com/blog/2021/dl-course.html

-------

If you want to be notified about future videos, please consider subscribing to my channel: https://youtube.com/c/SebastianRaschka
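The idea described above can be illustrated with a few lines of PyTorch. The following is a minimal sketch (not taken from the video or slides): autograd records the operations on tensors marked with requires_grad=True as a computation graph, and calling backward() traverses that graph in reverse, applying the chain rule to fill in the gradients.

```python
import torch

# A tiny computation graph: z = (w * x + b)^2
# requires_grad=True tells autograd to track operations on these tensors.
x = torch.tensor(2.0)
w = torch.tensor(3.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

u = w * x + b   # intermediate node in the graph
z = u ** 2      # output node

# Move backward along the computation graph (reverse-mode differentiation).
z.backward()

# Chain rule by hand: dz/dw = 2*u*x = 2*7*2 = 28, dz/db = 2*u = 14
print(w.grad)  # tensor(28.)
print(b.grad)  # tensor(14.)
```

The printed gradients match the partial derivatives worked out manually via the chain rule, which is exactly the backward pass over the computation graph discussed in the video.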
