Deep Learning with PyTorch - Gradient Descent, Mini-Batch GD and SGD
Gradient Descent, Mini-Batch GD, SGD (Deep Learning with PyTorch) | Full Deep Learning Tutorial - Beginner to Advanced.

Complete "Deep Learning with PyTorch" playlist: https://www.youtube.com/playlist?list=PLz6pthWWCdfRMjkzgzOVJZqVhvLk1gj9Q
Videos are uploaded in a timely manner.

🚀 Key features of this series:
- Watch live hands-on tutorials on YouTube
- Train models using Google Colab on GCP for free!
- Build an end-to-end, real-world course project

2.2. Gradient Descent, Mini-Batch and Stochastic Gradient Descent

In this tutorial, we cover Gradient Descent and its variants, both conceptually and mathematically.

🎯 Topics covered in this video:
⌨️ Gradient Descent algorithm - overview.
⌨️ Steps involved in the Gradient Descent algorithm for learning model parameters.
⌨️ Types of Gradient Descent, and how SGD and Mini-Batch GD differ from Batch GD.
⌨️ How to escape local minima during optimization.
⌨️ Other optimizers that implement an adaptive learning rate.
⌨️ Conclusion.

Time Breaks:
00:00 Gradient Descent algorithm - Overview
03:00 Gradient Descent algorithm Steps
09:05 Variants of GD - Batch GD, SGD and Mini-Batch
13:59 Noisy parameter updates, Other optimizers

--
Learn Data Science the right way at https://www.simplifiedailabs.com/
Subscribe for new videos on AI and ML @SimplifiedAICourse

#deeplearning #pytorch #machinelearning #datascience #datasciencecourse #SimplifiedAICourse
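The three variants listed in the topics above differ only in how many samples each parameter update uses: Batch GD uses the whole dataset, SGD uses one sample, and Mini-Batch GD uses something in between. A minimal PyTorch sketch of that idea follows; the toy linear-regression data and the hyperparameters are illustrative assumptions, not taken from the video:

```python
import torch

# Toy linear data: y = 3x + 2 plus noise (data and hyperparameters are illustrative)
torch.manual_seed(0)
X = torch.linspace(-1, 1, 100).unsqueeze(1)   # shape (100, 1)
y = 3 * X + 2 + 0.1 * torch.randn_like(X)

def train(batch_size, epochs=200, lr=0.1):
    """One loop covers all three variants: batch_size == len(X) is Batch GD,
    batch_size == 1 is SGD, and anything in between is Mini-Batch GD."""
    w = torch.zeros(1, requires_grad=True)
    b = torch.zeros(1, requires_grad=True)
    n = len(X)
    for _ in range(epochs):
        perm = torch.randperm(n)                  # reshuffle each epoch
        for i in range(0, n, batch_size):
            idx = perm[i:i + batch_size]
            pred = X[idx] * w + b                 # forward pass
            loss = ((pred - y[idx]) ** 2).mean()  # MSE on this (mini-)batch
            loss.backward()                       # compute gradients
            with torch.no_grad():                 # manual parameter update
                w -= lr * w.grad
                b -= lr * b.grad
            w.grad.zero_()
            b.grad.zero_()
    return w.item(), b.item()

print(train(batch_size=100))  # Batch GD: one update per epoch
print(train(batch_size=16))   # Mini-Batch GD: a few updates per epoch
print(train(batch_size=1))    # SGD: one update per sample, noisy but frequent
```

All three runs recover parameters near (w=3, b=2); the SGD estimates are the noisiest per step, which is the "noisy parameter updates" behavior discussed at 13:59. In practice the update step would be delegated to an optimizer such as `torch.optim.SGD` rather than written by hand.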