
Gradient Descent Optimizer Vs Stochastic Gradient Descent Optimizer | Optimizer in Deep Learning

306 views
Jul 21, 2020
39:18

Explains the Gradient Descent optimizer, the Stochastic Gradient Descent optimizer, and the Mini-Batch Stochastic Gradient Descent optimizer in detail, with a worked example and the related maths.

Gradient Descent (GD) computes the gradient over the whole training set before each weight update. Stochastic Gradient Descent (SGD) calculates the gradient (direction) from a single sample: it computes the loss on one record and then updates the weights based on that loss. Due to this stochastic nature, the path towards the global minimum is not as direct as in GD and may zig-zag. Mini-Batch Gradient Descent combines the best of both: it updates the weights once per small batch of samples, so each step is less noisy than SGD's single-record updates while costing far less than a full pass over the data as in GD.

In this video I talk through all three optimizers (Gradient Descent, SGD, and Mini-Batch SGD) so that the concept is clear. If you have any questions about what we covered in this video, feel free to ask in the comment section below and I'll do my best to answer your queries. Please consider clicking the SUBSCRIBE button to be notified of future videos, and thank you all for watching.

Channel: https://www.youtube.com/channel/UCgHDngFV50KmbqF_6-K8XhA
Complete Machine Learning Playlist: https://www.youtube.com/playlist?list=PLv8Cp2NvcY8CoxylKNIYBd9ZVQ1SlFWQ3
Complete Deep Learning Playlist: https://www.youtube.com/playlist?list=PLv8Cp2NvcY8CaSVfCIyg5mvek8JvaD7tE
Support my channel 🙏 by LIKE, SHARE & SUBSCRIBE
Contact: [email protected]

#gradientdescent #Batchgradientdescent #Stochasticgradientdescent #PifordTechnologies #AI #ArtificialIntelligence #DeepLearning
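The batch size is the whole distinction between the three optimizers, so a short sketch may help make it concrete. Below is a minimal NumPy example (my own illustration, not code from the video) that fits a hypothetical linear model and switches between batch GD, SGD, and mini-batch SGD purely by changing the batch size:

import numpy as np

def gradient(w, X, y):
    # Gradient of mean squared error for the linear model y_hat = X @ w.
    return 2.0 * X.T @ (X @ w - y) / len(y)

def train(X, y, batch_size, lr=0.1, epochs=50, seed=0):
    # batch_size == len(y): batch GD; == 1: SGD; in between: mini-batch SGD.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        order = rng.permutation(len(y))  # shuffle the samples each epoch
        for start in range(0, len(y), batch_size):
            batch = order[start:start + batch_size]
            w -= lr * gradient(w, X[batch], y[batch])  # one update per batch
    return w

# Toy data: y = 3*x1 - 2*x2 plus a little noise, so the true weights are [3, -2].
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + 0.01 * rng.normal(size=200)

print("batch GD:      ", train(X, y, batch_size=len(y)))
print("SGD:           ", train(X, y, batch_size=1))
print("mini-batch SGD:", train(X, y, batch_size=32))

Setting batch_size to the full dataset size gives one smooth full-gradient step per epoch; batch_size=1 reproduces the zig-zag single-record updates described above; an intermediate value such as 32 is the usual mini-batch compromise.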

Download

1 format

Video Formats

360p MP4 (51.7 MB)

Right-click 'Download' and select 'Save Link As' if the file opens in a new tab.
