
NADAM Optimizer from Scratch in Python

2.4K views
Dec 21, 2020
13:35

Adam is a very popular optimization technique for deep learning models. Nesterov Accelerated Gradient has been shown to improve on the momentum optimizer. Let's combine both into NADAM! (A quick sketch of the update rule is included below the links.)

## Table of Contents:
- Introduction: 0:00
- Theory: 1:03
- Code: 6:00
- Conclusion: 13:11

## Credit:
- If you haven't already, check out this great blog article on gradient descent optimization techniques: https://ruder.io/optimizing-gradient-descent/index.html#nadam
- Music is "A Brand New Start" by TrackTribe, taken from YouTube. Check out their website, they are doing some really cool non-profit work: https://tracktribe.com/

## Repository:
You can check out the code over here; all of it is in the Gradient Descent notebook: https://github.com/yacineMahdid/artificial-intelligence-and-machine-learning

----
Join the Discord for general discussion: https://discord.gg/QpkxRbQBpf

----
Follow Me Online Here:
Twitter: https://twitter.com/CodeThisCodeTh1
GitHub: https://github.com/yacineMahdid
LinkedIn: https://www.linkedin.com/in/yacine-mahdid-809425163/
Instagram: https://www.instagram.com/yacine_mahdid/
___
Have a great week! 👋
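
For reference, here is a minimal NumPy sketch of the NADAM update rule covered in the video, following the formulation in Ruder's blog post linked above. The function name, default hyperparameters, and the toy quadratic example are illustrative assumptions, not taken from the repository notebook.

```python
import numpy as np

def nadam_update(theta, grad, m, v, t,
                 lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One NADAM step: Adam's moment estimates plus a Nesterov-style look-ahead."""
    # Exponential moving averages of the gradient and squared gradient (same as Adam)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias-corrected moment estimates
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: blend the corrected momentum with the current gradient
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * m_bar / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage (illustrative): minimize f(x) = x**2 starting from x = 5
x, m, v = 5.0, 0.0, 0.0
for step in range(1, 2001):
    x, m, v = nadam_update(x, 2 * x, m, v, step)
print(x)  # x approaches 0
```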

