Adam is a very popular optimization technique for deep learning models. Nesterov Accelerated Gradient has been shown to improve the momentum optimizer. Let's combine both into NADAM!
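Below is a minimal NumPy sketch of the NADAM update, following the formulation in Ruder's blog post linked in the credits. The function and variable names are my own illustration, not the ones used in the notebook:

```python
import numpy as np

def nadam_update(theta, grad, m, v, t,
                 lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One NADAM step: Adam's bias-corrected moments plus a Nesterov look-ahead."""
    # Biased first and second moment estimates (same as Adam)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2

    # Bias correction
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    # Nesterov look-ahead: blend the corrected momentum with the current gradient
    m_nesterov = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)

    theta = theta - lr * m_nesterov / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = (x - 3)^2
theta, m, v = np.array([0.0]), np.array([0.0]), np.array([0.0])
for t in range(1, 501):
    grad = 2 * (theta - 3)
    theta, m, v = nadam_update(theta, grad, m, v, t)
print(theta)  # converges toward 3
```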
## Table of Contents:
- Introduction: 0:00
- Theory: 1:03
- Code: 6:00
- Conclusion: 13:11
## Credit:
- If you haven't already, check out this great blog article on gradient descent optimization techniques: https://ruder.io/optimizing-gradient-descent/index.html#nadam
- The music is "A Brand New Start" by TrackTribe, taken from YouTube. Check out their website; they do some really cool non-profit work: https://tracktribe.com/
## Repository:
You can check out the code here; all of it is in the Gradient Descent notebook: https://github.com/yacineMahdid/artificial-intelligence-and-machine-learning
----
Join the Discord for general discussion: https://discord.gg/QpkxRbQBpf
----
Follow Me Online Here:
Twitter: https://twitter.com/CodeThisCodeTh1
GitHub: https://github.com/yacineMahdid
LinkedIn: https://www.linkedin.com/in/yacine-mahdid-809425163/
Instagram: https://www.instagram.com/yacine_mahdid/
___
Have a great week! 👋